Dupe uploads check

We had a post come in this morning on mediawiki-l about an extension for adding a hash-based duplicate file check on upload.

A similar check had recently been added to MediaWiki core, but only on the file description page — it wouldn’t warn you while the upload was still in progress. Since all the required backend support was there, I’ve gone ahead and added this as a built-in feature for MediaWiki 1.13:

(Note that since we can’t get the file content hash until you upload, there’s still no way to give the warning before you actually upload it. But at least now it lets you cancel at that point instead of having to ask a sysop to come delete your file!)
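For the curious, the core of a check like this is tiny: hash the uploaded bytes and compare against the hashes of files already stored. Here’s a rough sketch in Python — the function names and the hash-lookup structure are illustrative, not MediaWiki’s actual code (MediaWiki does use SHA-1 for its file hashes, though):

```python
import hashlib

def file_sha1(path, chunk_size=65536):
    """Compute the SHA-1 digest of a file, reading in chunks
    so huge uploads don't have to fit in memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(upload_path, existing_hashes):
    """Return names of already-stored files whose hash matches the
    new upload. existing_hashes is a dict of filename -> hex digest,
    standing in for the hash column in the image table."""
    digest = file_sha1(upload_path)
    return sorted(name for name, h in existing_hashes.items() if h == digest)
```

In the real thing the lookup is an indexed database query rather than a dict scan, but the shape is the same: one hash computation at upload time, one equality lookup.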

German FlaggedRevs tested for 10 minutes

Ok, so we finally got the FlaggedRevs for German Wikipedia config set up… then turned it off after a few minutes.

We did, alas, encounter a few problems, which didn’t come up as much in earlier testing, but came up *hard* in a few minutes around 3am at Wikipedia. :)

Floating UI boxes and floating infoboxes don’t mix well.

The small versioning marker itself is a nice touch, but the layout breakage is way too disruptive, and we’ll need to get it worked out one way or another.

Second, some of the reporting pages weren’t working, in part due to some last-minute tweaks to the DB layout to make it easier to deploy. (This should be fixed now.)

Third, the “redirected from” subtitles are broken, which will disrupt some general editing functionality in an unpleasant way. There’s an example on the de.labs test wiki.

Once the UI bits are fixed up, we’ll give it another test run… and FlaggedRevs will be back!

Top 10 Wikimedia DB errors

I took a quick look last night through our database error logs for the last week or so, breaking them down by function and error type. Here are the top ten function/error loci:

Hits  Function                       Errno  Error
 620  Article::updateCategoryCounts   1213  Deadlock found when trying to get lock; Try restarting transaction
 240  Article::insertOn               1062  Duplicate entry 'N-XXX' for key 2
  41  Article::doDeleteArticle        1213  Deadlock found when trying to get lock; Try restarting transaction
  26  LinksUpdate::incrTableUpdate    1213  Deadlock found when trying to get lock; Try restarting transaction
  19  TitleKey::prefixSearch          1030  Got error 28 from table handler
   9  Title::invalidateCache          1213  Deadlock found when trying to get lock; Try restarting transaction
   9                                  2013  Lost connection to MySQL server during query
   8  User::saveSettings              1205  Lock wait timeout exceeded; Try restarting transaction
   8  TitleKey::prefixSearch          2003  Can't connect to MySQL server on 'XXX'
   7  Job::pop                        1213  Deadlock found when trying to get lock; Try restarting transaction
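A tally like this falls out of a quick aggregation script. Here’s a sketch in Python, assuming a simplified one-error-per-line log format (the real logs are messier, so treat the regex as a placeholder):

```python
import re
from collections import Counter

# Assumed simplified log format, one error per line:
#   "<function> <errno> <message>"
LINE_RE = re.compile(r"^(?P<func>\S+)\s+(?P<errno>\d+)\s+(?P<msg>.+)$")

def top_error_loci(lines, n=10):
    """Tally (function, errno, message) triples across log lines
    and return the n most frequent, most common first."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            counts[(m.group("func"), int(m.group("errno")), m.group("msg"))] += 1
    return counts.most_common(n)
```

Anything that doesn’t match the expected format is silently skipped, which is about the right level of rigor for a one-off late-night log dig.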

A large chunk of our DB errors are from conflicting transactions; the number one spot is currently taken up by updates to category counts, which is often part of an expensive page deletion transaction.

We’re often pretty lazy about rerunning database transactions when they’re rolled back, throwing an error and making the end-user resubmit the change. This is kind of lame, but at least the transaction rollback theoretically keeps the database consistent.
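Retrying automatically isn’t much code once the transaction is wrapped in a callback. A sketch of the idea — `DBError` and the errno values here are stand-ins for whatever the database layer actually throws, not MediaWiki’s real classes:

```python
import random
import time

# MySQL errnos where the server has rolled back and a rerun is safe:
# 1213 = deadlock, 1205 = lock wait timeout.
RETRYABLE_ERRNOS = {1213, 1205}

class DBError(Exception):
    """Stand-in for a database layer's query exception."""
    def __init__(self, errno):
        super().__init__("MySQL error %d" % errno)
        self.errno = errno

def with_retries(txn, max_attempts=3):
    """Run a transaction callback, rerunning it on deadlock/timeout.

    txn should begin, do its work, and commit; on a retryable DBError
    the server has already rolled the transaction back, so rerunning
    it from the top is safe."""
    for attempt in range(1, max_attempts + 1):
        try:
            return txn()
        except DBError as e:
            if e.errno not in RETRYABLE_ERRNOS or attempt == max_attempts:
                raise
            # Brief randomized backoff so the conflicting transactions
            # don't immediately collide again.
            time.sleep(random.uniform(0, 0.05 * attempt))
```

Non-retryable errors (like the duplicate-key 1062s above) still propagate immediately, since rerunning those would just fail the same way.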

The number two spot seems to be for conflicting page creations — possibly due to automatic resubmissions after a slow save operation.

There are a few “disk full” errors (error 28 is ENOSPC), which were probably due to a transitory problem on one DB box.

KDE WTF

Is this dialog showing success or failure?

[Screenshot: “Build Search Indices” dialog in the KDE Help Center]

Look closer…

[Screenshot: close-up of the same dialog]

Wha? I’m still not sure what’s going on, and I still don’t seem to have a search index in the KDE Help Center. Sigh.

Update: htdig wasn’t installed, which the Help Center didn’t report very well. After installing it I can apparently build the index, but search still fails.

Again, the error doesn’t get reported well in the UI — it just echoes the khc_htsearch.pl command line instead of explaining what went wrong:

$ khc_htsearch.pl --docbook --indexdir=/home/brion/.kde/share/apps/khelpcenter/index/ --config=kde_application_manuals --words=multi-file+search --method=and --maxnum=5 --lang=en
Can't execute htsearch at '/srv/www/cgi-bin/htsearch'.

Sigh…..

GParted rocks

Did some upgrades on my girlfriend’s Windows PC today… The techs who originally set up her computer gave her an unconscionably small C: drive, a tiny 10 gig slice of an already-modest 40 gig drive. Even with careful discipline about putting things on the D: partition, 10 gigs doesn’t go very far. Shared DLL installs, gobs of temporary files, cached updaters for all manner of software, and the like fill it up fast, and she was constantly running out of room.

My secret weapon for fixing this was an Ubuntu Linux live CD, which conveniently comes with GParted.

I took a 200 gig drive left over from my dear departed Linux box and hooked it up, figuring I could back up the old data over the network, overwrite it with a raw disk image from the 40 gig drive, and then resize the NTFS partitions to a livable size.

Easy!

Well, sort of. :)

It turns out I could have saved myself some trouble at the command line by copying the partitions across drives with GParted itself instead of goin’ at it all old-school with dd. (Neat!)
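For reference, the old-school dd route looks something like this. The demo below clones a file rather than a real disk so it’s safe to run anywhere; in a real run the if=/of= arguments would be block devices like /dev/sda and /dev/sdb, and getting those names backwards destroys your source disk, so triple-check them:

```shell
# Make a stand-in "disk" for the demo (a real run reads a /dev/sdX device).
dd if=/dev/urandom of=source.img bs=1024 count=100

# The actual clone: a raw byte-for-byte copy, with a big block size
# for throughput. (On a failing disk, conv=noerror,sync keeps dd going
# past read errors, at the cost of zero-padded blocks.)
dd if=source.img of=target.img bs=1M

# Verify the copy byte-for-byte before trusting it.
cmp source.img target.img && echo "clone OK"
```

After a device-to-device copy like this, the target drive carries the source’s partition table too, which is exactly why the NTFS partitions then need resizing to use the extra space.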

I had two sticking points, though.

First, it didn’t seem to let me move the extended (D:) partition to a different place on the drive. That meant there was no room to expand the C: partition, which was the point of the exercise.

I ended up having to create a copy of the D: partition, which it let me put in the middle of the drive, and then delete the old partitions. Kind of roundabout, and it changed the partition type from extended to primary, but Windows doesn’t seem to care about that so keep those fingers crossed…

My second snag was due to Ubuntu’s user-friendliness. As soon as the new partition was created, the system mounted it — which caused the NTFS cloning process to abort, warning that it can’t work on a mounted filesystem.

Nice.

Had to go into the system settings and disable automatic mounting of removable media… luckily that’s easy to find in the menus. If you know it’s going to be there, at least. :)

iProduct vs Veronica Mars

So I finally gave in and picked up an Apple TV unit; that frees up my Mac Mini from TV duty to be my main home computer, while letting the Apple TV concentrate on being a media player.

The good: the unit is very compact, setup is pretty straightforward, and the picture looks good once I adjust the ungodly color saturation my TV defaults to on the component input.

The bad: at least for the shows I tested (Veronica Mars season 3), video playback is totally broken at HD resolutions!

At 720p, playback stutters very badly, with jerky motion and the sound out of sync with the picture by about a second.

At 1080i I don’t even *get* picture during playback, just sound. (Menus display fine.)

At 480p everything looks great, though, and the currently available content doesn’t need more than that, so I’m leaving it there for now.

A quick Google scan doesn’t show any other obvious complaints of this problem, so I’m not sure if I’ve got a bogus unit or if it’s something funky with the Veronica Mars encoding that might not be a problem with other shows…

Update: At some point it started working fine. *shrug*