Finally got Motion reinstalled, playing about:
Assembling furniture is fun! Wait, that’s a lie.
So I finally gave in and picked up an Apple TV unit; that frees up my Mac Mini from TV duty to be my main home computer, while letting the Apple TV concentrate on being a media player.
The good: the unit is very compact, setup is pretty straightforward, and the picture looks good once I adjusted the ungodly color saturation my TV defaults to on the component input.
The bad: at least for the shows I tested (Veronica Mars season 3), video playback is totally broken at HD resolutions!
At 720p, playback stutters badly, with jerky motion and the sound out of sync with the picture by about a second.
At 1080i I don’t even *get* picture during playback, just sound. (Menus display fine.)
At 480p everything looks great, though, and the currently available content doesn’t need more than that, so I’m leaving it there for now.
A quick Google scan doesn’t show any other obvious complaints of this problem, so I’m not sure if I’ve got a bogus unit or if it’s something funky with the Veronica Mars encoding that might not be a problem with other shows…
Update: At some point it started working fine. *shrug*
For the record, this is the most awful software upgrade procedure I’ve experienced.
- The previous versions of various Apple pro media apps such as Motion don’t run at all on newer, Intel-based machines. BOO!
- The various individual products have been discontinued in favor of the Final Cut Studio bundle which includes a bunch of them. You can no longer get just one. So to run your old copy of Motion on your new MacBook, you have to upgrade to the entire bundle… BOO!
- …which they offer a huge cross-grade discount on! YAY!
- To get your new installation media, you have to mail in your original installation DVD… which will naturally get lost in the mail. BOO!
The good news is, if you have enough documentation you can talk them into replacing your lost media so you can send in for the upgrade again.
Trying to set up Windows Server 2003 (trial download) to confirm & fix bug 3000 for MediaWiki running under IIS.
[[Wikipedia:Apricot|Apricots]]: YUMMMMMMM! Easy to slice in half and remove the pit, and verrrry delicious. Dried apricots are also nice and last longer in the cupboard.
[[Wikipedia:Peach|Peaches]]: Similar to their smaller cousin the apricot, but IMHO they’re harder to work with. The pitting vs deliciousness ratio is unfavorable.
[[Wikipedia:Blueberry|Blueberries]]: Pretty awesome when you pick them yourself in the forest. Prepackaged, though, I find them kinda… bland and tasteless. Maybe I just got boring mass-produced Chilean blueberries.
[[Wikipedia:Blackberry|Blackberries]]: Upside: yum! Downside: full of little crunchy seeds.
I tossed together a silly WordPress plugin to special-case links into [[leuksman|my wiki pages]] such as my [[gimp mac helper]] tools, as well as to MediaWiki’s bug tracker (eg, bug 1) and SVN repository (r12345 was nice).
SVN anonymous checkout: http://svn.wikimedia.org/svnroot/mediawiki/trunk/silly-regex-linker/
Only useful if you’re me at the moment, but maybe I’ll generalize it for fun.
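The plugin itself is PHP, but the core idea is just a table of regex substitutions run over post text. A minimal sketch in Python, with placeholder URLs and only the bug/revision patterns (the real patterns and link targets live in the SVN checkout above):

```python
import re

# Hypothetical rules; the real targets are MediaWiki's bug tracker
# and SVN browser, not example.org.
RULES = [
    (re.compile(r"\bbug (\d+)"), r'<a href="https://bugs.example.org/\1">bug \1</a>'),
    (re.compile(r"\br(\d+)\b"), r'<a href="https://svn.example.org/\1">r\1</a>'),
]

def linkify(text: str) -> str:
    """Apply each pattern in turn, turning shorthand into HTML links."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text
```

The same trick extends to the `[[wiki link]]` syntax with one more pattern in the table.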
For a long time I’ve found SSHKeychain an invaluable little app on my Macs, making remote logins with keys relatively painless.
When I upgraded to an Intel-based MacBook last month, I was saddened to find that there wasn’t a Universal release; the last PPC release didn’t run on Intel; and development seems to have stopped altogether. :(
The good news is that it’s open source, and further, one of the last code checkins, from early 2006, added Universal binary build support. So I went ahead and built the thing for my own use.
I did find, though, that it crashed intermittently when waking from sleep. After a little debugging I found the problem: some variables were badly initialized, so if all your keychains were locked, it would crash. Fun! Easy to fix, though.
I mailed the patch to the dormant developers mailing list and the author; hopefully it’ll get rolled in and other people will get to use it…
A while ago I picked up Motion 2 on a lark to replace the ancient copy of After Effects I occasionally used to do little animation bits. Finally escaped the wiki for a couple hours and got a chance to play with it some more:
1.2MB Ogg Theora (640×360, no sound) (download)
Stumbled on this while searching for Theora transcoding recommendations.
bzip2 is hideously slow; while looking around for ways to possibly speed it up I stumbled on pbzip2, which exploits the block-based nature of bzip2 compression to parallelize it for potentially linear speedups with an increased number of CPUs.
There are two downsides: first, there are some compatibility issues with third-party decompressor software (probably resolvable), and second, it only scales until you run out of local CPUs.
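The trick is simple enough to sketch: since each bzip2 stream is self-contained, you can compress fixed-size chunks independently and concatenate the results. A toy version in Python (pbzip2 itself is C++; the block size and worker count here are just illustrative):

```python
import bz2
from concurrent.futures import ThreadPoolExecutor

BLOCK_SIZE = 900 * 1024  # bzip2's default block size at -9

def parallel_bzip2(data: bytes, workers: int = 2) -> bytes:
    """Compress each 900 KB slice as its own bzip2 stream, in parallel."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    # CPython's bz2 releases the GIL while compressing, so even
    # threads get real parallelism here.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        compressed = pool.map(lambda b: bz2.compress(b, 9), blocks)
    # Back-to-back streams; many (not all) decompressors decode
    # them in sequence, which is exactly the compatibility issue.
    return b"".join(compressed)
```

Compressing chunks separately does cost a little ratio, since each block starts with a fresh dictionary, which matches the slightly worse per-byte numbers below.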
For kicks I'm prototyping a distributed version, dbzip2. In theory, a lot of fast machines on a LAN should be able to speed things up even further.
I've been testing with a Wikipedia dump file that's about a gigabyte, compressing down to about a quarter that.
On my dual-core G5 Power Mac, regular single-threaded bzip2 compresses the file in about 356 seconds (3.122 MB/s input processing).
pbzip2 set for two threads runs about 50% faster; I clocked it at 234 seconds (4.750 MB/s).
My dbzip2 prototype performs similarly with two local threads, though a touch less efficiently:
Wrote 1295 blocks in 250.9 seconds (4.430 MB/s in, 0.988 MB/s out)
And I can squeeze a few more percentage points out by using remote compressors on the other machines I had lying around:
Wrote 1295 blocks in 188.8 seconds (5.887 MB/s in, 1.313 MB/s out)
Most of the data went to the two local threads on my Power Mac:
local thread: processed 447 blocks in 188.7 seconds (2.033 MB/s in, 0.458 MB/s out)
local thread: processed 444 blocks in 188.7 seconds (2.019 MB/s in, 0.451 MB/s out)
My old Linux box, an Athlon XP 2400+ on 100 Mbit ethernet, took a respectable share:
rdaneel:12345: processed 237 blocks in 188.7 seconds (1.078 MB/s in, 0.238 MB/s out)
CPU usage was pretty high, though not maxed out (80–90%), likely due to the tenth of a second spent transferring each 900 KB input block and its compressed output over the 100 Mbit network.
Running the whole file locally, that box can process 1.344 MB/s.
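That per-block overhead estimate checks out with a bit of arithmetic, taking the compression ratio from the per-thread figures above:

```python
# Time to move one block's input and output over a 100 Mbit link.
block_in = 900 * 1024            # bytes per input block
block_out = block_in * 0.22      # compressed output, ~4.5:1 per the stats
link_bytes_per_s = 100e6 / 8     # 100 Mbit/s expressed in bytes/s
transfer = (block_in + block_out) / link_bytes_per_s
print(f"{transfer:.2f} s per block")  # ~0.09 s, about a tenth of a second
```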
My old Powerbook (1 GHz G4) and newer Mini (1.5 GHz Core Solo) sit on wireless, which is a lot slower:
verda-majo.local:12345: processed 61 blocks in 188.7 seconds (0.277 MB/s in, 0.060 MB/s out)
philo.local:12345: processed 106 blocks in 188.7 seconds (0.482 MB/s in, 0.106 MB/s out)
These two were clearly limited by the slower network, with CPU usage around just 20-30%. Still, every bit helps!
This shows promise for quickly compressing large files when fast machines are available on a fast network. Slow networks strongly reduce the benefits, but on switched gigabit ethernet things should be nicely CPU-limited even with several fast machines.
The main remaining issues:
- Whether the bzip2 streams can be stitched together for improved compatibility with decompressor apps
- Whether similar distribution can be applied to 7zip
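On the first point: the output is currently a series of back-to-back bzip2 streams rather than a single stitched stream. Some decompressors stop after the first stream, but the multi-stream layout can be decoded with a loop; a sketch of that approach, using Python's bz2 module as the decoder:

```python
import bz2

def decompress_streams(data: bytes) -> bytes:
    """Decode back-to-back bzip2 streams, the layout dbzip2 emits."""
    out = []
    while data:
        d = bz2.BZ2Decompressor()
        out.append(d.decompress(data))
        data = d.unused_data  # whatever follows this stream's end marker
    return b"".join(out)
```

Proper stitching into one stream would need bit-level surgery on the block headers, since bzip2 blocks aren't byte-aligned; hence it's still an open question above.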