Safely embedded JavaScript modules for MediaWiki?

At the Zürich Hackathon I’ve been poking at a number of things — more notes to come later — but one fun project has been continuing an older experiment: I’ve got a demo running on Wikimedia Labs and have started enhancing it with MediaWiki’s relatively new ContentHandler system to create custom structured ‘pages’.


(TL;DR if you don’t want to click through to the docs on the extension: an isolated iframe is used to sandbox script code from the wiki’s own web environment.)

Currently the ‘JSApplet’ namespace content handler just takes straight JavaScript source, but I’m planning to make it a structured bundle which contains:

  • an HTML scaffold
  • a CSS chunk
  • a JavaScript chunk
  • references to other scripts to include
  • references to SVG or raster image files to include
  • print or non-JS fallback content
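
A bundle like the one described above might be represented as a simple structured object. This is only a sketch of one possible shape; the field names are illustrative, not the extension’s actual schema:

```javascript
// Hypothetical shape for a JSApplet bundle; all field names are
// illustrative, not an actual MediaWiki content model.
var bundle = {
  html: '<div id="output"></div>',                  // HTML scaffold
  css: '#output { font-family: sans-serif; }',      // CSS chunk
  js: 'document.getElementById("output").textContent = "Hello";', // JS chunk
  scripts: ['JSApplet:SharedLib'],                  // other scripts to include
  images: ['File:Example.svg'],                     // SVG/raster image references
  fallback: '<p>This applet requires JavaScript.</p>' // print / non-JS fallback
};
```

A custom editor could then present each field in its own pane and assemble the preview iframe from the pieces.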

Then, a custom view/edit handler for the content type can provide an interface to edit those bits and run an immediate preview — kind of like an embedded JSFiddle.

ContentHandler also allows for a custom transclusion mode — so scripts could perhaps be invoked like ‘{{JSApplet:Mandelbrot}}’ instead of having to use an XML-like tag extension manually. Not sure if that’s the best plan, but it’s a thought. :D

I’m also thinking about how to make this work on mobile — in theory these things should be able to work with iframes in embedded web views, but it may require adding special support in the client.

For times when the script can’t be executed, some sort of static fallback content should be shown — still thinking about best ways to support that cleanly and make it something that people will do by default. Hmm, ideas ideas. :)

 

ogv.js now in WebGL and Flash flavors

Two major updates to the ogv.js Theora/Vorbis video/audio player over the last few weekends: an all-Flash decoder for older IE versions, and WebGL and Stage3D GPU acceleration for color conversion and drawing.

Try the demo and select ‘JS + WebGL’, ‘Flash’, or even ‘Flash + GPU’ from the drop-down! (You can also now try playback in the native video element or the old Cortado Java applet for comparison, though Cortado requires adding security exceptions if your browser works with Java at all these days. :P)

Flash / Crossbridge

The JS code output by emscripten requires some modern features like typed arrays, which aren’t available on IE 9 and older… but similar features exist in Flash’s ActionScript 3 virtual machine, and reasonably recent Flash plugins are often available on older IE installations.
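
The backend choice boils down to feature detection. A minimal sketch of the idea — `detectFlash` is a hypothetical helper standing in for whatever plugin probing the real player does, not actual ogv.js code:

```javascript
// Sketch of picking a decode backend: typed arrays signal a JS engine
// modern enough for the emscripten build; otherwise fall back to Flash
// if the plugin is present. detectFlash() is a hypothetical helper.
function chooseBackend(detectFlash) {
  if (typeof Uint8Array !== 'undefined' && typeof ArrayBuffer !== 'undefined') {
    return 'js';
  }
  if (detectFlash()) {
    return 'flash';
  }
  return 'none';
}
```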

The Flash target builds the core C codec libraries and the C side of the OgvJs codec wrapper using Crossbridge, the open-source version of Adobe’s FlasCC cross-compiler. This is similar in principle to the emscripten compiler used for the JavaScript target, but outputs ActionScript 3 bytecode instead of JavaScript source.

I then ported the JavaScript code for the codec interface and the player logic to ActionScript, wrapped it all up into a SWF using the Apache Flex mxmlc compiler, and wrapped the Flash player instantiation and communication logic in a JavaScript class with the same interface as the JS player.

A few interesting notes:

  • The Crossbridge compiler runs much slower than emscripten, perhaps due to JVM startup costs in some backend tool. Running the configure scripts on the libraries is painfully slowwwwwwwwww! Luckily, once they’re built they don’t have to be rebuilt often.
  • There are only Mac and Windows builds of Crossbridge available; it may or may not be possible to build on Linux from source. :( I’ve only tested on Mac so far.
  • Flash decoder performance is roughly on par with Safari’s JS and usually a bit better than Internet Explorer’s.
  • YCbCr to RGB conversion in ActionScript was initially about 10x slower than the JavaScript version. I mostly closed this gap by moving the code from ActionScript to C and building it with the Crossbridge compiler… I’m not sure if something’s building with the wrong optimization settings or if the Crossbridge compiler just emits much cleaner bytecode for the same code. (And it really was the same code — apart from the function and variable declarations, the AS and C versions were identical!)
  • ActionScript 3 is basically JavaScript extended with static type annotations and a class-based object system with Java-like inheritance and member visibility features. A lot of the conversion of JS code consisted only of adding type annotations or switching HTML/JS-style APIs for Flash/AS ones.
  • JS’s typed array & ArrayBuffer system doesn’t quite map to the Flash interfaces. There’s a ByteArray which is kind of like a Uint8Array plus a data stream interface, with methods to read/write values of other types into the current position and advance it. There are also Vector types, which have an interface more like the traditional untyped Array but can only contain items of the given type and are more efficiently packed in memory.
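
For comparison, the closest JS analogue to ByteArray’s stream-style access is a DataView plus a manually tracked position. A sketch of the mapping (names are mine, not a real Flash or ogv.js API):

```javascript
// JS approximation of Flash's ByteArray stream interface: a DataView
// with an explicit read position that advances as values are consumed.
function ByteReader(buffer) {
  this.view = new DataView(buffer);
  this.position = 0;
}
ByteReader.prototype.readUnsignedByte = function () {
  return this.view.getUint8(this.position++);
};
ByteReader.prototype.readInt = function () {
  // Big-endian by default, matching ByteArray's default endianness.
  var v = this.view.getInt32(this.position);
  this.position += 4;
  return v;
};
```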

GPU acceleration: WebGL

WebGL is a JavaScript API related to OpenGL ES 2, exposing some access to GPU programming and accelerated drawing of triangles onto a canvas.

Safari unfortunately doesn’t enable WebGL by default; it can be enabled as a developer option on Mac OS X but requires a device jailbreak on iOS.

However for IE 11, and the general fun of it, I suspected adding GPU acceleration might make sense! YCbCr to RGB conversion and drawing bytes to the canvas with putImageData() are both expensive, especially in IE. The GPU can perform the two operations together, and more importantly can massively parallelize the colorspace conversion.
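
To see why this is worth parallelizing: each output pixel needs its own multiply-and-clamp arithmetic. A sketch of the per-pixel work in plain JS, assuming full-range BT.601 coefficients (Theora streams may actually use studio-range values, so treat the constants as illustrative):

```javascript
// One pixel of YCbCr -> RGB conversion: the per-pixel arithmetic the
// GPU can run massively in parallel. Full-range BT.601 coefficients
// assumed for illustration.
function ycbcrToRgb(y, cb, cr) {
  function clamp(x) { return Math.max(0, Math.min(255, Math.round(x))); }
  return [
    clamp(y + 1.402 * (cr - 128)),                          // R
    clamp(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)), // G
    clamp(y + 1.772 * (cb - 128))                           // B
  ];
}
```

Run over every pixel of a 1080p frame at 30fps, that inner loop is exactly the kind of thing a fragment shader eats for breakfast.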

I snagged the fragment shader from the Broadway H.264 player project’s WebGL accelerated canvas, and with a few tutorials, a little documentation, and a lot of trial and error I got accelerated drawing working in Firefox, Chrome, and Safari with WebGL manually enabled.

Then came time to run it on IE 11… unfortunately it turns out that single-channel luminance or alpha textures aren’t supported on IE 11’s WebGL — you must upload all textures as RGB or RGBA.

Copying data from 1-byte-per-pixel arrays to a 3- or 4-byte-per-pixel array and then uploading that turned out to be horribly slow, especially as IE’s typed array ‘set’ method and copy constructor seem to be hideously slow. It was clear this was not going to work.

I devised a method of uploading the 1-byte-per-pixel arrays as pseudo-RGBA textures of 1/4 their actual width, then unpacking the subpixels from the channels in the fragment shader.

The unpacking is done by adding two more textures: “stripe” textures at the luma and chroma resolutions, which for each pixel have 100% brightness in the channel matching the packed subpixel. For each output pixel, we sample both the packed Y, Cb, or Cr texture and the matching-size stripe texture, then multiply the vectors and sum the components to fold just the relevant channel down into a scalar value.
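
Generating the stripe texture is straightforward on the CPU side. A sketch, assuming a hypothetical helper name (the shader-side multiply-and-sum then happens per fragment):

```javascript
// Build a "stripe" RGBA texture: output pixel x gets 100% brightness in
// channel (x % 4), so multiplying its sample against the packed
// pseudo-RGBA sample and summing the components extracts just the one
// packed byte belonging to that output pixel.
function buildStripe(width) {
  var data = new Uint8Array(width * 4); // RGBA, zero-initialized
  for (var x = 0; x < width; x++) {
    data[x * 4 + (x % 4)] = 255;
  }
  return data;
}
```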

It feels kinda icky, but it seems to run just as fast, at least on my good hardware, and it works in IE 11 as well as the other browsers.

On Firefox with asm.js-optimized Theora decoding and WebGL-optimized drawing, I can actually watch 720p and some 1080p videos at full speed. Nice! (Of course, Firefox can already use its native decoder which is faster still…)

A few gotchas with WebGL:

  • There’s a lot of boilerplate you have to do to get anything running; spend lots of time reading those tutorials that start with a 2d triangle or a rectangle until it makes sense!
  • Once you’ve used a canvas element for 2d rendering, you can’t switch it to a WebGL canvas. It’ll just fail silently…
  • Creating new textures is a lot slower than uploading new data to an existing texture of the same size.
  • Error checking can be expensive because it halts the GPU pipeline; this significantly slows down the rendering in Chrome. Turn it off once code is working…

GPU acceleration: Flash / Stage3D

Once I had things working so nicely with WebGL, the Flash version started to feel left out — couldn’t it get some GPU love too?

Well luckily, Flash has an OpenGL ES-like API as well: Stage3D.

Unluckily, Stage3D and WebGL are gratuitously different in API signatures. Meh, I can work with that.

Really unluckily, Stage3D doesn’t include a runtime shader compiler.

You’re apparently expected to write shaders in a low-level assembly language, AGAL… and manually grab an “AGALMiniAssembler” class and stick it in your code to compile that into AGAL bytecode. What?

Luckily there’s also a glsl2agal converter, so I was able to avoid rewriting the shaders from the WebGL version in AGAL manually. Yay! This required some additional manual hoop-jumping to make sure variables mapped to the right registers, but the glsl2agal compiler makes the mappings available in its output so that wasn’t too bad.

Some gotchas with Stage3D:

  • The AS3 documentation pages for Stage3D classes don’t show all the class members in Firefox. No, really, I had to read those pages in Chrome. WTF?
  • Texture dimensions must be powers of 2, so I have to copy rows of bytes from the Crossbridge C heap into a temporary array of the right size before uploading to the GPU. Luckily copying byte arrays is much faster in Flash than in IE’s JS!
  • As with the 2d BitmapData interface, Flash prefers BGR over RGB. Just had to flip the order of bytes in the stripe texture generation.
  • Certain classes of errors will crash Flash and IE together. Nice!
  • The glsl2agal compiler forced my texture sampling to linear interpolation with wrapping; I had to do a string replace on the generated AGAL source to set it to nearest-neighbor and clamped.
  • There doesn’t appear to be a simple way to fix the Stage3D backing buffer size and scale it up to fit the window, at least in the modes I’m using it. I’m instead handling resizing by setting the backing buffer to the size of the stage, and just letting the texture render larger. This unfortunately uses nearest-neighbor for the scaling because I had to disable linear sampling to do the channel-packing trick.
  • On my old Atom-based tablet, Stage3D drawing is actually slower than software conversion and rendering. D’oh! Other machines seem faster, and about on par with WebGL.

I’ll do a little more performance tweaking, but it’s starting to look like it’s time to clean it up and try integrating with our media playback tools on MediaWiki… Wheeeee!

 

ogv.js now with synchronized sound

My ogv.js JavaScript Theora video player project now has synchronized audio and video, which makes it possible to watch videos of people talking without going mad. :D


Try it out!

This works both with Web Audio (in Firefox, Chrome, and Safari) and with the Flash audio shim for IE; basically we have to keep track of the audio playback position and match up decoding frames with that. It took a little poking in the ActionScript code, but it’s now working!
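
The core of the idea is treating the audio clock as the master: video frames wait in a queue until their timestamps come due. A minimal sketch with illustrative names, not the actual ogv.js player logic:

```javascript
// Sketch of audio-driven sync: given the current audio playback time
// and a queue of decoded frames (oldest first, timestamps in seconds),
// return the frames whose display time has arrived.
function framesDue(audioTime, frameQueue) {
  var due = [];
  while (frameQueue.length && frameQueue[0].timestamp <= audioTime) {
    due.push(frameQueue.shift());
  }
  return due;
}
```

In practice you'd draw only the last due frame and drop the rest, which is how playback stays in sync when decoding falls behind.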

This brings us much closer to being able to integrate ogv.js as a fallback video player for Wikimedia on IE 10/11 and Safari 6/7. Thanks to the guys hanging out in #xiph channel who encouraged me to keep poking at this! :D

Additionally there’s now an override selector for the video size, so you can try decoding versions larger than 360p, or switch a slow machine down to the little 160p versions.

I’ve also started investigating an all-Flash version using Adobe’s Crossbridge, which if it works would be a suitable replacement for the Cortado Java applet on old browsers (think of all those IE 6/7/8/9 systems out there!). I seem to be able to build the ogg libraries but haven’t gone beyond that yet… will be interesting to poke at.

ogv.js sound and video now working on iOS

Thanks to some audio-related fixes and improvements from a contributor, I’ve started doing some new work on ogv.js, the emscripten-powered JavaScript Ogg Theora/Vorbis decoder.

In addition to Internet Explorer 10/11 (via Maik’s Flash shim), I now have audio working on iOS — and smaller video sizes actually play pretty decently on a current iPhone 5s as well!

See demo:

Try it out on your computer or phone!

Older iOS 7 devices and the last generation of iPod Touch are just too slow to play video reliably, but they still play audio just fine. The latest 64-bit CPU is pretty powerful though, and could probably handle transcodes slightly larger than 160p too.

ogv.js now with sound in Internet Explorer thanks to… Flash?

One of my fun little side projects is ogv.js, a prototype audio/video decoder in JavaScript to allow playback of Wikimedia Commons’ media files on browsers that don’t natively support the royalty-free Ogg Vorbis, Theora or WebM codecs, but may have trouble with our old ‘Cortado’ Java applet compatibility solution thanks to the modern trend of disabling plugins (and Java in particular).

Mostly this means Internet Explorer and Safari — Chrome and Firefox handle the files natively. However, Internet Explorer lacked support for the Web Audio API, so it could not play any sound. I’d hypothesized that a Flash shim could be used — Windows 8 ships with the Flash plugin by default and it’s widely installed on Windows 7 — but had no idea where to start.

Open source to the rescue!

One of the old maintainers of the Cortado applet, maikmerten, took an interest. After some brief fixes to get the build scripts working on Ubuntu, he scrounged up a simple ActionScript audio shim, with source and .swf output, and rigged up the ogv.js player to output audio through that if there was no native Web Audio API.

It woooooorks!


The ActionScript of the Flash shim is pretty straightforward, and it compiles into a nice .swf file of roughly 1 KB. Luckily, you can rebuild the .swf with the open-source Apache Flex SDK, so it doesn’t even rely on the proprietary Flash Builder or anything. We could do with some further cleanup (for instance, I don’t think we’re disposing of the Flash plugin when shutting down audio, but that’s easy to fix in a bit…) but the basics are in place. And of course getting proper audio/video sync will be complicated by the shim layer — the current code drives the clock from the video and has choppy audio anyway, so there’s some way to go before we reach that problem. ;)

It even works on Windows RT, the limited ARM version of Windows 8 — while video decoding is much too slow on a first-gen Surface tablet’s Tegra 3 CPU, audio-only files play nicely.

Thanks maikmerten!

 

ogv.js update: color, sound

Last week I posted about a proof of concept experiment cross-compiling the Theora video decoder to JavaScript: ogv.js.

Well, I did some more hacking on it this weekend:

  • Color output? Check.
  • Streaming input? Check. (But no seeking yet, and buffering is aggressive on some browsers.)
  • Sound? Check. (But it’s not synced, choppy, and usually the wrong sample rate. And no IE or iOS support yet.)
  • Pretty interface to select any file from Wikimedia Commons’ Media of the Day archive? Check.
  • Laid some groundwork for separating the codec into a background Worker thread (not yet implemented).
  • See the readme for more info.

Try out the live demo with this pretty video of a Montreal subway station:

(Screenshot: video of the Jarry metro station)

Feel free to try building or hacking the source for fun. :)

BugTender updates

Can now post comments on bugs! Auth prompt at post time. No offline queuing yet.

Bug list defaults to showing recently filed bugs. Various search and sort options, incomplete but a start. Doesn’t handle huge return lists well; the server gives chronological order oldest first, and we want the opposite.

Initial appcache for offline usage. Limited as there’s no persistent data cache yet, but you can load the page when offline.

Restructured bug view to put comments first.

More details in bug list items.


Node.js and web workers: safe multiprocess CLI JS

I’ve been fiddling with the Node.js CLI/server-side JavaScript environment for a number of experiments, and have started using it to build batch tests for the new MediaWiki parser work.

Our parsing experiments are starting out in JavaScript so we can bundle them with the in-progress editing tools and run them on existing MediaWiki sites as a gadget for testing; as things mature, a PHP version will be written to take over from the older parser classes in MediaWiki proper.

For a simple batch test, I’ve got a JavaScript module written in Node.js that processes a Wikipedia XML data dump, runs each page’s text through the parser, and serializes it back to text to check for any round-tripping problems.

There are a few gotchas working with Node.js, but it’s generally a pretty nice way to get started!

Getting started: running browser code in Node

The parsing code so far has been designed to run in-browser, loaded by MediaWiki’s ResourceLoader system.

Some of that code uses jQuery helper functions like $.each(); jQuery can be provided easily via an npm module, but there’s still a trick.

In the browser, global vars from every script go implicitly into the shared global namespace (usually the ‘window’ object) and get picked up by other scripts, so they need only reference ‘$’ or ‘PegParser’. But in Node.js’s module system, implicit globals end up being private to each module! Only things exposed through module.exports can be accessed from your other scripts.

The wrapped module for jQuery already handles the export, so to make it available to my parser scripts I can copy it into Node’s explicit ‘global’ namespace object, which makes things available from any module:

// For now most modules only need this for $.extend and $.each :)
global.$ = require('jquery');

I was able to easily get my own modules to export their public functions/classes through module.exports if it’s present, letting me use the same code in the browser and in Node:

if (typeof module == "object") {
    module.exports.PegParser = PegParser;
}

Conveniently the PEG.js library already did that for me. :)

Web workers: safe multithreading

Recent web browser standards have introduced the Web Workers system for doing multi-threaded JavaScript safely. Unlike traditional threading, the workers don’t share any direct state or context with the parent; they’re essentially separate JavaScript programs that can communicate only through JSON message-passing.

While this is in some ways limited, it does mean you can skip most of the horrible synchronization primitives and confusing explody things from low-level threading that you might be used to in Java or C++!

It’s also been implemented for Node.js, as a module that spawns workers as subprocesses communicating over Unix domain sockets. It may not sound much fancier than just spawning a subprocess and piping explicitly, but you get a pre-defined message-passing architecture that’s more suitable for structured data than a raw byte stream pipe.

To scale automated batch tests over multiple CPU cores, I moved the parsing portion of the test runner into a worker. Spawn a few of those, and as revisions come in from the wiki dump we pass them down to idle worker threads, pausing the input when several are queued up to avoid overflowing the main thread.
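
The dispatch logic can be sketched independently of any particular worker library. This is an illustrative model of the queue-and-pause approach, not the actual test runner code; the worker objects are only assumed to expose a postMessage() method:

```javascript
// Sketch of the batch dispatcher: revisions go to idle workers when
// possible, queue up otherwise, and the caller pauses the dump reader
// once the queue passes a high-water mark.
function Dispatcher(workers, highWaterMark) {
  this.idle = workers.slice();
  this.queue = [];
  this.highWaterMark = highWaterMark;
}
Dispatcher.prototype.submit = function (revision) {
  if (this.idle.length) {
    this.idle.shift().postMessage(revision);
  } else {
    this.queue.push(revision);
  }
};
Dispatcher.prototype.workerDone = function (worker) {
  if (this.queue.length) {
    worker.postMessage(this.queue.shift());
  } else {
    this.idle.push(worker);
  }
};
Dispatcher.prototype.shouldPause = function () {
  return this.queue.length >= this.highWaterMark;
};
```

The dump reader checks shouldPause() after each submit and resumes once workerDone() drains the queue below the mark.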

Another nice thing is that this same model, in theory, should work for a browser-based version of the tests. (oooh!)

Web workers in Node: mostly sane

I hit a few gotchas which took me a while to figure out…

First, you might not see all errors from the worker script. Your life will be made much simpler if you define an onerror handler in your parent thread! Most of the explosive confusing errors below will at least show up there…

Another gotcha near the top of the list: the local path doesn’t seem to get properly set for the worker script; you can’t, for instance, do a require() module load with a relative path. I’m working around this by sending the path as an ‘init’ message to the worker, which can then start loading up its other modules and files.
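
The workaround amounts to a two-step handshake. A sketch of the worker side — the message shape, module filename, and handler factory are all illustrative, not my actual test code:

```javascript
// Sketch of the init-message workaround: the parent posts the base path
// in an 'init' message, and the worker resolves its require() calls
// against it. Written as a factory so the handler logic is testable.
function makeWorkerHandler(requireFn) {
  var basePath = null;
  return function onmessage(event) {
    var msg = event.data;
    if (msg.type === 'init') {
      basePath = msg.path;
      // Hypothetical module name, loaded via an absolute path.
      return requireFn(basePath + '/parser-worker-impl.js');
    }
    // ... handle work messages once initialized ...
  };
}
```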

The most insidious though… is that you can’t modify the ‘global’ global namespace object from your worker. Whaaaa? Since I need to do that to put jQuery and my own browser-oriented modules into the global namespace, for now I just split all the actual work bits into a second module that I require() from the worker itself. Amazingly, this works just fine — I just have to pass a couple of functions from the worker context itself into the other module. :D