From XiphWiki
Revision as of 11:06, 7 March 2014 by Martin.leese (talk | contribs) (→‎Notes: Made comment impersonal)

At least libogg, libvorbis, and libtheora can be compiled to JavaScript using Emscripten with only slight modifications.


An experimental, work-in-progress script to build these is available at https://github.com/brion/ogv.js, based on previous work for AudioCogs integration: https://github.com/devongovett/ogg.js

The build scripts make slight changes to the libraries' configure scripts; whether these changes are strictly necessary has not yet been investigated in detail.

A slight fix is also needed in libtheora: https://github.com/brion/theora/commit/06a0e4acf9c35f4bd31b8788a8a573cb89262333, which is probably safe to upstream (this should be verified).


Playback demo at: https://brionv.com/misc/ogv.js/demo/

Video works in all current browsers; audio works where the Web Audio API is available (latest Firefox, Safari, and Chrome), or via a Flash audio shim in IE.
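A minimal sketch of the kind of backend selection a player might do at startup; the function name and the returned backend labels are illustrative, not ogv.js's actual API:

```javascript
// Pick an audio output backend based on available browser features.
// Takes the global object as a parameter so it can be tested outside
// a browser; names here are illustrative, not ogv.js's real API.
function pickAudioBackend(global) {
  if (global.AudioContext || global.webkitAudioContext) {
    return 'webaudio'; // latest Firefox, Safari, Chrome
  }
  if (global.ActiveXObject !== undefined) {
    return 'flash';    // IE: fall back to the Flash audio shim
  }
  return 'none';       // no usable audio output
}
```

In a page this would be called as `pickAudioBackend(window)` before constructing the player.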


Notes

  • libtheora
    • Must pass --disable-asm
    • needs a slight fix to a function signature to silence an error from the Emscripten compiler
  • libvorbis
    • Experiments with using the integer-only Tremor decoder instead of libvorbis showed it produces significantly smaller JavaScript code. Performance, however, was better with libvorbis, so the switch to Tremor was not pursued.
  • Internet Explorer
    • works on IE 10 and 11 only
    • Flash audio shim works pretty well
  • Safari
    • works on OS X 10.8/Safari 6.1, OS X 10.9/Safari 7, and iOS 7
    • Safari JIT crash seems to have been resolved by moving to the new emscripten LLVM backend
    • Audio must be triggered from a UI event to work on iOS (no autoplay)
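The iOS restriction above means audio startup has to be deferred until a user gesture arrives. A minimal sketch of one way to gate it; the GestureGate helper is hypothetical, not part of ogv.js:

```javascript
// Hypothetical helper: queue actions (e.g. starting audio output)
// until the first user gesture, as iOS requires for audio playback.
function GestureGate() {
  this.unlocked = false;
  this.pending = [];
}
// Run fn now if a gesture has already happened, otherwise defer it.
GestureGate.prototype.run = function (fn) {
  if (this.unlocked) {
    fn();
  } else {
    this.pending.push(fn);
  }
};
// Call from a UI event handler (e.g. touchend) to release the queue.
GestureGate.prototype.unlock = function () {
  this.unlocked = true;
  this.pending.forEach(function (fn) { fn(); });
  this.pending = [];
};
```

Usage in a page would look like `gate.run(startAudio)` at load time, with `element.addEventListener('touchend', function () { gate.unlock(); })` supplying the gesture.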

Performance seems adequate on recent-ish desktops/laptops in the latest browsers, but is woefully poor on most mobile phones and tablets. The 64-bit iPhone 5s more or less hits decode speed targets at ~360p video; other iOS devices struggle to play 160p.

HTML5 integration

The current playback demo simply outputs to a <canvas> element, performing YUV-to-RGB conversion in software.
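The software conversion is essentially a fixed matrix multiply per pixel. A sketch of the BT.601 "studio swing" math for a single pixel (the function name is illustrative; a real converter would process whole planes and account for chroma subsampling):

```javascript
// BT.601 studio-swing YUV -> RGB for one pixel. Y is in [16, 235],
// U and V are centered on 128; results are clamped to [0, 255].
function yuvToRgb(y, u, v) {
  var c = y - 16, d = u - 128, e = v - 128;
  function clamp(x) { return Math.max(0, Math.min(255, Math.round(x))); }
  return [
    clamp(1.164 * c + 1.596 * e),             // R
    clamp(1.164 * c - 0.391 * d - 0.813 * e), // G
    clamp(1.164 * c + 2.018 * d)              // B
  ];
}
```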

It is not clear that playback could be integrated into an actual <audio> or <video> element, but similar JavaScript interfaces could be wrapped around a <canvas>.
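A minimal sketch of what such a wrapper might look like: a <video>-like surface (play/pause/currentTime/ended) around a decode-and-draw loop. The CanvasPlayer name and its decode/draw callbacks are hypothetical, not ogv.js's API:

```javascript
// Hypothetical <video>-like wrapper around a canvas-drawing decoder.
// decodeFrame() returns {duration, pixels} per frame, or null at end
// of stream; drawFrame(pixels) would blit to a <canvas>.
function CanvasPlayer(decodeFrame, drawFrame) {
  this.paused = true;
  this.currentTime = 0; // seconds, like HTMLMediaElement
  this.ended = false;
  this._decode = decodeFrame;
  this._draw = drawFrame;
}
CanvasPlayer.prototype.play = function () { this.paused = false; };
CanvasPlayer.prototype.pause = function () { this.paused = true; };
// One step of the decode loop; a real player would schedule this with
// requestAnimationFrame and sync against the audio clock.
CanvasPlayer.prototype.step = function () {
  if (this.paused || this.ended) return;
  var frame = this._decode();
  if (frame === null) {
    this.ended = true;
    this.paused = true;
    return;
  }
  this._draw(frame.pixels);
  this.currentTime += frame.duration;
};
```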

WebRTC also does not appear to allow custom JavaScript codecs to be plugged in easily, but audio and video frames could be sent over its data channels as ArrayBuffers.
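A minimal sketch of framing a compressed frame as an ArrayBuffer for a data channel; the header layout (an 8-byte float64 timestamp followed by the payload) is purely illustrative:

```javascript
// Illustrative framing for sending a media frame over a data channel:
// an 8-byte big-endian float64 timestamp, then the payload bytes.
function packFrame(timestamp, payload /* Uint8Array */) {
  var buf = new ArrayBuffer(8 + payload.length);
  new DataView(buf).setFloat64(0, timestamp);
  new Uint8Array(buf, 8).set(payload);
  return buf; // suitable for dataChannel.send(buf)
}
function unpackFrame(buf /* ArrayBuffer */) {
  return {
    timestamp: new DataView(buf).getFloat64(0),
    payload: new Uint8Array(buf, 8)
  };
}
```

On the receiving side, `unpackFrame(event.data)` in the channel's message handler would recover the timestamp and payload for the decoder.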

Live encoding connected to getUserMedia may be possible; this has not yet been attempted.

Related projects