
Support MPEG DASH for multiple webm files
Open, LowPublic

Description

Author: mdale

Description:
Google's page here outlines MPEG-DASH support for WebM:
http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-a-webm-dash-presentation

This requires some minor changes to the encode pipeline, plus the inclusion of a JavaScript library in the player component to support adaptive streaming.

Note that Chrome Canary will shortly support the peer-to-peer data channel, which can also serve as a source for appending chunks of WebM streams. Getting these components in place could therefore also facilitate p2p distribution experiments for "short tail" content, for example fundraiser landing pages or Google Doodle pages with video.
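For illustration, the encode-pipeline change amounts to producing the encoded representations plus a DASH manifest (.mpd) describing them. A minimal sketch of building such a manifest with Python's standard XML tooling, where the representation IDs, bitrates, and file names are all hypothetical:

```python
# Sketch of generating a minimal WebM DASH manifest (.mpd), assuming
# the transcodes have already been produced as separate .webm files.
# Representation IDs, bandwidths, and URLs below are made-up examples.
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"

def build_mpd(representations, duration_s):
    """Build a static (on-demand) MPD referencing pre-encoded WebM files."""
    ET.register_namespace("", MPD_NS)
    mpd = ET.Element(f"{{{MPD_NS}}}MPD", {
        "type": "static",
        "mediaPresentationDuration": f"PT{duration_s}S",
        "profiles": "urn:mpeg:dash:profile:webm-on-demand:2012",
    })
    period = ET.SubElement(mpd, f"{{{MPD_NS}}}Period")
    aset = ET.SubElement(period, f"{{{MPD_NS}}}AdaptationSet",
                         {"mimeType": "video/webm"})
    for rep_id, bandwidth, url in representations:
        rep = ET.SubElement(aset, f"{{{MPD_NS}}}Representation",
                            {"id": rep_id, "bandwidth": str(bandwidth)})
        ET.SubElement(rep, f"{{{MPD_NS}}}BaseURL").text = url
    return ET.tostring(mpd, encoding="unicode")

# Hypothetical transcode outputs at three resolutions:
xml_text = build_mpd([
    ("360p", 500_000, "sample_360p.webm"),
    ("720p", 1_500_000, "sample_720p.webm"),
    ("1080p", 4_000_000, "sample_1080p.webm"),
], duration_s=120)
```

A real WebM on-demand manifest also carries codec, segment-index, and initialization-range details; this only shows the overall shape of what the pipeline would need to emit.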


Version: unspecified
Severity: enhancement

Details

Reference
bz42591

Event Timeline

bzimport raised the priority of this task to Low. Nov 22 2014, 1:08 AM
bzimport set Reference to bz42591.

jgerber wrote:

Adding sample_muxer as another tool required for encoding might not be ideal; we should check whether this can be merged into avconv as an option.

Also, this requires audio and video to be in separate files, which most likely requires some changes to the way transcodes are handled.

MPEG-DASH style streaming would also be good for the ogv.js player (or a future webm.js variant) in non-WebM-supporting browsers, as it would enable things that are difficult when loading a large video file via XMLHttpRequest:

  • true streaming in Safari (currently it can only progressively download into an in-memory buffer)
  • quick seeking to any position in the file
  • adaptive switching of resolutions based on CPU speed
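The adaptive-switching bullet above can be sketched as a simple selection rule: pick the highest-bitrate representation the client can sustain, with some headroom so a slow CPU or network can keep up. The representation list and headroom factor here are assumptions:

```python
# A minimal sketch of adaptive representation selection: choose the
# highest-bandwidth rung that fits measured throughput, with headroom.
# The rungs and the 0.8 headroom factor are hypothetical.

def pick_representation(reps, measured_bps, headroom=0.8):
    """reps: list of (id, required_bps), sorted ascending by bitrate."""
    chosen = reps[0]  # always fall back to the lowest rung
    for rep in reps:
        if rep[1] <= measured_bps * headroom:
            chosen = rep
    return chosen

reps = [("240p", 250_000), ("360p", 500_000), ("720p", 1_500_000)]
print(pick_representation(reps, measured_bps=1_000_000))  # → ('360p', 500000)
```

A real player would re-measure throughput per segment and also factor in decode performance, but the per-segment decision point is what chunked delivery buys.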

The smaller file chunks should also be friendlier to our caching infrastructure, in theory.

Update ping:

ogv.js is now able to stream a large file via multiple Range: requests, and will be able to do adaptive streaming with them in the future, but these are expensive with our current infrastructure (swift backend + varnish frontend)...

  • Faidon reports there have been some usage spikes that have taken out swift in the past!
  • smaller pieces will allow the Varnish layer to cache pieces of popular files, making usage spikes more survivable
  • smaller pieces will allow the swift and varnish layers to shard the file over multiple servers MUCH more efficiently
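The caching argument in these bullets can be made concrete: with fixed-size pieces, any byte-range request touches only a small, predictable set of chunk objects, each of which the Varnish and swift layers can cache and shard independently. A sketch, with a hypothetical 4 MiB chunk size:

```python
# Sketch of range-to-chunk mapping. The chunk size is an assumption to
# be tuned against whatever threshold Varnish handles well.

CHUNK_SIZE = 4 * 1024 * 1024  # hypothetical 4 MiB pieces

def chunks_for_range(start, end, chunk_size=CHUNK_SIZE):
    """Return the chunk indices covering bytes [start, end] inclusive."""
    return list(range(start // chunk_size, end // chunk_size + 1))

# A 10-byte read straddling a chunk boundary only touches two objects:
print(chunks_for_range(4 * 1024 * 1024 - 5, 4 * 1024 * 1024 + 4))  # → [0, 1]
```

A usage spike on one popular file then spreads across many small, individually cacheable objects rather than hammering one large one.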

Things to check:

  • what's the chunk threshold that Varnish can deal with well?
  • is there suitable tooling for chopping up the transcodes?
    • if it exists for webm, does it also exist for ogv?
  • what's best handling of seeking?
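On the seeking question in the list above: with fixed-duration segments, the player can map a seek target directly to a segment index instead of probing a large file with Range requests. A sketch assuming a hypothetical 10-second segment duration:

```python
# Sketch of segment-based seeking. The 10 s segment duration and the
# URL naming scheme are assumptions.

SEGMENT_DURATION_S = 10.0

def segment_for_time(t_seconds, seg_duration=SEGMENT_DURATION_S):
    """Map a seek target to (segment index, offset within that segment)."""
    index = int(t_seconds // seg_duration)
    offset = t_seconds - index * seg_duration
    return index, offset

index, offset = segment_for_time(93.5)
print(f"fetch seg-{index:05d}.webm, start decoding {offset:.1f}s in")
```

The player would then fetch that one segment, decode from its first keyframe, and discard frames up to the offset.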

Another warning: if we just add this on top of our existing single-chunk transcodes, we'll double the total disk space required. Is this OK, or should we trim the number of downloadable transcodes to reduce the duplication?

Video.js has a plugin to support DASH, by the way.

Chatted with @Matanya at wmhack2016; some notes https://etherpad.wikimedia.org/p/WikiHack16-Video

Note that it would be good to support DASH-ish chunking both for transcoded output *and* for source files, as we're starting to run up against upload size limits with long HD recordings or moderate-length 4K recordings.

For chunked source files, we'd basically abstract the raw file into a combination of chunks of the .webm stream, plus a generated .mpd manifest. When checking the file out for transcoding, we would check out all the chunks and the .mpd and pass the .mpd into ffmpeg. In theory, ffmpeg should be able to read out the subfiles. (Alternately, assemble all the chunks locally. Alternately still, stream all the chunks out of swift directly!)
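One way to realize the "assemble all the chunks locally" alternative is to hand ffmpeg a concat-demuxer file list built from the checked-out chunks. A sketch, with hypothetical chunk names (feeding the .mpd into ffmpeg directly is the other option described above):

```python
# Sketch of local chunk assembly: write an ffmpeg concat-demuxer list
# so the transcoder treats the checked-out chunks as one input.
# The chunk file names here are hypothetical.

def write_concat_list(chunk_paths, list_path="input.txt"):
    with open(list_path, "w") as f:
        for path in chunk_paths:
            # ffmpeg's concat demuxer expects one "file '<path>'" per line
            f.write(f"file '{path}'\n")
    return list_path

write_concat_list(["chunk-000.webm", "chunk-001.webm", "chunk-002.webm"])
# The transcode step would then run roughly:
#   ffmpeg -f concat -safe 0 -i input.txt ... <usual transcode options>
```

This keeps the transcode job oblivious to how swift stores the pieces, at the cost of local scratch space for the checkout.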

Tricky things:

  • need to figure out how to hook into MediaWiki upload systems to chunk the files appropriately
  • need to figure out how to expose the raw super-big file as a direct download on File: page at top and in history (disable it? provide a link? some kind of magical swift feature for abstracting generated info?)
  • maybe more tricky things...

Will open some additional tasks after a little more thought for division of labor. :)

@brion: Did you find some time to do that? :)