Adaptive MPEG-DASH streaming and multiple stream muxing in Elliptics

We built the Elliptics distributed storage quite a long time ago; it is a mature technology which just works when you have to store medium and large objects in geographically distributed locations.

It also supports progressive download: FLV and byte-range (tutorial, mp4) streaming. But we wanted more: native adaptive streaming with channel muxing served straight from Elliptics.
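For the curious, progressive byte-range download is plain HTTP: the client asks for a slice of the object and gets a 206 Partial Content response back. Here is a minimal TypeScript sketch; the proxy URL layout is made up, only the Range header mechanics are the point.

async function fetchFirstMegabyte(url: string): Promise<Uint8Array> {
  // Ask the storage HTTP proxy for the first megabyte of the object;
  // the server answers with 206 Partial Content and only that byte range.
  const response = await fetch(url, { headers: { Range: "bytes=0-1048575" } });
  return new Uint8Array(await response.arrayBuffer());
}

// Hypothetical proxy URL layout: /<bucket>/<key>
fetchFirstMegabyte("http://storage.example.com/b1/video_1.mp4");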

Basically, we wanted to upload multiple mp4 files into Elliptics and then stream them to the client in an arbitrary order: 10 seconds of the first file, then 20 seconds of the second, then 10 minutes of the first starting from its 10th second, while another track (for example, audio) plays in the background and is mixed in its own way. And preferably with adaptive bitrate switching when the client moves between slower and faster networks, like 3G to Wi-Fi and vice versa.

Another example of this muxing feature is gapless music playback: you listen to songs one after another without delays in between, without player reinitialization, and without waiting for the next song (or its part) to be downloaded when the previous one has finished.

There are many streaming technologies: HLS, HDS, MPEG-DASH, RTMP, RTSP and others. HLS looks like the winner for now: it is backed by Apple and supported by iOS and Safari, while MPEG-DASH is backed by a large group of vendors and supported by all other browsers and by TVs, but not by iOS. Since Flash is dying now that youtube.com, netflix.com and other large vendors have stopped streaming in that format, I believe MPEG-DASH will become more and more popular (its market share to date is rather small) and eventually HLS and MPEG-DASH will be the only default streaming protocols.

There is a fair number of streaming services which support both HLS and MPEG-DASH, but most of the time it takes quite a lot of effort to build a fault-tolerant streaming service on top of them that works with distributed storage. And none of them supports the audio/video muxing described above. Actually, the streaming technology itself partially supports this feature: in MPEG-DASH there is a notion of a “Period”, and multiple Periods are played one after another. But this is quite an advanced feature which is not yet supported by the reference player Dash.js (there are commercially available players which somewhat support it, though), and it is an open implementation question whether a player can reinitialize itself to play multiple Periods without interrupting the stream.
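For reference, a two-Period manifest looks roughly like this (a minimal sketch with made-up file names, durations and bitrates); the player is expected to play the Periods back to back, and it is exactly at this transition where the reinitialization problems show up:

<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011"
     mediaPresentationDuration="PT20S">
  <Period id="p1" duration="PT10S">
    <AdaptationSet mimeType="video/mp4" codecs="avc1.42c01e">
      <Representation id="v1" bandwidth="1000000">
        <BaseURL>video_1.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
  <Period id="p2" duration="PT10S">
    <AdaptationSet mimeType="video/mp4" codecs="avc1.42c01e">
      <Representation id="v2" bandwidth="1000000">
        <BaseURL>video_2.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>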

We decided to implement MPEG-DASH in our Elliptics streaming service to support both adaptive streaming and stream muxing. To do this we build the whole MPEG container at runtime and only read sample data from the audio/video files stored in Elliptics. To allow muxing, though, all files in the stream must be encoded the same way.
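Conceptually the segment generation looks like the sketch below. This is not the actual implementation, just an illustration of the idea; readRange() and buildFragment() are hypothetical helpers. The per-file sample tables are parsed once from the source moov boxes, and for every DASH segment request only the referenced sample bytes are read from storage and wrapped into a freshly generated moof/mdat fragment.

interface Sample { offset: number; size: number; timestampMs: number; }

async function serveSegment(
  samples: Sample[],              // sample table of the source mp4, parsed in advance
  fromMs: number, toMs: number,   // segment boundaries taken from the control json
  readRange: (offset: number, size: number) => Promise<Uint8Array>,       // byte-range read from Elliptics
  buildFragment: (media: Uint8Array[], baseTimeMs: number) => Uint8Array  // generates a new moof+mdat
): Promise<Uint8Array> {
  // Pick only the samples that fall into the requested time window...
  const wanted = samples.filter(s => s.timestampMs >= fromMs && s.timestampMs < toMs);
  // ...read their raw bytes from storage...
  const media = await Promise.all(wanted.map(s => readRange(s.offset, s.size)));
  // ...and wrap them into a container fragment generated at runtime.
  return buildFragment(media, fromMs);
}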

Using our technology one can implement 5-second muxing (5 seconds of the first video, then 5 seconds of the second, then the next 5 seconds of the first and so on), as in the example below, using the following control JSON (start and duration are given in milliseconds):

"tracks": [
{
  "bucket": "b1",
  "key": "video_1.mp4",
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_2.mp4",
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_1.mp4",
  "start": 5000,
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_2.mp4",
  "start": 5000,
  "duration": 5000
}
]

But enough words, show me the result.

Here it is: 2 video and 2 audio channels muxed in the way described above, without interruptions or gaps. All 4 files are stored in Elliptics as ordinary objects.

http://video.reverbrain.com/index.html

Please note that the Elliptics and streaming servers are located in the USA, which adds 150+ ms from Russia to fetching the first chunk (or to seeking into an area which is not yet in the cache); otherwise it is very fast.

You can check the source of the HTML page above to see how the muxing is set up; you can play with different settings, watch the whole files or mix them in other ways.
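If you do not want to dig through the page source, the client side boils down to something like this (a sketch assuming the current dash.js API; the manifest URL is hypothetical, and the "tracks" control JSON shown above is assumed to be configured on the streaming server which generates that manifest):

import dashjs from "dashjs";   // or use the dashjs global provided by dash.all.min.js

const player = dashjs.MediaPlayer().create();
player.initialize(
  document.querySelector("video")!,                      // the <video> element on the page
  "http://streaming.example.com/playlist/manifest.mpd",  // hypothetical URL of the generated manifest
  true                                                   // autoplay
);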

Enjoy, and stay tuned!
