Adaptive MPEG-DASH streaming and arbitrary channel muxing

Ever wanted to compose your own content stream without re-encoding video files? For example, take the video stream from this file and the audio from those files, then add another video, and so on?

We have created a simple service to highlight one of our technologies, which allows you to create an adaptive MPEG-DASH stream (the same kind of stream YouTube returns) and mix many streams from different files stored in Elliptics.

[screenshot: stream1]

In the example above, a neural network course video is playing with some of my saxophone music as the audio track.

The DASH stream is created on demand from different sources, so if you want to add a new soundtrack or link multiple videos one after another, there is no need to re-encode the video content each time. The service we created is rather simple and does not use every feature of the technology. In particular, playlist protection is not used, i.e. you can share a playlist with others, and anyone initializing a DASH player with our URL will be able to play it on their own site. We also do not enable stream insertion, i.e. playing the main video and inserting an additional video stream (or part of one) at some time offset. Both of these features exist in the server, but there are no interface controls for them in the service yet.
As a side note, this service can be used for gapless audio/video playback, i.e. no player reinitialization between tracks, unlike on popular video streaming sites.

So far we only support MPEG-DASH streaming, and while almost all modern browsers support it (Chrome, Firefox, IE and Opera), Safari is not on the list. We do not yet fully support HLS, and although Apple announced at WWDC 2016 that they will switch away from mpeg2ts streaming (probably becoming more compatible with the DASH stream), we still plan to implement HLS streaming too.

So, if you are interested in playing with the technology, you are welcome to try our playground service: http://stream.reverbrain.com
Do not be afraid of the mad design: just upload files (they will be tagged automatically) and create playlists!

HLS and DASH adaptive streaming and muxing from the same files from Elliptics

Our streaming engine Nulla is now capable of streaming in both HLS and MPEG-DASH formats from Elliptics storage. We do not require uploading multiple files (in mp4 and mp2ts container formats) or specially repacking frames in the media files (which is what mp4box is used for) for streaming.

The Elliptics streaming service Nulla creates the DASH/HLS stream in real time from the data files you have provided; it builds either an mp4 or mp2ts container on demand based on the streaming offset the client requests. This means clients are not forced to upload files in multiple formats (mp2ts for HLS, mp4 for MPEG-DASH) or to repack existing files to meet streaming demands (fragmenting the stream and putting an indexing box in front of the data).
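To illustrate the on-demand idea, here is a minimal sketch of how a packager might map a client's requested time offset to a media segment before building the container. The fixed segment length and the function itself are assumptions for illustration, not Nulla's actual API.

```python
# Hypothetical sketch: map a requested time offset to a segment index,
# as an on-demand packager might before assembling the mp4/mp2ts container.
SEGMENT_DURATION_MS = 10_000  # assumed fixed segment length

def locate_segment(offset_ms):
    """Return (segment index, start time of that segment in ms)."""
    index = offset_ms // SEGMENT_DURATION_MS
    return index, index * SEGMENT_DURATION_MS

# A seek to 25.4 s lands in the third segment (index 2), which starts at 20 s.
print(locate_segment(25_400))  # (2, 20000)
```

Only the segment the client actually asks for needs to be packaged, which is what makes repacking the whole file up front unnecessary.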

Since we build the stream from data frames and create the container in real time, we can adjust presentation and decode times and build an audio/video muxer. This allows us, for example, to stream one file and insert another into the middle of it, or to split a stream into chunks, each followed by a different file: 5 seconds from file X, 10 seconds from Y, 15 seconds from X starting at its 5-second offset, then 10 seconds from file Z, while the audio track has its own muxing, and so on.
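The muxing order above can be thought of as a timeline that maps stream time back to a source file and an offset inside it. The following sketch resolves such a timeline; the track fields loosely mirror the service's control JSON, but the function is hypothetical.

```python
# Illustrative timeline resolution: 5 s of X, 10 s of Y, 15 s of X starting
# at X's 5 s mark, then 10 s of Z. Given a stream offset, find which source
# file and which offset inside it should be read. Hypothetical helper.
def resolve(tracks, stream_ms):
    """Map a stream offset (ms) to (source key, offset inside that source)."""
    t = 0
    for track in tracks:
        if t <= stream_ms < t + track["duration"]:
            return track["key"], track.get("start", 0) + (stream_ms - t)
        t += track["duration"]
    raise ValueError("offset past end of playlist")

tracks = [
    {"key": "X", "duration": 5_000},
    {"key": "Y", "duration": 10_000},
    {"key": "X", "start": 5_000, "duration": 15_000},
    {"key": "Z", "duration": 10_000},
]

print(resolve(tracks, 7_000))   # ('Y', 2000): 7 s in, 2 s into Y
print(resolve(tracks, 16_000))  # ('X', 6000): back in X, past its 5 s start
```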

This is all controlled via a simple JSON API and guarded against embedding into hostile sites via random URLs with limited lifetimes.
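The post only says that the random URLs have a limited lifetime; one common way to build such links is an HMAC-signed expiry token, sketched below. The signing scheme, key, and URL layout are assumptions, not Nulla's actual implementation.

```python
# Sketch of a time-limited playback URL using an HMAC signature.
# This scheme is an assumption for illustration, not the real service's.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # hypothetical signing key

def make_url(playlist_id, ttl_s=300, now=None):
    """Build a playback URL that stops verifying after ttl_s seconds."""
    expires = (int(time.time()) if now is None else now) + ttl_s
    msg = ("%s:%d" % (playlist_id, expires)).encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return "/stream/%s?expires=%d&sig=%s" % (playlist_id, expires, sig)

def check_url(playlist_id, expires, sig, now=None):
    """Reject expired links and links with a forged signature."""
    if (int(time.time()) if now is None else now) > expires:
        return False
    msg = ("%s:%d" % (playlist_id, expires)).encode()
    good = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(good, sig)
```

A player embedding an expired or tampered URL simply gets a rejection, which is what keeps playlists from being hot-linked by hostile sites.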

We built a simple example page which shows this kind of muxing and streaming.
Depending on your browser, our servers will stream either HLS (desktop Safari and iOS) or MPEG-DASH (tested in current stable versions of Chrome, Firefox and IE) from two video and two audio files uploaded quite a while ago.
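A rough sketch of the kind of server-side User-Agent check that could drive this choice is below. Real detection is messier (Chrome's User-Agent also contains "Safari", which the check accounts for), and this helper is an illustration, not the service's actual logic.

```python
# Hypothetical protocol selection: HLS for Safari/iOS, MPEG-DASH otherwise.
def pick_protocol(user_agent):
    ua = user_agent.lower()
    # iOS devices and desktop Safari get HLS; note Chrome's UA also
    # contains "Safari", so it must be excluded explicitly.
    if ("iphone" in ua or "ipad" in ua or
            ("safari" in ua and "chrome" not in ua)):
        return "HLS"
    return "MPEG-DASH"

print(pick_protocol("Mozilla/5.0 Chrome/53.0 Safari/537.36"))  # MPEG-DASH
print(pick_protocol("Mozilla/5.0 (iPhone) Safari/602.1"))      # HLS
```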

Enjoy!

http://video.reverbrain.com/index.html

The source code of this simple HTML page shows how simple and convenient our API is.

The next task is to build a cache for preprocessed chunks and distribute it among multiple geographically distributed Elliptics nodes in your cluster. We also plan to add automatic transcoding of the video stream into lower bitrates which the browser will select automatically (that is what makes HLS/DASH streaming adaptive); currently you have to upload files in multiple bitrates and reference them via the API to create the appropriate playlist. This can be automated and will be implemented in the next release.

Another major goal is to implement live broadcasting from an application (or browser) into Elliptics, which will distribute your broadcast via HLS/DASH to thousands of simultaneous viewers.

Adaptive MPEG-DASH streaming and multiple stream muxing in Elliptics

We built the Elliptics distributed storage quite a long time ago; it is a mature technology that just works when you have to store medium-sized and large objects in geographically distributed locations.

It also supports progressive download: FLV and byte-range (tutorial, mp4) streaming. But we wanted more: native adaptive streaming with channel muxing from Elliptics.

Basically, we wanted to upload multiple mp4 files into Elliptics and then stream them to the client in any required order: 10 seconds of the first file, then 20 seconds of the second, then 10 minutes of the first starting from its 10th second, while another track (such as sound) plays in the background, mixed in its own way. And preferably with adaptive bitrate switching when the client moves between slower and faster networks, like 3G to WiFi and vice versa.

Another example of this muxing feature is gapless music playback: you listen to songs one after another without delays in between, without player reinitialization, and without waiting for a song (or part of one) to download after the previous one has stopped.

There are many technologies that implement streaming: HLS, HDS, MPEG-DASH, RTMP, RTSP and others. HLS looks like the winner for now: it is backed by Apple and supported by iOS and Safari, but MPEG-DASH is backed by a large group of vendors and is supported by all other browsers, including TVs, except iOS. Since Flash is going to die now that youtube.com, netflix.com and other large vendors have stopped streaming in that format, I believe MPEG-DASH will become more and more popular (its market share to date is rather small) and eventually only HLS and MPEG-DASH will be the default streaming protocols.

There are a fair number of streaming services which support both HLS and MPEG-DASH, but most of the time it takes quite a lot of effort to build a fault-tolerant streaming service on top of them that works with distributed storage. And none of them supports the audio/video muxing described above. Actually, the streaming technology itself partially supports this feature: for example, MPEG-DASH has the notion of a "Period", and multiple periods are played one after another. But this is a quite advanced feature which is not yet supported by the reference player dash.js (there are commercially available players which somewhat support it, though). There are also open implementation questions about whether a player can reinitialize itself to play multiple periods without interrupting the stream.

We decided to implement MPEG-DASH in our Elliptics streaming service to support both adaptive streaming and stream muxing. To implement this, we create the whole mpeg container at runtime and only read sample data from the audio/video files stored in Elliptics. To allow muxing, though, all files in the stream must be encoded the same way.
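The key container trick is the timestamp adjustment: when samples from a second file are appended, their decode and presentation times must be shifted so the combined stream stays monotonic. The sketch below shows the idea on simplified (dts, pts) pairs; real mp4 timescale and sample-table handling is more involved, and these structures are assumptions for illustration.

```python
# Minimal sketch of timestamp rebasing when concatenating two tracks.
# Samples are simplified to (dts, pts) pairs in milliseconds.
def rebase(samples, offset_ms):
    """Shift each (dts, pts) pair past the end of the previous track."""
    return [(dts + offset_ms, pts + offset_ms) for dts, pts in samples]

track1 = [(0, 0), (40, 40), (80, 80)]  # three frames at 25 fps
track2 = [(0, 0), (40, 40)]
track1_end_ms = 120                    # duration of track1

combined = track1 + rebase(track2, track1_end_ms)
print(combined)  # [(0, 0), (40, 40), (80, 80), (120, 120), (160, 160)]
```

Because timestamps stay continuous, the player never sees a discontinuity and never reinitializes, which is exactly what enables the gapless playback described earlier.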

Using our technology one can implement 5-second muxing (5 seconds of the first video, then 5 seconds of the second, then the next 5 seconds of the first, and so on), as in the example below, using the following control JSON:

"tracks": [
{
  "bucket": "b1",
  "key": "video_1.mp4",
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_2.mp4",
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_1.mp4",
  "start": 5000,
  "duration": 5000
},
{
  "bucket": "b1",
  "key": "video_2.mp4",
  "start": 5000,
  "duration": 5000
}
]
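A client can compute properties of the resulting stream directly from this control JSON; for example, the total stream duration is just the sum of the track durations. The helper below is hypothetical and only the field names come from the example above.

```python
# Hypothetical helper: load a control JSON like the one above and report
# the total stream duration. Field names follow the example; the helper
# itself is not part of the real API.
import json

control = json.loads("""
{
  "tracks": [
    {"bucket": "b1", "key": "video_1.mp4", "duration": 5000},
    {"bucket": "b1", "key": "video_2.mp4", "duration": 5000},
    {"bucket": "b1", "key": "video_1.mp4", "start": 5000, "duration": 5000},
    {"bucket": "b1", "key": "video_2.mp4", "start": 5000, "duration": 5000}
  ]
}
""")

total_ms = sum(t["duration"] for t in control["tracks"])
print(total_ms)  # 20000: four 5-second slices
```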

But enough words: show me the result.

Here it is: two video and two audio channels muxed in the way described above, without interruptions or gaps. All four files are stored in Elliptics as ordinary objects.

http://video.reverbrain.com/index.html

Please note that the Elliptics and streaming servers are located in the USA, which adds 150+ ms from Russia to fetch the first chunk (or any chunk you seek to that is not yet in the cache); otherwise it is very fast.

You can check the source of the HTML page above to see how the muxing is set up; you can play with different settings to watch the whole files or mix them in other ways.

Enjoy, and stay tuned!