[whatwg] Extending HTML 5 video for adaptive streaming

Bob Lund B.Lund at CableLabs.com
Fri Jul 1 08:40:08 PDT 2011


Hi Aaron,

Here are some other aspects of script-controlled adaptive bit rate that occur to me; perhaps you have already considered these.

1) I guess script will be responsible for maintaining its own playback buffer, monitoring buffer behavior, and selecting the appropriate bit rate for new fragments (a rough sketch of this kind of logic follows below). Are there any other network-related events or metrics script might need in order to determine which bit rate to fetch for the next segment? Is there any other information from the user agent about playback performance that script might need?

2) If a media resource is a multi-track resource, then it would seem script will also have to fetch fragments for those tracks, which implies that the audio element would need the append method as well. Timed text tracks would also need to be processed and their cues appended (see the second sketch below).
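
To illustrate 1), the selection logic in script might look something like the following sketch (the bit-rate ladder, the thresholds, and the function's inputs are just assumptions for illustration):

  // Pick the bit rate for the next fragment from measured throughput and
  // the amount of playback buffer that remains.
  var BITRATES = [350000, 700000, 1500000, 3000000]; // bits/sec, assumed ladder

  function selectNextBitrate(lastFragmentBits, downloadMs, bufferedSec) {
    var throughput = lastFragmentBits / (downloadMs / 1000); // bits/sec
    // Be conservative when the buffer is running low.
    var budget = (bufferedSec < 5) ? throughput * 0.5 : throughput * 0.8;
    var choice = BITRATES[0];
    for (var i = 0; i < BITRATES.length; i++) {
      if (BITRATES[i] <= budget) choice = BITRATES[i];
    }
    return choice;
  }

And to illustrate 2), a script driving a multi-track resource might end up doing something like this (appendData() on <audio> is the hypothetical extension mentioned above, the getNext*() helpers are placeholders, and the cue constructor is only an assumption about the timed text API):

  var video = document.getElementById('v');
  var audio = document.getElementById('a');
  var textTrack = video.addTextTrack('subtitles', 'English', 'en');

  function onNeedMoreData() {
    video.appendData(getNextVideoFragment());  // proposed method on <video>
    audio.appendData(getNextAudioFragment());  // would require the same method on <audio>
    getNextTextCues().forEach(function (c) {
      // Assumed cue shape and constructor; details depend on the text track spec.
      textTrack.addCue(new TextTrackCue(c.id, c.start, c.end, c.text));
    });
  }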

There is a new media pipeline task force in the Web and TV IG (http://www.w3.org/2011/webtv/wiki/MPTF) that is also planning to examine this topic. You may want to participate.

Regards,
Bob Lund


> -----Original Message-----
> From: whatwg-bounces at lists.whatwg.org [mailto:whatwg-
> bounces at lists.whatwg.org] On Behalf Of Aaron Colwell
> Sent: Thursday, June 30, 2011 10:59 AM
> To: whatwg at whatwg.org
> Subject: [whatwg] Extending HTML 5 video for adaptive streaming
> 
> Hi,
> 
> I've been working on an adaptive streaming prototype that uses
> JavaScript to fetch chunks of media and feeds them to the video tag for
> decoding. The idea is to let the adaptation algorithm and CDN
> interactions happen in JavaScript so that they can evolve without the
> need for browser changes. I'm looking for some guidance about the
> preferred method for adding this type of functionality. I'm new to this
> process so please bear with me.
> 
> My initial implementation is built around WebM, but I believe this could
> work for Ogg & MP4 as well. The basic idea is to initialize the video
> tag with stream initialization data (ie WebM info & tracks elements) via
> the <video> src attribute and then send media chunks (ie WebM clusters)
> to the tag via a new appendData() method on <video>. Here is a simple
> example of what I'm talking about.
> 
>   <video id="v" autoplay> </video>
>   <script>
>     function needMoreData(e) {
>       e.target.appendData(getNextCluster());
>     }
> 
>     function onSeeking(e) {
>       var video = e.target;
>       video.appendData(findClusterForTime(video.currentTime));
>     }
> 
>     var video = document.getElementById('v');
> 
>     video.addEventListener('loadstart', needMoreData);
>     video.addEventListener('stalled', needMoreData);
>     video.addEventListener('seeking', onSeeking);
> 
>     video.src = URL.createObjectURL(createStreamInitBlob());
>   </script>
> 
> AppendData() expects to receive a Uint8Array that contains WebM cluster
> elements. The first cluster passed to appendData() initializes the
> starting playback position. Also, after a seeking event fires, the first
> appendData() call updates the current position to the seek point.
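> 
> For illustration, the getNextCluster()/findClusterForTime() helpers in
> the example above could be backed by Range requests against a cluster
> index parsed out of the WebM Cues element; written asynchronously it
> might look something like the following (the index format and the URL
> are just assumptions):
> 
>   var clusterIndex = [];  // assumed: [{time: sec, offset: bytes, size: bytes}, ...]
>   var nextCluster = 0;
> 
>   function fetchCluster(entry, callback) {
>     var xhr = new XMLHttpRequest();
>     xhr.open('GET', '/media/stream.webm');
>     xhr.responseType = 'arraybuffer';
>     xhr.setRequestHeader('Range',
>         'bytes=' + entry.offset + '-' + (entry.offset + entry.size - 1));
>     xhr.onload = function () { callback(new Uint8Array(xhr.response)); };
>     xhr.send();
>   }
> 
>   function appendNextCluster(video) {
>     fetchCluster(clusterIndex[nextCluster++], function (bytes) {
>       video.appendData(bytes);
>     });
>   }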
> 
> I've also been looking at the WebRTC MediaStream API and was wondering
> if it makes more sense to create an object similar to the
> LocalMediaStream object.
> This has the benefits of unifying how media streams are handled
> independent of whether they come from a camera or a JavaScript based
> streaming algorithm. This could also enable sending the media stream
> through a Peer-to-peer connection instead of only allowing a camera as a
> source. Here is an example of the type of object I'm talking about.
> 
> interface GeneratedMediaStream : MediaStream {
>   void init(in DOMString type, in UInt8Array init_data);
>   void appendData(in DOMString trackId, in UInt8Array data);
>   void endOfStream();
> 
>   readonly attribute MultipleTrackList audioTracks;
>   readonly attribute ExclusiveTrackList videoTracks;
> };
> 
> type - Identifies the type of stream being generated (ie
> video/x-webm-cluster-stream or video/ogg-page-stream).
> init_data - Provides initialization data that indicates the number of
> tracks, codec configs, etc. (ie WebM info & tracks elements or Ogg
> header pages).
> trackId - Indicates which track the data is for. If this is an empty
> string, then multiplexed data is being passed in. If not empty, trackId
> matches the id of a track in the TrackList objects.
> data - Media data chunk (ie WebM cluster or Ogg page). Data is expected
> to have monotonically increasing timestamps, no gaps, etc.
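> 
> As a usage sketch (assuming the object can be constructed directly and
> attached to a <video> element via createObjectURL, neither of which is
> settled, and with getWebMHeaders()/getNextCluster() as placeholder
> functions):
> 
>   var stream = new GeneratedMediaStream();
>   stream.init('video/x-webm-cluster-stream', getWebMHeaders());
> 
>   var video = document.getElementById('v');
>   video.src = URL.createObjectURL(stream);
> 
>   function needMoreData() {
>     // An empty trackId means the appended data is multiplexed.
>     stream.appendData('', getNextCluster());
>   }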
> 
> Here are my questions:
> - Is there a preference for appendData() vs new MediaStream object?
> - If the MediaStream object is preferred, should this be constructed
> through Navigator.getUserMedia()? I'm unclear about what the criteria are
> for adding this to Navigator vs. allowing direct object construction.
> - Are there existing efforts along these lines? If so, please point me
> to them.
> 
> Thanks for your help,
> 
> Aaron


