On Fri, Aug 20, 2010 at 11:08 AM, Ian Hickson <ian@hixie.ch> wrote:
<div class="im">On Tue, 25 May 2010, Silvia Pfeiffer wrote:<br>
><br>
> We've in the past talked about how there is a need to adapt the bitrate<br>
> version of a audio or video resource that is being delivered to a user<br>
> agent based on the available bandwidth on the network, the available CPU<br>
> cycles, and possibly other conditions.<br>
><br>
</div><div class="im">> It has been discussed to do this using @media queries and providing<br>
> links to alternative versions of a media resources through the <source><br>
> element inside it. But this is a very inflexible solution, since the<br>
> side conditions for choosing a bitrate version may change over time and<br>
> what is good at the beginning of video playback may not be good 2<br>
> minutes later (in particular if you're on a mobile device driving<br>
> through town).<br>
><br>
</div><div class="im">> Further, we have discussed the need for supporting a live streaming<br>
> approach such as RTP/RTSP - but RTP/RTSP has its own "non-Web" issues<br>
> that will make it difficult to make it part of a Web application<br>
> framework - in particular it request a custom server and won't just work<br>
> with a HTTP server.<br>
><br>
> In recent times, vendors have indeed started moving away from custom<br>
> protocols and custom servers and have moved towards more intelligence in<br>
> the UA and special approaches to streaming over HTTP.<br>
><br>
> Microsoft developed "Smooth Streaming", Apple developed "HTTP Live
> Streaming", and Adobe recently launched "HTTP Dynamic Streaming".
> (Also see a comparison at). As these vendors are working on it for
> MPEG files, so are some people for Ogg. I'm not aware of anyone
> looking at it for WebM yet.
>
> Standards bodies haven't held back either. 3GPP has defined adaptive
> HTTP Streaming (AHS) in its March 2010 Release 9, and MPEG has now
> started consolidating approaches for adaptive bitrate streaming over
> HTTP for MPEG file formats.
>
> Adaptive bitrate streaming over HTTP is the correct approach to
> solving the twin issues of adapting to dynamic bandwidth availability
> and of providing a reliable live streaming approach.
>
</div><div class="im">> Right now, no standard exists that has been proven to work in a<br>
> format-independent way. This is particularly an issue for HTML5, where<br>
> we want at least support for MPEG4, Ogg Theora/Vorbis, and WebM.<br>
><br>
</div><div class="im">> I know that it is not difficult to solve this issue in a<br>
> format-independent way, which is why solutions are jumping up<br>
> everywhere. They are, however, not compatible and create a messy<br>
> environment where people have to install solutions for multiple<br>
> different approaches to make sure they are covered for different<br>
> platforms, different devices, and different formats. It's a clear<br>
> situation where a new standard is necessary.<br>
><br>
> The standard basically needs to provide three different things:<br>
> * authoring of content in a specific way<br>
> * description of the alternative files on the server and their<br>
> features for the UA to download and use for switching<br>
> * a means to easily switch mid-way between these alternative files<br>
<br>
</div><div><div></div><div class="h5">On Mon, 24 May 2010, Chris Holland wrote:<br>
><br>
> I don't have something decent to offer for the first and last bullets<br>
> but I'd like to throw-in something for the middle bullet:<br>
><br>
> The http protocol is vastly under-utilized today when it comes to URIs<br>
> and the various Accept* headers.<br>
><br>
> Today developers might embed an image in a document as chris.png. Web
> daemons know to find that resource and serve it; in this sense,
> chris.png is a resource locator.
>
> Technically, one might reference the image as a resource identifier
> named "chris". The user's browser may send "image/gif" as the only
> value of an Accept header, signaling the following to the server: "I'm
> supposed to download an image of chris here, but I only support GIF,
> so don't bother sending me a PNG". In a perhaps more useful scenario,
> the user agent may tell the server "don't bother sending me an image,
> I'm a screen reader, do you have anything my user could listen to?".
> In this sense, the document's author doesn't have to code against or
> account for every possible "context" out there; the author merely puts
> in a reference to a higher-level representation that should remain
> forward-compatible with evolving servers and user agents.
>
> By passing a list of accepted MIME types, the Accept HTTP header
> provides this ability to serve context-aware resources, which starts
> to feel like a contender for catering to your middle bullet.
>
> To that end, new MIME types could be defined to encapsulate media
> type/bitrate combinations.
>
> Or the Accept header might remain confined to media types, and
> acceptable bitrate information might get encapsulated into a new
> header, such as: X-Accept-Bitrate.
>
</div></div><div class="im">> If you combined the above approach with existing standards for http byte<br>
> range requests, there may be a mechanism there to cater to your 3rd<br>
> bullet as well: when network conditions deteriorate, the client could<br>
> interrupt the current stream and issue a new request "where it left off"<br>
</div>> to the server. Although this likel wouldn't work because a byte range<br>
<div class="im">> request would mean nothing on files of two different sizes. For<br>
> playbacked media, time codes would be needed to define range.<br>
<br>
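A rough sketch of the request this proposal implies - X-Accept-Bitrate
is the hypothetical header suggested above (value in kbps), and since
HTTP only standardizes byte ranges, the time-based Range unit below is
equally speculative:

    GET /media/chris HTTP/1.1
    Host: example.com
    Accept: video/mp4, video/ogg
    X-Accept-Bitrate: 500
    Range: t=120-

The server would then pick the best-matching variant it has for
"chris" and return it from the two-minute mark onwards.
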
</div><div class="im">On Tue, 25 May 2010, Silvia Pfeiffer wrote:<br>
><br>
> That's not quite sufficient, actually. You need to know which byte range<br>
> to retrieve or which file segment. Apple solved it by introducing a m3u8<br>
</div>> file format, Microsoft by introducing a SMIL-based server manifest file,<br>
> Adobe by introducing a XML-based Flash Media Manifest file F4M. That<br>
<div class="im">> kind of complexity canot easily be transferred through HTTP headers.<br>
><br>
</div><div class="im">> The idea of the manifest file is to provide matching transition points<br>
> between the different files of different bitrate to segments or byte<br>
> ranges. This information has to somehow come to the UA (amongst other<br>
> information as available in typical manifest files). I don't think that<br>
> can be achieved without a manifest file.<br>
<br>
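For concreteness, the Apple-style manifest is just such a description:
a variant playlist whose entries advertise the bandwidth of each
alternative stream. A minimal sketch (bandwidth values and file names
illustrative only):

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=100000
    video_100.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
    video_500.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=900000
    video_900.m3u8

Each referenced .m3u8 file is itself a playlist of short media
segments, and the segment boundaries are what give the UA its
transition points.
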
</div><div class="im">On Fri, 28 May 2010, Jeroen Wijering wrote:<br>
><br>
> Indeed, one such key condition is the current dimensions of the video<br>
> window. Tracking this condition allows user-agents to:<br>
><br>
> *) Not waste bandwidth, e.g. by pushing a 720p video in a 320x180 video<br>
> tag.<br>
><br>
> *) Respond to changes in the video display, e.g. when the video is<br>
> switched to fullscreen playback.<br>
><br>
</div><div class="im">> Providing the different media options using <source> elements might<br>
> still work out fine, if there's a clearly defined API that covers all<br>
> scenarios. A rough example:<br>
><br>
> <video><br>
> <source bitrate="100" height="120" src="video_100.mp4" type="video/mp4; codecs='avc1.42E01E, mp4a.40.2'; keyframe-interval='00:02'" width="160"><br>
> <source bitrate="500" height="240" src="video_500.mp4" type="video/mp4; codecs='avc1.42E01E, mp4a.40.2'; keyframe-interval ='00:02'" width="320"><br>
> <source bitrate="900" height="540" src="video_900.mp4" type="video/mp4; codecs='avc1.42E01E, mp4a.40.2'; keyframe-interval ='00:02'" width="720"><br>
> </video><br>
><br>
> This example would tell the user agent that the three MP4 files have a
> keyframe interval of 2 seconds - which of course raises the issue that
> fixed keyframe intervals would be required.
>
> The user agent can subsequently use e.g. the Media Fragments API to
> request chunks, switching between sources as the conditions change.

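As a sketch of that switching step - the URLs extend the example above,
and the W3C Media Fragments draft writes temporal ranges as
#t=start,end (seconds by default) - a UA that sees its bandwidth drop
two minutes in might request:

    http://example.com/video_900.mp4#t=0,120
    http://example.com/video_100.mp4#t=120,240

The fixed 2-second keyframe interval is what ensures that a fragment
boundary like t=120 lands on a point where the two encodings can be
spliced cleanly.
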
It seems to me that we are not lacking in solutions in this space -- it
would behoove us to try to leverage the existing solutions rather than
making up new ones. Have the above solutions been tried in browsers?

Apple's m3u8 files work in Safari. There is no implementation for WebM
or Ogg yet.

Cheers,
Silvia.