[whatwg] Extending HTML 5 video for adaptive streaming
Aaron Colwell
acolwell at google.com
Fri Jul 1 07:51:57 PDT 2011
Hi Robert,
comments inline.
On Thu, Jun 30, 2011 at 4:13 PM, Robert O'Callahan <robert at ocallahan.org> wrote:
> On Fri, Jul 1, 2011 at 4:59 AM, Aaron Colwell <acolwell at google.com> wrote:
>
>> I've also been looking at the WebRTC MediaStream API and was wondering
>> if it makes more sense to create an object similar to the
>> LocalMediaStream object. This has the benefit of unifying how media
>> streams are handled independent of whether they come from a camera or a
>> JavaScript-based streaming algorithm. This could also enable sending
>> the media stream through a peer-to-peer connection instead of only
>> allowing a camera as a source. Here is an example of the type of object
>> I'm talking about.
>
> I think MediaStreams should not be dealing with compressed data except as
> an optimization when access to decoded data is not required anywhere in the
> stream pipeline. If you want to do processing of decoded stream data (which
> I do --- see
> http://hg.mozilla.org/users/rocallahan_mozilla.com/specs/raw-file/tip/StreamProcessing/StreamProcessing.html),
> then introducing a decoder inside the stream processing graph creates all
> sorts of complications.
>
Nice spec. If I understand correctly, your position is that MediaStreams
should only represent uncompressed media? In the case of camera/mic data
they represent the uncompressed bits before they go to the codec for
transmission over a PeerConnection, or before they are rendered by an
<audio>/<video> element. In the case of standard <audio>/<video> playback
they would represent the uncompressed audio before it is sent to the audio
card and the uncompressed video before it is blitted to the screen. From a
stream-processing point of view I can see how this makes sense. My thinking
was that LocalMediaStream is essentially a wrapper around a source of media
data, and all I was doing was providing a mechanism to supply that data
from JavaScript instead of from hardware.
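To make that concrete, here is a rough sketch of the shape I had in mind.
Every name below (MediaStreamSource, append, close) is a placeholder for
illustration, not part of any existing spec:

  // Hypothetical: a stream whose media data is supplied by script
  // rather than by a capture device.
  var source = new MediaStreamSource("video/webm"); // container MIME type

  function onMediaChunk(chunk) {  // chunk: ArrayBuffer of compressed media
    source.append(chunk);         // script feeds data in, e.g. from an XHR
  }

  function onEndOfStream() {
    source.close();               // signal that no more data will arrive
  }

  // The result is consumed the same way a camera stream is.
  var video = document.querySelector("video");
  video.src = URL.createObjectURL(source);

The point is that the element-facing side stays exactly the same; only the
producer of the stream changes.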
> I think the natural way to support the functionality you're looking for is
> to extend the concept of Blob URLs. Right now you can create a binary Blob,
> mint a URL for it and set that URL as the source for a media element. The
> only extension you need is the ability to append data to the Blob while
> retaining the same URL; you would need to initially mark the Blob as "open"
> to indicate to URL consumers that the data stream has not ended. That
> extension would be useful for all sorts of things because you can use those
> Blob URLs anywhere. An alternative would be to create a new kind of object
> representing an appendable sequence of Blobs and create an API to mint URLs
> for it.
>
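As I read it, the usage would be something like the following. All of these
names are my invention; nothing like them is specced today:

  // My reading of the appendable-Blob idea, purely illustrative.
  var blob = new AppendableBlob("video/webm"); // an "open" Blob
  var url = URL.createObjectURL(blob);  // URL stays valid as data grows

  var video = document.querySelector("video");
  video.src = url;

  function onMediaChunk(chunk) {  // chunk: ArrayBuffer of compressed media
    blob.append(chunk);           // URL consumers see the new data
  }

  function onEndOfStream() {
    blob.close();                 // the Blob is no longer "open"
  }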
I thought about that, but I saw an earlier WHATWG thread
<http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2011-June/032221.html>
which led me down this MediaStream path. Using MediaStreams made more sense
to me because my use case felt similar to the live capture case, except
that I'm using compressed media and it comes from JavaScript instead of
hardware. Also, MediaStream already has a way to pass stream URLs to
<audio> and <video> for camera and remote peer stream data, so I figured I
could just leverage that.
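That is, the same attach-a-stream pattern the capture examples use (the
exact getUserMedia argument form has varied across drafts):

  // The existing stream-to-element plumbing I wanted to reuse.
  navigator.getUserMedia({ video: true }, function (stream) {
    var video = document.querySelector("video");
    video.src = URL.createObjectURL(stream); // stream URL, as for any source
    video.play();
  }, function (error) {
    console.log("getUserMedia failed: " + error);
  });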
> Note that with my API proposal above, you can get a MediaStream from a
> media element that's using any URL and send that through a PeerConnection.
>
I see that. Interaction with PeerConnection was not a primary concern for
me; I only mentioned it as a side benefit of using MediaStream.
Thanks for your comments. I appreciate them.
Aaron