[whatwg] How to handle multitrack media resources in HTML
jer.noble at apple.com
Fri Apr 8 12:43:37 PDT 2011
On Apr 7, 2011, at 11:54 PM, Ian Hickson wrote:
>> The distinction between a master media element and a master media
>> controller is, in my mind, mostly a distinction without a difference.
>> However, a welcome addition to the media controller would be convenience
>> APIs for the above properties (as well as playbackState, networkState,
>> seekable, and buffered).
> I'm not sure what networkState means in this context. playbackState,
> assuming you mean 'paused', is already exposed.
Sorry, by playbackState I meant readyState. I was suggesting that, much as you've provided .buffered and .seekable properties which "expose the intersection of the slaved media elements' corresponding ranges", a readyState property could similarly reflect the readyState values of all the slaved media elements. In this case, the MediaController's hypothetical readyState wouldn't flip to HAVE_ENOUGH_DATA until every constituent media element's readyState had reached at least that value.
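As a sketch of what is being proposed (not spec text; the stand-in objects below substitute for real slaved `<audio>`/`<video>` elements), the aggregate readyState would simply be the minimum of the slaved elements' readyState values:

```javascript
// HTMLMediaElement readyState constants, per the HTML spec.
const HAVE_NOTHING = 0, HAVE_METADATA = 1, HAVE_CURRENT_DATA = 2,
      HAVE_FUTURE_DATA = 3, HAVE_ENOUGH_DATA = 4;

// The controller is only as ready as its least-ready slaved element.
// `slavedElements` is any array of objects with a numeric readyState.
function aggregateReadyState(slavedElements) {
  if (slavedElements.length === 0) return HAVE_NOTHING;
  return slavedElements.reduce(
    (min, el) => Math.min(min, el.readyState), HAVE_ENOUGH_DATA);
}

// Stand-ins for slaved media elements attached to one MediaController:
const video = { readyState: HAVE_ENOUGH_DATA };   // 4
const audioTrack = { readyState: HAVE_FUTURE_DATA }; // 3
console.log(aggregateReadyState([video, audioTrack])); // 3
```

The intersection semantics mirror the existing .buffered and .seekable behavior: one lagging element holds the whole controller back.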
Of course, this would imply that the load events fired by a media element (e.g. loadedmetadata, canplaythrough) were also fired by the MediaController, and I would support this change as well.
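One possible shape for that behavior (purely illustrative; the function and callback names here are my own, not proposed API) is a watcher that fires a controller-level canplaythrough exactly once, when every slaved element has reached HAVE_ENOUGH_DATA:

```javascript
const HAVE_ENOUGH_DATA = 4;

// Returns a check function; in a browser you would invoke it from each
// slaved element's readyState-change handling. Fires the callback once,
// when all elements have reached HAVE_ENOUGH_DATA.
function watchController(slavedElements, onCanPlayThrough) {
  let fired = false;
  return function check() {
    const allReady = slavedElements.every(
      el => el.readyState >= HAVE_ENOUGH_DATA);
    if (allReady && !fired) {
      fired = true;
      onCanPlayThrough();
    }
  };
}
```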
Again, this would be just a convenience for authors, as this information is already available in other forms and could be calculated relatively easily on the fly in script. But UAs are likely going to have to do these calculations anyway to support things like autoplay, so adding explicit support for them in API form would not (imho) be unduly burdensome.
Jer Noble <jer.noble at apple.com>