[whatwg] [media] startOffsetTime, also add startTime?

Philip Jägenstedt philipj at opera.com
Tue Apr 3 02:28:56 PDT 2012


Thanks for the spec changes; startDate is now in a state where I'd be
happy to implement it! More comments inline:

On Tue, 03 Apr 2012 02:21:43 +0200, Ian Hickson <ian at hixie.ch> wrote:

> On Fri, 9 Mar 2012, Philip Jägenstedt wrote:
>> On Thu, 08 Mar 2012 19:16:40 +0100, Ian Hickson <ian at hixie.ch> wrote:
>> > On Thu, 8 Mar 2012, Philip Jägenstedt wrote:

>> I really don't know what startOffsetTime is intended for. AFAICT it's a
>> piece of metadata that you could just as well provide out-of-band, but
>> for convenience it is exposed via the DOM API. I think it could be handy
>> to have and would like to implement it, but I don't see how it's any
>> different from other metadata like the producer or location of a video.
>
> The startOffsetTime is useful for scripts that want to display a
> controller with real times, e.g. TiVo's DVR UI, even when the
> underlying media resource has some more or less arbitrary timeline.
>
> e.g. if a TV station starts broadcasting on some Friday at 2pm, that
> would be its zero time for its timeline, but eight months later, a user
> joining that stream doesn't care that the stream is 21 megaseconds old
> -- they just want to see 14:20 as the time that corresponds to what was
> streaming at 2:20pm.

This makes sense, and the new spec example makes it clearer.
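
For what it's worth, I'd expect a controller to need little more than
this with startDate (a sketch, assuming startDate as now specced, i.e. a
Date corresponding to currentTime=0 on the media timeline):

    function wallClockDate(video) {
      // startDate is the date at the zero point of the media timeline,
      // so offsetting it by currentTime (in seconds) gives the date of
      // whatever is currently playing -- the 14:20 in your example.
      return new Date(video.startDate.getTime() + video.currentTime * 1000);
    }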

>> It could also do with a good example. The spec says:
>>
>> "If the media resource specifies an explicit start time and date, then
>> that time and date should be considered the zero point in the media
>> timeline; the timeline offset will be the time and date, exposed using
>> the startOffsetTime attribute."
>>
>> I interpret this as a date at currentTime=0 in the spec's definition of
>> currentTime
>
> Right.
>
>
>> and currentTime=-initialTime (unless media fragments are used) in the
>> Opera/Firefox definition of currentTime.
>
> Not sure what this means.

In current Opera and Firefox the timeline is always normalized to start at  
0, so the time that corresponds to 0 in the original timeline would be at  
a negative currentTime. We will have to change this at the same time as  
implementing startDate, since otherwise everything will be a mess...
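
To make the difference concrete, take a hypothetical stream whose first
received frame carries an in-band timestamp of 3000 s:

    video.addEventListener('loadedmetadata', function () {
      // Spec's definition:   currentTime === 3000 here, and startDate
      //                      corresponds to currentTime 0.
      // Opera/Firefox today: currentTime === 0 here, so the stream's
      //                      original zero point falls at currentTime
      //                      -3000, outside the exposed timeline.
      console.log(video.currentTime);
    });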

>> > > Finally, what about initialTime? It can be set to a non-zero value
>> > > at two points in the spec:
>> > >
>> > > "Establish the media timeline for the purposes of the current
>> > > playback position, the earliest possible position, and the initial
>> > > playback position, based on the media data."
>> > >
>> > > "If either the media resource or the address of the current media
>> > > resource indicate a particular start time, then set the initial
>> > > playback position to that time and"
>> > >
>> > > Does any format expose something like this in-band? I don't know of
>> > > any that do and how to implement this, so the only thing that
>> > > remains is exposing the start time of media fragments. This seems
>> > > rather useless to me, so unless someone has already implemented
>> > > initialTime and can explain what it means, I suggest dropping it
>> > > from the spec.
>> >
>> > The address of the current media resource can indicate a particular
>> > start time if you implement media fragments.
>>
>> Yes, but why do we need to expose that in the DOM API, what is the use
>> case?
>
> Allows controllers to trivially implement UI to jump back to where the
> stream started, while still showing the full seekable range.

Unless I'm missing something, initialTime is just the initial value of
currentTime, so this is already easy. Also, if media fragments are not
used, just setting currentTime=0 will clamp and seek to the earliest
position. However, I've never actually seen such UI for <video>; do you
have a real-world example? This looks like a <1% use case that is
already easy to solve, and it's not worth adding an API just to go from
easy to trivial.
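
That is, a "jump back to where I joined" control needs nothing more than
this (a sketch; jumpButton is made up):

    var joined; // currentTime at the moment the stream was joined
    video.addEventListener('loadedmetadata', function () {
      joined = video.currentTime; // what initialTime would report
    });
    jumpButton.onclick = function () {
      video.currentTime = joined; // clamped to the seekable range
    };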



> On Tue, 13 Mar 2012, Philip Jägenstedt wrote:
>>
>> "In the absence of an explicit timeline, the zero time on the media
>> timeline should correspond to the first frame of the media resource. For
>> static audio and video files this is generally trivial. For streaming
>> resources, if the user agent will be able to seek to an earlier point
>> than the first frame originally provided by the server, then the zero
>> time should correspond to the earliest seekable time of the media
>> resource; otherwise, it should correspond to the first frame received
>> from the server (the point in the media resource at which the user agent
>> began receiving the stream)."
>>
>> There are multiple problems here, and I think it's responsible for some
>> of the confusion.
>>
>> * What is an "explicit timeline"? For example, does an Ogg stream that
>> starts with a non-zero timestamp have an explicit timeline?
>
> If there's a timestamp in the resource, then yes, it has an explicit
> timeline. That seems self-evident, but if you can think of a way that I
> could clarify this, I would be happy to do so.
>
> An example of a video resource without an explicit timeline would be a
> multipart/x-mixed-replace JPEG stream. There, the time between the
> frames is determined by the server's transmission rate, and the data
> itself has no timing information.

AFAIK, no browser supports any format for <video> that lacks timestamps.
I don't think there's any practical need to say how to handle this until
some implementor actually wants to do it, but if you really want to keep
it, it would be clearer to present the lack of an "explicit timeline" as
an exception, using something like multipart/x-mixed-replace as the
example.
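
For reference, such a stream is just multipart HTTP with no timing
information anywhere; the wire format looks roughly like this (sketch):

    HTTP/1.1 200 OK
    Content-Type: multipart/x-mixed-replace; boundary=frame

    --frame
    Content-Type: image/jpeg

    ...JPEG data for frame 1...
    --frame
    Content-Type: image/jpeg

    ...JPEG data for frame 2...

Each part replaces the previous one as soon as it arrives, so frame
timing is entirely a function of the server's pacing.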

>> * Does "For streaming resources ..." apply only in the absence of an
>> explicit timeline, or in general? In other words, what's the scope of
>> "In the absence of an explicit timeline"?
>
> I've updated the second sentence to explicitly state that it also only
> applies in the absence of a timeline.

Thanks, that's much better!

>> * Why does the spec differentiate between static and streaming resources
>> at all?
>
> If you receive the entire file, there's no complication with respect to
> seeking to a point before the first rendered frame. The distinction is
> not intended to be normatively detectable; it's only intended to
> distinguish the easy case from the harder case. Again, if you think
> there's some way I could clarify that, please let me know.

IIUC, the spec is trying to handle resources that have no timestamps,
are not (known to be) finite, and where "the user agent will be able to
seek to an earlier point than the first frame originally provided by the
server", i.e. resources with server-side seeking. Do such resources
actually exist? I don't see how they could; how would the server seek
without some concept of timestamps?

All in all, simply demanding that all formats used have a timeline mapping  
seems like a good way to deal with this, for now at least.

>> These definitions can be tweaked/clarified in one of two ways:
>>
>> 1. currentTime always reflects the underlying timestamps, such that a
>> resource can start playing at a non-zero offset and seekable.start(0)
>> could be non-zero even for a fully seekable resource. This is what the
>> spec already says, modulo the "streaming resources" weirdness.
>>
>> 2. Always normalize the timeline to start at 0 and end at duration.
>>
>> I think that the BBC blog post is favoring option 2, and while that's
>> closest to our implementation I don't feel strongly about it. A benefit
>> of option 1 is that currentTime=300 represents the same thing on all
>> clients, which should solve the syncing problem without involving any
>> kinds of dates.
>
> The spec definitely intends #1 if the format supports it. I don't think
> #2 makes sense for many cases (e.g. broadcast TV, any case where you
> can seek to before the first rendered frame), and more importantly, if
> you connect to a stream and then later start discarding earlier data,
> you end up in #1 even if you started in #2, so I see no benefit to
> going out of our way to start in #2.

I (now) agree, and will try to align Opera with #1 when we poke at this  
next.
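
The syncing benefit of #1 is worth spelling out: a bare currentTime
names the same media data on every client, so two viewers can sync with
no date arithmetic at all (a sketch; send() and the transport are made
up):

    // On one client:
    send(video.currentTime);

    // On another client playing the same resource:
    function onSharedTime(sharedTime) {
      video.currentTime = sharedTime; // same point in the media
    }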

>> Make it pedantically clear which of the above two options is correct,
>> preferably with a pretty figure of a timeline with all the values
>> clearly marked out.
>
> I would be happy to add such a diagram, but I have no idea how to do it,
> given the bazillions of edge cases here.
>
> If anyone wants to make such a diagram, I recommend doing it by writing
> code for this tool:
>
>    http://software.hixie.ch/utilities/js/canvas/
>
> ...and then sending me the code. :-)
>
> (Ideally, using little parameterised functions for any repeated bits, so
> it's really easy to adjust.)

Odin, you've made diagrams like this before -- do you think any of them
could be ported to a script?
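
To get the ball rolling, here's the sort of parameterised helper I
imagine the tool wants (a sketch; the names are made up and untested
against the tool):

    function drawTimeline(ctx, x, y, width, label) {
      ctx.strokeRect(x, y, width, 10); // the timeline bar
      ctx.fillText(label, x, y - 4);   // e.g. "media timeline"
    }

    function drawMark(ctx, x, y, label) {
      ctx.beginPath();                 // a tick with its value below
      ctx.moveTo(x, y);
      ctx.lineTo(x, y + 10);
      ctx.stroke();
      ctx.fillText(label, x + 2, y + 22);
    }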

-- 
Philip Jägenstedt
Core Developer
Opera Software


