[whatwg] [media] startOffsetTime, also add startTime?
ian at hixie.ch
Tue Apr 3 10:13:12 PDT 2012
On Tue, 3 Apr 2012, Philip Jägenstedt wrote:
> > >
> > > It could also do with a good example. The spec says:
> > >
> > > "If the media resource specifies an explicit start time and date,
> > > then that time and date should be considered the zero point in the
> > > media timeline; the timeline offset will be the time and date,
> > > exposed using the startOffsetTime attribute."
> > >
> > > I interpret this as [...] currentTime=-initialTime (unless media
> > > fragments are used) in the Opera/Firefox definition of currentTime.
> > Not sure what this means.
> In current Opera and Firefox the timeline is always normalized to start
> at 0, so the time that corresponds to 0 in the original timeline would
> be at a negative currentTime.
I still don't really understand what you mean by "start" here.
The idea is that all the times are unsigned, though. So if there's any way
to seek to one of these times that are before what you're calling the
"start", then yeah, it'll be a mess, because the naive approach of simply
drawing a seek bar from 0 to duration (rather than seekable.start(0) to
duration) will fail.
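A hedged sketch of the non-naive approach (the function and values are illustrative, not part of the spec): a controller that accounts for a seekable range not starting at 0 would compute the thumb position like this.

```javascript
// Sketch: compute where the seek-bar thumb should sit when the
// seekable range does not begin at 0 (e.g. a live stream with a
// sliding buffer). `start`, `end`, and `position` stand in for
// seekable.start(0), duration, and currentTime; the helper name
// is made up for illustration.
function thumbFraction(start, end, position) {
  if (end <= start) return 0; // degenerate or empty range
  const clamped = Math.min(Math.max(position, start), end);
  return (clamped - start) / (end - start); // 0..1 along the bar
}

// Naively drawing the bar from 0 to duration would put a position of
// 150s within a [100s, 200s] window at 75% instead of the correct 50%.
console.log(thumbFraction(100, 200, 150)); // 0.5
```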
> We will have to change this at the same time as implementing startDate,
> since otherwise everything will be a mess...
So long as startDate gives the Date at media timeline's 0 point, it
doesn't really matter exactly what the media timeline is.
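For what it's worth, a sketch of that relationship (assuming only that startDate is the Date at the timeline's zero point; the helper name is made up):

```javascript
// Sketch: map a media-timeline position (in seconds) to a wall-clock
// Date, given the Date corresponding to the timeline's zero point.
// wallClockFor is a hypothetical helper; startDate and currentTime
// mirror the HTMLMediaElement attributes discussed above.
function wallClockFor(startDate, currentTime) {
  return new Date(startDate.getTime() + currentTime * 1000);
}

// A stream whose timeline zero is 2012-04-03T12:00:00Z, 30s in:
const zero = new Date(Date.UTC(2012, 3, 3, 12, 0, 0));
console.log(wallClockFor(zero, 30).toISOString()); // 2012-04-03T12:00:30.000Z
```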
> > > > > Finally, what about initialTime? [...]
> > >
> > > Yes, but why do we need to expose that in the DOM API, what is the
> > > use case?
> > Allows controllers to trivially implement UI to jump back to where the
> > stream started, while still showing the full seekable range.
> Unless I'm missing something, initialTime is just the initial value of
> currentTime, so this is already easy.
Only if the controller is around when the video is created. Don't forget
that one of the design principles of this API is that you should be able
to hook up a controller at any time and have it be able to provide a
useful UI.
> Also, if media fragments are not used, just setting currentTime=0 will
> clamp and seek to the earliest position. However, I've never actually
> seen such UI for <video>, do you have a real world example? It seems to
> me like this is a <1% use case that is already easy to solve and that
> it's not worth adding an API to go from easy to trivial.
Yeah, that's probably fair. I've removed initialTime.
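As a sketch of the clamping behavior described above (a pure-function stand-in for the element's internal seek algorithm, not spec text): a seek outside the seekable range snaps to the nearest end of it, so seeking to 0 lands on the earliest available position.

```javascript
// Sketch: clamp a seek target to the seekable range. `seekStart` and
// `seekEnd` stand in for seekable.start(0) and seekable.end(0); the
// function name and values are illustrative.
function clampSeek(target, seekStart, seekEnd) {
  return Math.min(Math.max(target, seekStart), seekEnd);
}

// With a seekable window of [120s, 300s], currentTime = 0 ends up at 120s:
console.log(clampSeek(0, 120, 300)); // 120
```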
> > An example of a video resource without an explicit timeline would be a
> > multipart/x-mixed-replace JPEG stream. There, the time between the
> > frames is
> > determined by the server's transmission rate, and the data itself has
> > no timing information.
> AFAIK, no browser supports any format for <video> that does not have
> timestamps. I don't think there's any practical need to say how to
> handle this until some implementor actually wants to do it, but if you
> really want to, I would have been less confused if the lack of "explicit
> timeline" were portrayed as an exception, using something like
> multipart/x-mixed-replace as an example.
I've made this more explicit using some notes.
BTW, browsers do support formats that do not have explicit timelines or
even explicit timings. Animated GIFs only have inter-frame timings,
there's no explicit timeline. (A frame's position is implied by the number
of delays that come before it.) And the usual way of sending MJPEG
streams, namely multipart/x-mixed-replace, has no explicit timings
whatsoever. <video> is designed such that these formats could be supported
with the media API.
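As a sketch of what "implied by the number of delays that come before it" means (illustrative values; a real GIF stores delays in hundredths of a second in the Graphic Control Extension):

```javascript
// Sketch: derive a timeline for an animated GIF, which stores only
// inter-frame delays with no explicit timestamps. Each frame's start
// time is the sum of the delays before it. Delays here are given in
// seconds for readability.
function frameTimeline(delays) {
  const times = [];
  let t = 0;
  for (const d of delays) {
    times.push(t); // frame i is displayed starting at times[i]
    t += d;
  }
  return times;
}

console.log(frameTimeline([0.1, 0.1, 0.5])); // [ 0, 0.1, 0.2 ]
```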
> > > * Why does the spec differentiate between static and streaming
> > > resources at all?
> > If you receive the entire file, there's no complication with respect
> > to streaming to a point before the first rendered frame. The
> > distinction is not intended to be normatively detectable, it's only
> > intended to distinguish the easy case from the harder case. Again, if
> > you think there's some way I could clarify that, please let me know.
I've removed the confusing bit about static resources vs streaming
resources, so hopefully this will be clearer now.
> IIUC, the spec is trying to handle resources that have no timestamps,
> are not (known to be) finite and where "the user agent will be able to
> seek to an earlier point than the first frame originally provided by the
> server", i.e. with server-side seeking. Do such resources actually
> exist? I don't see how they could, because how could the server seek
> without some concept of timestamps?
You could seek to them using frame numbers.
I'm not aware of such a format currently. I've added a note to that effect
to the spec.
> All in all, simply demanding that all formats used have a timeline
> mapping seems like a good way to deal with this, for now at least.
There are formats supported by browsers that do not have timelines. I
don't think we should exclude those ab initio.
Just covering all the bases in the spec doesn't mean we require anything
of browsers, but it does mean that if a browser wants to go beyond the
call of duty and support, say, animated GIFs, they can do so in an
unambiguous way without having to invent ways around the spec's limitations.
Ian Hickson U+1047E )\._.,--....,'``. fL
http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,.
Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'