[whatwg] [media] startOffsetTime, also add startTime?
Ian Hickson
ian at hixie.ch
Thu Mar 8 10:16:40 PST 2012
(Oops, sorry. Missed these e-mails in my earlier reply.)
On Thu, 8 Mar 2012, Philip Jägenstedt wrote:
> On Wed, 07 Mar 2012 11:56:42 +0100, Odin Hørthe Omdal
> <odinho at opera.com> wrote:
> >
> > startOffsetTime seems to leave people confused; I often have to explain
> > it, and yesterday I read the spec[5] and old emails and got confused
> > myself. It hasn't been implemented after almost 2 years.
>
> We (Opera) have wanted to implement this for a long time, but it has
> been stalled by the fact that the spec is confusing to the point that we
> haven't been able to agree on what it's actually trying to say. Let's
> fix that.
I'm happy to make it clearer, but it seems clear to me. What are your
interpretations, so that I can explicitly rule out in the spec the ones
that are not intended?
> I agree that it would be useful to expose the constant by which
> timestamps are adjusted
Time stamps should not be adjusted.
> to guarantee that currentTime starts at 0 and ends at duration.
That is not what the spec requires.
> I think that a name like startTime (or initialTime) would suggest
> that it is the initial value of currentTime, which it is not.
initialTime is the initial value of currentTime.
> I suggest the property offsetTime, defined as the stream time in seconds
> which currentTime and duration are relative to.
I don't understand what this means. The currentTime is relative to the
media timeline, which is UA-defined and "should" be based on the media
data itself.
> In practice it would often be understood as the "time since the server
> began streaming" and would be useful to sync live streams with
> out-of-band content simply by letting the out-of-band content be
> relative to the start of the stream.
That "should" be zero. I can change that to a "must" if you like; it's
a "should" because in some cases (e.g. MJPEG) you don't know what the
media timeline is or how to interpret it, so there's no way to do it.
> No round-trip with Date representations should be necessary in the
> common case.
The startOffsetTime attribute is intended for display, no? Why would you
round-trip with it?
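For display I'd expect something along these lines (a sketch, assuming
startOffsetTime is a Date giving the wall-clock time of the zero point of
the media timeline):

  var video = document.querySelector("video");

  // Wall-clock time of the current playback position, for display.
  function currentWallClock() {
    return new Date(video.startOffsetTime.getTime() +
                    video.currentTime * 1000);
  }

The Date only appears at the point where something is shown to the user;
nothing needs to be converted back into currentTime.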
> As hinted above, I don't think that startOffsetTime should really be the
> first choice for trying to sync live streams.
Indeed.
> However, knowing the date of a video is still useful, potentially even
> for the streaming case, so we do want to expose the DateUTC field from
> WebM. However, startOffsetTime is a bad name for it, since it's not
> using the same unit as currentTime. I suggest offsetDate, to go with
> offsetTime.
I don't mind renaming startOffsetTime if people think that would help. I
don't think "offsetDate" is any clearer though.
How about "mediaTimelineOriginDate"?
> Finally, what about initialTime? It can be set to a non-zero value at
> two points in the spec:
>
> "Establish the media timeline for the purposes of the current playback
> position, the earliest possible position, and the initial playback
> position, based on the media data."
>
> "If either the media resource or the address of the current media
> resource indicate a particular start time, then set the initial playback
> position to that time and"
>
> Does any format expose something like this in-band? I don't know of any
> that do, or how to implement this, so the only thing that remains is
> exposing the start time of media fragments. This seems rather useless to
> me, so unless someone has already implemented initialTime and can explain
> what it means, I suggest dropping it from the spec.
The address of the current media resource can indicate a particular start
time if you implement media fragments.
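For example (a sketch; the file name is made up), with media fragments
implemented:

  <!-- Media fragment: start playback 30 seconds into the resource. -->
  <video src="race.webm#t=30" controls></video>
  <script>
    var video = document.querySelector("video");
    video.addEventListener("loadedmetadata", function () {
      // The fragment sets the initial playback position to 30, so
      // initialTime reports 30 here, while currentTime keeps using
      // the same media timeline as always.
      console.log(video.initialTime, video.currentTime);
    });
  </script>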
On Thu, 8 Mar 2012, Philip Jägenstedt wrote:
>
> currentTime = -offsetTime, an origin time that you can't actually seek to
> in the streaming case.
Whether you can seek there or not depends entirely on the protocol and
server. It's not a given that you can't seek to it.
> We discussed the concatenation of two clips and how to represent the
> date. At least chained WebM and chained Ogg should be able to represent
> this.
The spec requires ("must") that in the case of chained clips with
discontinuous timelines, the first clip's timeline be extended to cover
the others, and any data regarding the timeline in the subsequest clips is
dropped.
> To reduce the possibility for confusion about what date is represented
> and to allow the recording date to be preserved in editing, how about
> exposing currentDate instead?
What's the use case?
On Thu, 8 Mar 2012, Odin Hørthe Omdal wrote:
>
> Ah, but that is up to the user agent to decide how to show the time
> code. The currentTime should be normalized from 0 to duration.
I don't really understand what this means, but for some interpretations,
I disagree.
I agree that duration should be a time on the media timeline (and not a
length of time independent of timeline). I'm not sure what you mean by 0.
> > In addition, I wonder if negative values for currentTime are legal.
> > For instance, when streaming a Formula 1 race that starts at 17.00, I
> > would not be surprised to see negative currentTime if I join the
> > stream before the race starts.
>
> They are not, and shouldn't be.
The spec doesn't actually disallow it, though it does discourage it. I
could explicitly disallow a timeline with negative components.
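For what it's worth, for the Formula 1 case I'd expect the page to work
out where 17.00 falls on a zero-based timeline itself, along these lines
(a sketch; assumes startOffsetTime is available and that the race start
date is known to the page out of band):

  var video = document.querySelector("video");

  // Known to the page out of band (hypothetical value).
  var raceStart = new Date("2012-03-18T17:00:00+01:00");

  // Position of the race start on the media timeline, in seconds.
  // Joining the stream before 17.00 just means this position is still
  // ahead of currentTime; no negative values are involved.
  var raceStartPosition =
    (raceStart.getTime() - video.startOffsetTime.getTime()) / 1000;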
> currentTime is always normalized to 0 -> duration.
I don't think the spec supports this assertion.
--
Ian Hickson U+1047E )\._.,--....,'``. fL
http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,.
Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'