[whatwg] Timing API proposal for measuring intervals

Chris Rogers crogers at google.com
Mon Jul 11 12:23:33 PDT 2011

On Fri, Jul 8, 2011 at 4:37 PM, Robert O'Callahan <robert at ocallahan.org> wrote:

> On Fri, Jul 8, 2011 at 2:54 PM, James Robinson <jamesr at google.com> wrote:
>
>> On Thu, Jul 7, 2011 at 7:36 PM, Robert O'Callahan <robert at ocallahan.org> wrote:
>>
>>> Using this value as a clock for media synchronization sounds appealing
>>> but is complicated by audio clock drift. When you play N seconds of
>>> audio, it might take slightly more or less time to actually play, so
>>> it's hard to keep media times perfectly in sync with another timing
>>> source. Just something to keep in mind.
>>
>> True.  On OS X, however, the CoreVideo and CoreAudio APIs are specified
>> to use a unified time base (see
>> http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CVTimeRef/Reference/reference.html
>> ) so if we do end up with APIs saying "play this sound at time X", like
>> Chris Rogers's proposed Web Audio API provides, it'll be really handy if
>> we have a unified timescale for everyone to refer to.
>
> Is that unified time base in sync with the system clock, and how accurate
> is it? I'm concerned about the possibility that it's not feasible to keep
> the audio hardware clock in sync with the system clock, at least on some
> platforms. In that case, we probably need to keep media playback and
> animations in sync with the audio hardware clock, and we could even
> expose that via some DOM API, but you might not want to use the same
> clock for other purposes, such as general performance timing ... I've
> heard the audio clock drift is often significant.
> I'm not sure if this is a real problem or not, I just want to make sure.
>
> Rob

Hi Robert, I think the clock that James is proposing is effectively the
system clock (such as mach_absolute_time() or QueryPerformanceCounter()).  I
completely agree with your concerns about drift between this type of clock
and the audio hardware clock, since they're almost always running off
different crystals.  That said, I still think it's useful to have a
high-resolution system clock which is monotonically increasing.  On Mac OS X
the mach_absolute_time() system clock is used extensively in the media APIs
(CoreVideo, CoreAudio, CoreMIDI, etc.) to synchronize events according to a
common time base:

For example, CoreMIDI represents a host clock time directly as a
MIDITimeStamp (typedef UInt64 MIDITimeStamp;).


In the CoreAudio case, the AudioTimeStamp contains *both* the host-time
(system clock) and the sample time (based on audio hardware).  This creates
a relationship between the two clocks.  As an example of how these two
clocks can be used together for synchronization, audio applications use the
high-resolution timestamp of incoming MIDI messages to schedule audio
synthesis to happen with very low jitter by doing sample-accurate scheduling
when rendering the audio stream.

Because of clock drift, the system clock that James is proposing cannot
*directly* be the same clock as what I'm proposing for the Web Audio API's
AudioContext.currentTime attribute.  But there are useful ways to translate
between the two.

I would be in favor of introducing a monotonically increasing
high-resolution system clock such as James is proposing, as long as we keep
in mind some of these subtleties.
