[whatwg] outputting audio from JavaScript
Toni Ruottu
toni.ruottu at iki.fi
Tue Feb 9 06:54:20 PST 2010
hello
I have been working on a "game music" composing application called
Kunquat, which is completely unrelated to the web. Discussing various
details of that desktop application has, however, got me thinking
about related issues. At times we have discussed implementing
similar pieces of music software for the browser environment. With
Google Chrome OS coming up, being able to create music in a browser
is becoming increasingly important, but similar interfaces are also
needed for implementing games or for writing players for formats
that browsers do not support.
I am excited about HTML5 supporting audio files on web sites; at the
same time I am worried about the more general need to generate
sound, which seems to have been forgotten from the specification.
Let's consider a case where one wants to create a simple web
instrument that will produce a sine wave, send it to the speakers,
and let the user alter the parameters of the continuous sound while
it plays. With the current model, anyone wishing to do so is in
trouble. First of all, she will need a JavaScript library for
encoding the wave as an Ogg Vorbis file. Then she needs to turn the
Vorbis file into a data URL and add it to the web page as an audio
element. This is still tolerable, but adding the next chunk of sound
with another audio element at exactly the right time for the sound
to be continuous is the real killer.
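To make the problem concrete, here is a rough sketch of that
workaround. The encodeVorbisDataUrl helper is hypothetical; I am not
aware of a real JavaScript Vorbis encoder with this interface, it
just stands in for whatever library one would have to find or write:

    // Generate one second of a 440 Hz sine wave as raw float samples.
    function makeSineChunk(freq, sampleRate, numFrames) {
      var samples = new Array(numFrames);
      for (var i = 0; i < numFrames; i++) {
        samples[i] = Math.sin(2 * Math.PI * freq * i / sampleRate);
      }
      return samples;
    }

    var chunk = makeSineChunk(440, 44100, 44100);
    // Hypothetical helper: encode the samples as Ogg Vorbis and wrap the
    // result in a "data:audio/ogg;base64,..." URL.
    var url = encodeVorbisDataUrl(chunk, 44100);
    var element = new Audio(url);
    element.play();
    // Starting the *next* chunk so that playback stays gapless is the part
    // the current model gives no tools for.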
The lack of a simple audio output method leads to hacks, such as the
one used in JSNES ( http://benfirshman.com/projects/jsnes/ ).
JSNES is a web application for playing old NES games. It currently
outputs sound by having a separate Flash application read the sound
data from a variable; other than that, the application is pure
JavaScript. Starting Flash just to play the sound seems to drain a
lot of resources, at least on my computer, which I bought about a
year ago. Even so, I cannot play the games with sound turned on in
Chrome, which is supposed to have the best JavaScript engine
currently available. Without sound the game runs fine with plenty of
CPU cycles to spare, but turning sound on changes the situation
completely. Thus I believe that a standard way of producing sound
would help both users and developers.
I suggest that we add a function for playing sound. The function
should probably be specific to each frame, perhaps
window.outputaudio(someaudio). The parameter "someaudio" would be a
list of two channels, [left, right], with left and right being
equally long lists of frames represented as floats. The function
would add the sound to a hidden circular buffer, where it would be
consumed automatically by a player. In case someaudio did not fit
into the free space, the function would block until enough space
became available, and return only after someaudio had been
successfully transferred into the buffer. Adding more than half of
the buffer at once should result in an error, as using such big
chunks may cause problems that disrupt continuous playback.
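As a usage sketch, assuming exactly the semantics guessed above
(window.outputaudio does not exist anywhere today, and the 44.1 kHz
rate and 1024-frame chunk size are only illustrative), a continuous
sine instrument could be driven like this:

    // Proposed API sketch: window.outputaudio([left, right]) blocks until
    // the chunk has been copied into the hidden circular buffer.
    var sampleRate = 44100;        // assumed; one of the open parameters below
    var currentFrequency = 440;    // the user may change this while playing
    var playing = true;
    var phase = 0;

    function pushSineChunk(numFrames) {
      var left = new Array(numFrames);
      var right = new Array(numFrames);
      for (var i = 0; i < numFrames; i++) {
        left[i] = Math.sin(phase);
        right[i] = left[i];
        phase += 2 * Math.PI * currentFrequency / sampleRate;
      }
      window.outputaudio([left, right]);   // blocks if the buffer is full
    }

    // Feed small chunks; the blocking call provides the pacing, so the
    // sound stays continuous without manually timing audio elements.
    while (playing) {
      pushSineChunk(1024);
    }

The blocking call is what replaces the impossible "start the next
audio element at exactly the right time" step from the earlier
example.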
There are at least two parameters open for discussion here: the size
of the hidden buffer, and the sampling rate of the sound. We could
either settle on fixed values or have some way of configuring them.
If we settle on fixed values, we could use e.g. 96 kHz for the
sampling rate, which is what DVDs use. I'm not entirely sure about a
decent fixed buffer size, but basically a bigger buffer should never
cause problems for an application; what is big enough is another
question. The buffer would in theory have to be reserved for each
window, multiplying the memory cost by the number of windows.
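For a sense of scale, here is my own back-of-envelope arithmetic
(the half-second buffer length is just an assumption, not a
proposal):

    // Rough per-window memory cost of the hidden buffer at 96 kHz.
    var sampleRate = 96000;       // samples per second per channel
    var bufferSeconds = 0.5;      // assumed buffer length
    var channels = 2;             // stereo
    var bytesPerSample = 4;       // 32-bit float
    var frames = sampleRate * bufferSeconds;          // 48000 frames
    var bytes = frames * channels * bytesPerSample;   // 384000 bytes, ~375 KiB
    // With ten windows open that is still under 4 MB, so the per-window
    // cost looks manageable even for fairly generous fixed buffer sizes.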
I'd be happy to see responses regarding the parameters, the API, and
the bureaucracy required for getting the feature into the HTML5
specification.
best regards, --Toni Ruottu