[whatwg] Audio canvas?
Mathieu HENRI
p01 at opera.com
Wed Jul 16 06:41:14 PDT 2008
Dr. Markus Walther wrote:
>
> >> My understanding of HTMLMediaElement is that the currentTime, volume
> >> and playbackRate properties can be modified live.
> >>
> >> So in a way Audio is already like Canvas: the developer modifies things
> >> on the fly. There are no automated animations/transitions like in SVG,
> >> for instance.
> >>
> >> A cross fade in Audio is done exactly the same way as in Canvas.
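
For instance, a cross fade can be scripted by stepping the volume
properties from a timer, much like stepping pixel values frame by frame
on a Canvas. A rough sketch (the element ids are made up):

  var a = document.getElementById('trackA'),   // fading out
      b = document.getElementById('trackB'),   // fading in
      t = 0;
  b.volume = 0;
  b.play();
  var timer = setInterval(function () {
    t += 0.05;
    a.volume = Math.max(0, 1 - t);
    b.volume = Math.min(1, t);
    if (t >= 1) { clearInterval(timer); a.pause(); }
  }, 50);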
>
> That's not what I described, however. Canvas allows access to the most
> primitive element with which an image is composed, the pixel. Audio does
> not allow access to the sample, which is the equivalent of a pixel in the
> sound domain. That's a severe limitation. Using tricks with data URIs
> and a known simple audio format such as PCM WAVE is no real substitute,
> because JavaScript strings are immutable.
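
Right, the best one can do today is rebuild the whole file as a string
and hand it to the media element through a data: URI. A rough sketch of
that trick, for 1 second of 8 bit mono PCM WAVE (assuming the browser
accepts WAVE from a data: URI and has btoa()):

  function word32(n) {      // 32 bit little-endian value as 4 chars
    return String.fromCharCode(n & 255, (n >> 8) & 255,
                               (n >> 16) & 255, (n >> 24) & 255);
  }
  function word16(n) {      // 16 bit little-endian value as 2 chars
    return String.fromCharCode(n & 255, (n >> 8) & 255);
  }
  var rate = 8000, samples = '';
  for (var i = 0; i < rate; i++) {
    // 440Hz sine as unsigned 8 bit samples, 128 = silence
    samples += String.fromCharCode(
      128 + Math.round(127 * Math.sin(2 * Math.PI * 440 * i / rate)));
  }
  var wav = 'RIFF' + word32(36 + samples.length) + 'WAVE' +
            'fmt ' + word32(16) + word16(1) + word16(1) +  // PCM, mono
            word32(rate) + word32(rate) + word16(1) + word16(8) +
            'data' + word32(samples.length) + samples;
  new Audio('data:audio/wav;base64,' + btoa(wav)).play();

Every tweak to a single sample means rebuilding and re-encoding the
whole string, which is exactly the limitation you point out.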
>
> It is unclear to me why content is still often seen as static by
> default: as desktop apps move to the browser, images and sound will
> increasingly be generated and modified on the fly, client-side.
Agreed.
Having an equivalent of ImageData for Audio would open up some really
interesting possibilities.
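
Something parallel to the pixel access Canvas already has, for example.
Today one can write:

  var ctx = document.getElementsByTagName('canvas')[0].getContext('2d');
  var pixels = ctx.getImageData(0, 0, 100, 100);
  for (var i = 0; i < pixels.data.length; i += 4) {
    pixels.data[i] = 255 - pixels.data[i];   // invert the red channel
  }
  ctx.putImageData(pixels, 0, 0);

An audio equivalent could expose samples the same way. The names below
are pure invention on my part, nothing like them is specced:

  var audio = document.getElementsByTagName('audio')[0];
  var clip = audio.getSampleData(0, audio.duration);  // hypothetical
  for (var j = 0; j < clip.data.length; j++) {
    clip.data[j] *= 0.5;                              // attenuate ~6 dB
  }
  audio.putSampleData(clip, 0);                       // hypothetical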
> > And if you're thinking of special effects (e.g. delay, chorus, flanger,
> > band-pass, ...), remember that with Canvas, advanced effects require
> > trickery and compositing multiple Canvas elements.
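
(For the record, the Canvas trickery usually boils down to drawing one
canvas onto another with different alpha and composite settings, e.g.:

  var src = document.getElementById('src'),      // made-up ids
      ctx = document.getElementById('dst').getContext('2d');
  ctx.globalAlpha = 0.5;
  ctx.globalCompositeOperation = 'lighter';      // additive blend
  ctx.drawImage(src, 4, 0);                      // offset copies give
  ctx.drawImage(src, 0, 4);                      // a cheap glow/echo

so one could imagine similar hoops for audio effects.)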
>
> I have use cases in mind like an in-browser audio editor for music or
> speech applications (think 'Cooledit/Audacity in a browser'), where
> doing everything server-side would be prohibitive due to the amount of
> network traffic.
Mind you, I have the same use cases.
> --Markus
>
--
Mathieu 'p01' HENRI
JavaScript developer, Opera Software ASA