[whatwg] BinaryEncoding for Typed Arrays using window.btoa and window.atob
cshu01 at gmail.com
Mon Aug 5 18:15:47 PDT 2013
I think the atob implementation could enlarge the buffer from the C++
side if necessary. During the decoding process, I assumed the
algorithm first decodes the base64 string into a binary string and
then casts/copies it into the desired type based on the input
ArrayBufferView. So it is the user, not the encoded string, who
decides the type of the output. I think the same is true of the
current atob function: the encoded string carries no information about
what the type of the output would be. Do you think the above is correct?
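The decode-then-copy flow described above might look something like
the sketch below. Note that atobInto is a hypothetical name invented
here for illustration, not the proposal's actual API, and the error
behavior shown is one possible choice:

```javascript
// Hedged sketch: decode a base64 string with atob(), then copy the raw
// bytes into the buffer underlying a caller-supplied ArrayBufferView.
// The view's element type plays no role in the decode itself; it only
// affects how the bytes are interpreted afterwards.
function atobInto(base64, view) {
  const binary = atob(base64); // base64 -> binary string
  const bytes = new Uint8Array(view.buffer, view.byteOffset, view.byteLength);
  if (binary.length > bytes.length) {
    // One possible answer to "what if the view is too small":
    throw new RangeError("destination ArrayBufferView is too small");
  }
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i); // copy raw byte values
  }
  return binary.length; // number of bytes actually written
}

// Example: decode four bytes into an Int32Array-backed buffer.
const b64 = btoa(String.fromCharCode(1, 0, 0, 0)); // "AQAAAA=="
const i32 = new Int32Array(1);
const written = atobInto(b64, i32); // written === 4
// On a little-endian machine i32[0] would read back as 1.
```

Returning the byte count is one way to answer the "how many characters
were written" question raised below; another would be to return a new
view over the written region.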
There have been some email exchanges about the different approaches to
binary encoding. I was told that the binary-encoding part was taken
out of the Encoding spec on purpose and that the btoa/atob approach
was preferred, but people are also talking about bringing it back into
the Encoding spec. My bottom line is to have something working that I
can use in my project; btoa/atob seems a natural enhancement and
low-hanging fruit to me.
> Chang, in your proposal for modifying atob, how does the developer
> know how many characters were written into the outgoing ArrayBufferView?
> What happens if the ArrayBufferView argument isn't large enough to
> hold the decoded string?
> During the decoding process, is the type of the ArrayBufferView
> significant? In your example, what would happen if an Int16Array were
> passed instead of an Int32Array?
> The Encoding spec at http://encoding.spec.whatwg.org/ seems to have
> handled issues like these. Perhaps a better route would be to fold
> this functionality into that spec.
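The question above about whether the view's type is significant can be
illustrated with a small sketch (assuming the decode-then-copy model
from earlier in this thread): the decoded bytes land in the underlying
ArrayBuffer unchanged, so swapping Int32Array for Int16Array changes
only the element count and how the bytes are read back, not the decode:

```javascript
// The same four decoded bytes viewed through two different element types.
const bytes = new Uint8Array([0x01, 0x02, 0x03, 0x04]);
const asI16 = new Int16Array(bytes.buffer); // 2 elements of 2 bytes each
const asI32 = new Int32Array(bytes.buffer); // 1 element of 4 bytes
// asI16.length === 2, asI32.length === 1; the byte content is identical,
// and the numeric values read back depend on the platform's endianness.
```

A wrinkle the proposal would need to address: if the decoded byte count
is not a multiple of the view's BYTES_PER_ELEMENT, the final element is
only partially written.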