[whatwg] BinaryEncoding for Typed Arrays using window.btoa and window.atob
Kenneth Russell
kbr at google.com
Mon Aug 5 17:41:56 PDT 2013
On Mon, Aug 5, 2013 at 2:04 PM, Simon Pieters <simonp at opera.com> wrote:
> On Mon, 05 Aug 2013 22:39:22 +0200, Chang Shu <cshu01 at gmail.com> wrote:
>
>> I see your point now, Simon. Technically both approaches should work.
>> As you said, yours has the limitation that the implementation does
>> not know which view to return unless you provide an enum-type
>> parameter instead of a boolean to atob, and mine has a performance
>> issue. How about, in my approach, atob doesn't return the 'binary'
>> string when the 2nd parameter is provided?
>
>
> That works for me.
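
For concreteness, my reading of the proposal quoted above is roughly
the following; the two-argument form is hypothetical and the names are
mine, since nothing has been specified yet:

    // Today's behavior, unchanged:
    var encoded = window.btoa("hello");        // "aGVsbG8="
    var binary  = window.atob(encoded);        // binary string, 5 chars

    // Proposed (hypothetical) overload: decode straight into a
    // caller-supplied view and skip building the string entirely.
    var buf = new Int32Array(16);
    window.atob(encoded, buf);                 // fills the front of buf
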
Chang, in your proposal for modifying atob, how does the developer
know how many bytes were written into the outgoing ArrayBufferView?

What happens if the ArrayBufferView argument isn't large enough to
hold the decoded string?

During the decoding process, is the type of the ArrayBufferView
significant? In your example, what would happen if an Int16Array were
passed instead of an Int32Array?
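
To make the question concrete (using nothing beyond what exists today):

    // "aGVsbG8=" decodes to 5 bytes, which fills neither view type
    // evenly:
    var decoded = window.atob("aGVsbG8=");
    decoded.length;                                  // 5
    decoded.length % Int16Array.BYTES_PER_ELEMENT;   // 1 byte left over
    decoded.length % Int32Array.BYTES_PER_ELEMENT;   // 1 byte left over
    // Is the last element zero-padded, left partially written, or is
    // this an error? And does the answer depend on the element size or
    // the endianness of the host?
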
The Encoding spec at http://encoding.spec.whatwg.org/ seems to have
handled issues like these. Perhaps a better route would be to fold
this functionality into that spec.
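
For example, the TextEncoder / TextDecoder interfaces defined there
sidestep both of the questions above: the encoder allocates and returns
a Uint8Array of exactly the right size, and the decoder accepts any
ArrayBufferView and simply reads its underlying bytes. (Sketch from
memory; see the spec for the normative details.)

    var utf8 = new TextEncoder().encode("hello");
    utf8.length;                                     // 5, sized by the API

    var text = new TextDecoder("utf-8").decode(utf8);
    text;                                            // "hello"
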
-Ken