[whatwg] StringEncoding: encode() return type looks weird in the IDL

Glenn Maynard glenn at zewt.org
Sun Aug 5 09:29:03 PDT 2012

On Sun, Aug 5, 2012 at 1:15 AM, Boris Zbarsky <bzbarsky at mit.edu> wrote:

> encode() should return a Uint8Array in the IDL, in my opinion.  Right now
> the prose says that it does, while the IDL has ArrayBufferView, which
> doesn't make much sense to me.

My recollection is that this was to allow returning Uint16Array (or, more
specifically but currently unresolved, Uint16LEArray and Uint16BEArray) for
encoding to UTF-16 and UTF-16BE.

I guess the brokenness of Uint16Array (e.g. the current lack of
Uint16LEArray) could be sidestepped by just always returning Uint8Array,
even when encoding to a 16-bit encoding (which is what it currently says to
do).  Maybe that's better anyway, since it avoids making UTF-16 a special
case.  I guess that if you're converting a string to a UTF-16 ArrayBuffer,
you're probably doing it to quickly dump it into a binary field somewhere
anyway--if you wanted to *examine* the code points, you'd just look at the
DOMString you started with.

Glenn Maynard
