[whatwg] Endianness of typed arrays

Boris Zbarsky bzbarsky at MIT.EDU
Wed Mar 28 02:13:40 PDT 2012


On 3/28/12 2:04 AM, Jonas Sicking wrote:
> Consider a big-endian platform where both the CPU and the GPU are
> big-endian. If a webpage writes 16-bit data into an ArrayBuffer and
> then sends that off to the GPU using WebGL, the data had better be
> sent big-endian, otherwise the GPU will interpret it wrong.
>
> However, if the same page then writes some 16-bit data into an
> ArrayBuffer and then looks at its individual bytes or sends it across
> the network to a server, it's very likely that the data needs to
> appear as little-endian, or site logic might break.
>
> Basically I don't know how one would write a modern browser on a
> big-endian system.

What one could do is store the ArrayBuffer bytes always as 
little-endian, and then when sending to the GPU byte-swap as needed 
based on the API call being used (and hence the exact types the GPU 
actually expects).

So basically, make all JS-visible state always be little-endian, and 
deal with byte order in the one place where you actually need native 
endianness.
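
Sketched in JS-ish pseudocode (the real thing would live in the 
engine's WebGL bindings; hostIsBigEndian and the elementSize parameter 
are assumptions about how such a helper might be shaped, not anything 
specified):

  // A Uint16 of 1 stored as [1, 0] means the host is little-endian.
  var hostIsBigEndian =
      new Uint8Array(new Uint16Array([1]).buffer)[0] === 0;

  // Before handing buffer contents to the GPU, swap each element
  // from the stored little-endian form to native order.  The element
  // size comes from the WebGL call (e.g. 2 for gl.UNSIGNED_SHORT).
  function toNativeEndian(buf, elementSize) {
    if (!hostIsBigEndian || elementSize === 1)
      return buf;  // native order already, or plain bytes: no-op
    var src = new Uint8Array(buf);
    var dst = new Uint8Array(src.length);
    for (var i = 0; i < src.length; i += elementSize) {
      for (var j = 0; j < elementSize; j++)
        dst[i + j] = src[i + elementSize - 1 - j];  // reverse element
    }
    return dst.buffer;
  }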

I believe that was substantially Robert's proposal earlier in this thread.

-Boris


