[whatwg] Endianness of typed arrays
callow_mark at hicorp.co.jp
Wed Mar 28 02:32:18 PDT 2012
On 28/03/2012 18:13, Boris Zbarsky wrote:
> What one could do is to store the array buffer bytes always as little
> endian, and then if sending to the GPU byte-swap as needed based on
> the API call being used (and hence the exact types the GPU actually
> expects).
> So basically, make all JS-visible state always be little-endian, and
> deal with endianness in the one place where you actually need native
> endianness.
Then, if you are on a big-endian system, an app will not be able to read
& write ints, floats, etc. directly via Int32Array, Float32Array, etc.
The "Typed" in TypedArrays would no longer have any meaning.
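To illustrate the point (this sketch is mine, not from the thread): typed-array views read and write in the CPU's native byte order, which is exactly why forcing a fixed little-endian layout would break them on big-endian hardware, while DataView lets the caller choose the byte order explicitly:

```javascript
// Write a 32-bit value through a typed array, then inspect the raw bytes.
const buf = new ArrayBuffer(4);
new Uint32Array(buf)[0] = 0x11223344; // stored in native (CPU) byte order

const bytes = new Uint8Array(buf);
// On a little-endian CPU bytes[0] is 0x44; on a big-endian CPU it is 0x11.
const littleEndian = bytes[0] === 0x44;

// DataView, by contrast, takes the byte order as an explicit argument:
const view = new DataView(buf);
const le = view.getUint32(0, true);  // interpret bytes as little-endian
const be = view.getUint32(0, false); // interpret bytes as big-endian
```

Whichever of `le` or `be` matches the platform's own order will read back 0x11223344; the other will see the byte-swapped value.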
BTW, if the CPU & GPU differ in endianness, it is the responsibility of
the OpenGL driver to handle it. When you tell GL you are passing it
GL_FLOATs, for example, the values are expected to be in CPU byte order.