[whatwg] Hardware accelerated canvas
cabanier at gmail.com
Sun Sep 2 15:24:42 PDT 2012
On Sun, Sep 2, 2012 at 2:24 PM, Ian Hickson <ian at hixie.ch> wrote:
> On Sun, 2 Sep 2012, Erik Möller wrote:
> > As we hardware accelerate the rendering of <canvas>, not just with the
> > context, we have to figure out how to best handle the fact that GPUs
> > lose the rendering context for various reasons. Reasons for losing the
> > context vary from platform to platform but range from going into
> > power-save mode, to internal driver errors and the famous long running
> > shader protection.
> > A lost context means all resources uploaded to the GPU will be gone and
> > will have to be recreated. For canvas it is not impossible, though IMO
> > expensive, to try to automatically restore a lost context and guarantee
> > the same behaviour as in software.
> > The two options I can think of would be to:
> > a) read back the framebuffer after each draw call.
> > b) read back the framebuffer before the first draw call of a "frame" and
> > keep a display list of all other draw operations.
> > Neither seems like a particularly good option if we're looking to
> > actually improve on canvas performance, especially on mobile where
> > read-back performance is very poor.
> > The WebGL solution is to fire an event and let the JS implementation
> > deal with recovering after a lost context:
> > http://www.khronos.org/registry/webgl/specs/latest/#5.15.2
> > My preferred option would be to make a generic context lost event for
> > canvas, but I'm interested to hear what people have to say about this.
> Realistically, there are too many pages that have 2D canvases that are
> drawn to once and never updated for any solution other than "don't lose
> the data" to be adopted. How exactly this is implemented is a quality of
> implementation issue.
It would be interesting to hear what other implementors have done to work
around this. Chrome has code to do most of canvas in hardware and they are
able to run it on devices with limited resources.
I do think that Erik's request has merit. With regular images, you can
always remove them from memory if needed and retrieve them from the
history or reload them from the original source.
An implementor could save a bitmap representation of the canvas to its
cache, but this is probably slow if it has to be read back from the GPU,
and the image could potentially be large.
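Erik's option (b), keeping a display list, could be sketched roughly as
below. This is only an illustration in page-level script; a real
implementation would live inside the browser, and the `makeRecorder`
helper and the stand-in "context" are invented for the example.

```javascript
// Sketch of option (b): record draw operations into a display list so a
// frame can be replayed onto a fresh context after a GPU context loss.

function makeRecorder() {
  const displayList = [];
  return {
    // Record one draw call, e.g. record('fillRect', 0, 0, 10, 10).
    record(op, ...args) { displayList.push({ op, args }); },
    // After a loss, replay every recorded call onto a new context.
    replay(ctx) { for (const { op, args } of displayList) ctx[op](...args); },
    // At the start of each frame, read back the framebuffer once and
    // start a fresh list.
    clear() { displayList.length = 0; },
  };
}

// Stand-in "context" that just logs the calls it receives.
const calls = [];
const fakeCtx = {
  fillRect: (...a) => calls.push(['fillRect', a]),
  moveTo:   (...a) => calls.push(['moveTo', a]),
};

const rec = makeRecorder();
rec.record('fillRect', 0, 0, 100, 50);
rec.record('moveTo', 10, 10);
rec.replay(fakeCtx); // the fake context receives both calls, in order
```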
If there were a callback for context loss and the user had set it, a
browser could throw the entire canvas out and ask for it to be re-rendered
when the canvas is shown again. This would even make sense if you don't
have a hardware-accelerated canvas.
There would be no backward compatibility issue either. If the user doesn't
set the callback, a browser would have to do something reasonable to keep
the canvas bitmap around.
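A rough sketch of how such a callback might look from the page's side.
The 'contextlost' / 'contextrestored' event names are hypothetical (only
WebGL defines lost-context events today), and a tiny fake event target
stands in for the canvas element so the flow is self-contained:

```javascript
// Hypothetical generic lost-context callback for a 2D canvas. A minimal
// fake event target replaces the real <canvas> so this runs anywhere.
const listeners = {};
const canvas = {
  addEventListener: (type, fn) => { (listeners[type] ||= []).push(fn); },
  dispatch: (type, event = {}) => (listeners[type] || []).forEach((fn) => fn(event)),
};

let redrawCount = 0;
function redrawScene() { redrawCount += 1; } // re-issue all draw calls here

canvas.addEventListener('contextlost', (event) => {
  // Setting a flag (preventDefault() in the WebGL model) tells the browser
  // the page will redraw itself, so the bitmap can simply be discarded.
  event.handled = true;
});
canvas.addEventListener('contextrestored', () => redrawScene());

// The browser discards the bitmap, then later shows the canvas again:
canvas.dispatch('contextlost');
canvas.dispatch('contextrestored'); // page redraws from scratch
```

If the page never registers a handler, the browser falls back to keeping
the bitmap around, matching the backward-compatible behaviour described
above.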