[whatwg] Proposal for separating script downloads and execution

Kyle Simpson getify at gmail.com
Thu Feb 17 08:39:55 PST 2011

> The problem with prefetching immediately on src set is that you have no 
> idea when or whether the node will get inserted.  So you have to keep the 
> data alive... for how long?  From a user's point of view, that's a memory 
> leak pure and simple, if the node never gets inserted into the DOM.

Memory leak in the sense that the page is holding onto more memory than it 
*potentially* needs, yes. But not a memory leak in the sense that the memory 
sticks around after the page unloads/reloads, right? I don't know that I'd 
call that a "memory leak" so much as "higher memory utilization", or maybe 
"potential memory waste".

How much memory does a 25k JavaScript file take up while sitting in this 
queue? Is it roughly 25k, or a lot more? Compared to the 100-300 MB of 
memory that a Firefox instance takes up on my computer, by what percentage 
would that footprint grow if Firefox were also holding onto even a large 
amount (~200k) of not-yet-executed JavaScript code in memory?
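As a back-of-envelope answer to that question (illustrative only; the actual in-memory representation of a script may well be larger than its raw byte size, as asked above):

```javascript
// Rough figures from the paragraph above: ~200k of preloaded script
// text versus the low end of a 100-300 MB browser footprint.
var scriptBytes = 200 * 1024;          // ~200k of not-yet-executed script
var browserBytes = 100 * 1024 * 1024;  // 100 MB, the low estimate
var pct = (scriptBytes / browserBytes) * 100;
console.log(pct.toFixed(2) + '%');     // well under 1% even at the low end
```

Even if the in-memory cost were several times the raw file size, it would remain a small fraction of the overall browser footprint.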

Also, we have to consider whether developers' intended use of this feature 
is to needlessly waste bandwidth and memory on scripts that never run, or a 
good-faith preload of scripts they will eventually use. Does that mean 
there will never be any memory waste? No. But I don't think it'll be the 
norm, at least judging by the interested parties in this discussion.

I'd venture to guess that right now there's a pretty small amount of code 
out there creating script elements en masse without ever appending them to 
the DOM. I can't really imagine what that use-case would be (aside from the 
preloading we're discussing). The likelihood is that the majority of tools 
adopting this new technique would be doing so intentionally, with the clear 
intent to execute that script later.
