[whatwg] Proposal for separating script downloads and execution
bzbarsky at MIT.EDU
Thu Feb 17 10:18:01 PST 2011
On 2/17/11 12:23 PM, Kyle Simpson wrote:
>> My worries are cases where a page inadvertently makes you hold on to
>> tens or hundreds of megabytes of js, not about the 200k case.
> being loaded onto pages? Even "tens of megabytes" seems quite unlikely.
Think 10,000 <script> elements all pointing to the same 25KB script. If
you're forced to preload the script at src-set time, that's 250MB of data.
And if the argument is that the scripts can share the data, I don't see
what guarantees that. Doing that while obeying HTTP semantics might be
pretty difficult if you don't have very low-level control over your
network layer.
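To make the concern above concrete, here is a back-of-the-envelope sketch of the worst case being described: N <script> elements all pointing at the same resource, each pinning its own copy of the bytes because the preload fires at src-set time. The function name and unit convention are illustrative, not anything from the thread:

```javascript
// Worst-case memory if preloaded bytes are NOT shared between the
// script elements that reference the same URL.
function worstCasePreloadMB(nodeCount, scriptKB) {
  // Decimal units (1 MB = 1000 KB), matching the figures in the thread.
  return (nodeCount * scriptKB) / 1000;
}

console.log(worstCasePreloadMB(10000, 25)); // 10,000 nodes x 25KB -> 250
```

Whether an implementation can share the bytes across nodes while still obeying HTTP semantics (revalidation, Vary, no-cache) is exactly the open question in the paragraph above.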
> but somehow they avoided that flaw in IE, where it should be killing them.
Who says they avoided it in IE?
Who says they're running the same code in IE and other browsers, for
that matter?
>> You're assuming scripts mean to do everything they do. That's not a
>> good assumption, unfortunately.
> Here's what I'm assuming: more often than not, this feature will be used
> responsibly.
Sure. That doesn't mean we shouldn't worry about the edge cases. It
might be we decide to ignore them, after careful consideration. But we
should consider them.
> I haven't seen any examples of existing sites where the "millions of
> script nodes" phenomenon is happening right now, which would be potential
> landmines for this newly suggested "preloading" functionality. The fear
> of it being theoretically possible seems much more intense than any
> evidence or logical reasoning for it being probable.
I've seen sites creating tens of thousands of script nodes, certainly,
and then screwing something up (I know the latter because we got bug
reports about it). If I find the relevant bugs I'll post them here, but
honestly that's a low priority for me right now.
> I also am on record as saying that I think it's a bad idea to avoid a
> useful (to some) feature for fears that others (probably the minority)
> will abuse or misuse it.
Yes, and I'm on record saying that I need to think about my users and
protecting them from the minority of incompetent or malicious web
developers. We just have slightly different goals here.
> To my knowledge, that process worked ok, and I think it's a decent model for going forward.
Just to be clear, that process, on our end, was a huge engineering-time
sink. Several man-months were wasted on it. We would very much like to
avoid having to repeat that experience if at all possible.
And yes, as you said, it worked "ok" in the sense that sites were fixed.
It could have gone much worse.
> Unless someone can show that the majority of sites (or even just
> something greater than a tiny fraction) are going to choke. And if that
> can be shown, I welcome it. But I'd also be extremely curious as to how
> those same sites are (probably) surviving fine in IE, which has been
> doing this preloading for a decade or more.
Does IE obey HTTP semantics for the preloads? Has anyone done some
really careful testing of IE's actual behavior here?