[whatwg] Proposal for separating script downloads and execution
Nils Dagsson Moskopp
nils at dieweltistgarnichtso.net
Tue Feb 8 22:40:56 PST 2011
"Kyle Simpson" <getify at gmail.com> schrieb am Tue, 8 Feb 2011 23:27:35
> implemented (and if the code is significantly different between
> mobile and desktop). But I can say that even if their code is
> substantially the same, I could still see it quite plausible that the
> device itself locks up (not the browser) if there's just simply too
> much going, taxing its limited CPU power. Heck, I have times when my
> powerhouse desktop freezes for a brief moment when I have a lot going
> on. Exhausting the CPU is not a difficult thing to imagine happening
> on these tiny devices.
Anecdotal evidence is not data. Also, today these “tiny devices”
usually have some serious processing power.
> I can also see it quite plausible that mobile OS's are not as capable
> of taking advantage of multi-threading (maybe intentionally forbidden
> from it in many instances, for fear of battery life degradation).
I cannot follow you here.
> Perhaps it's simply not possible to multi-thread the parsing of
> really am completely guessing here)
Mighty conjecture, chap. Multithreading is even possible on
microcontrollers like the Atmel ATmega32 — so why should a modern
operating system running on reasonable hardware not be able to do it?
>, then it's not exactly a "quality
> concerned, but more an issue of how the mobile OS is designed and
> integrated with the device hardware.
It would still be a quality of implementation issue. Mobile operating
systems are constantly evolving, too — just look at Android.
> Regardless, considering such
> things is way outside the scope of anything that's going to be useful
> for web developers in the near-term dealing with these use-cases.
“X is possible and one cannot know that X is not true, so X is true.” is
a common sceptic proposition, but not a sound argument.
> Even if you're right and the fault really lies with the implementor
> for this discussion to go down. No matter how good the mobile
> such a way as to overload the device. That is a virtual certainty.
In related news, some current desktop browsers choke on the single-page
version of the HTML spec while handling a vast number of other pages
just fine. So, no matter how good the desktop engines get, I promise
you … that efforts to simply speed things up will not be fruitless for
the vast majority of pages.
> I don't want to cause browsers to be less performant or hold them
> back from improving. I want to help developers have an option to
> increase performance in those cases where the browser's automatic
> processes to do so happens to fall short. I believe there must be a
> way to achieve both goals simultaneously.
Interesting, and probably workable if you keep it to simple hints. But
whatever more complex heuristics developers come up with should
probably end up in browsers — making the web faster for everyone.
Or am I missing something here?
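As a concrete illustration of the kind of simple, author-side pattern under discussion, here is a rough sketch of download-now, parse-and-execute-later. All names here (`deferredScript`, `fetchText`) are hypothetical, not part of any proposal or browser; `fetchText` stands in for a same-origin XMLHttpRequest, and the `Function` constructor is used so that parsing happens only when the script is first run.

```javascript
// Sketch only: illustrative API, not an actual proposal.
function deferredScript(fetchText, url) {
  var source = null, compiled = null;
  fetchText(url, function (text) { source = text; }); // download only
  return {
    run: function () {
      if (compiled === null) {
        compiled = new Function(source); // parsing happens here, on demand
      }
      return compiled(); // execution
    }
  };
}

// Usage with a stubbed fetch, standing in for a same-origin XHR:
var script = deferredScript(function (url, cb) { cb('return 6 * 7;'); }, 'app.js');
console.log(script.run()); // 42
```

Whether engines actually defer the parse until `new Function` is called is, of course, exactly the quality-of-implementation question at issue.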
> What's VERY important to note: (perhaps) the most critical part of
> user-experience satisfaction in web page interaction is the *initial*
> page-load experience. So if it's a tradeoff where I can get my
> page-load to go much quicker on a mobile device (and get some useful
> content in front of them quickly) in exchange for some lag later in
> the lifetime of the page, that's a choice I (and many other devs) are
> likely to want to make. Regardless of wanting freedom of
> implementation, no browser/engine implementation should fight
> against/resist the efforts of a web author to streamline initial
> page-load performance.
Fun fact: I use mobile versions of some web sites because they are much
quicker, even on the desktop. Sometimes a little minimalism can go a
long way.
> Presumably, if an author is taking the extraordinary steps to wire up
> advanced functionality like deferred execution (especially
> negotiating that with several scripts), they are doing so
> intentionally to improve performance, and so if they ended up
> actually doing the reverse, and killing their performance to an
> unacceptable level, they'd see that quickly, and back-track. It'd be
> silly and unlikely to think they'd go to the extra trouble to
> actually worsen their performance compared to before.
Counter-intuitive at first, but true: more complex code is not
necessarily faster code. More options are more options to screw up.
> Really, let's not always assume the worst about web authors. I
> believe in giving them appropriate tools to inspire them to do the
> best. If they do it wrongly and their users suffer, bad on them, not
> on the rest of us. That's not an excuse for recklessly poor
> implementation of features, but it IS a call for giving some benefit
> of the doubt from time to time.
Why do you hate formal evidence?
> > In other words, forbid the browser to start parsing the script,
> > right? How would you tell whether a browser implemented this as
> > specified?
> I could tell if the browser let me control execution pretty easily.
> As for telling if the browser were deferring parsing to a useful
> degree, I'm sure that the only way to actually determine that would
> be test a site with a particularly long script file (length being a
> rough approximation for parsing time), and see just when (between
> load and execution) the big parsing delay (as observed previously)
> was happening. If the lock up still happened during page-load right
> after the script loads, even though execution was specifically
> deferred, that would seem to be the browser being stubborn and
Browsers do not like to be anthropomorphised. ;)
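For what it is worth, the parse/execute split that the quoted test describes can be timed even outside a page. The following sketch (my own, not from the proposal) relies on the `Function` constructor compiling its body without running it; the length of the body is a rough proxy for parse time, as suggested above, and the timings are crude and engine-dependent.

```javascript
// Build an artificially long script body; length roughly tracks parse cost.
var body = 'var a = 0;';
for (var i = 0; i < 100000; i++) { body += 'a += 1;'; }
body += 'return a;';

var t0 = Date.now();
var fn = new Function(body); // compiled (parsed) here, not executed
var t1 = Date.now();
var result = fn();           // executed here
var t2 = Date.now();

console.log('parse ms:', t1 - t0, 'execute ms:', t2 - t1);
console.log(result); // 100000
```

If a browser still locked up right after such a script finished downloading, despite execution being deferred, that would indeed point at eager parsing.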
Nils Dagsson Moskopp // erlehmann