Browser Page Load Performance

Steve Souders is currently doing more to improve the performance of web pages and web browsers than anyone else out there. While at Yahoo! he was responsible for YSlow (a great tool for finding ways to improve the performance of your site) and he wrote the book on improving page performance: High Performance Web Sites. Now he works at Google, but much of what he's up to is the same: making web pages load faster.

I've been really excited about one of his recent releases: UA Profiler. The profiler is a tool you run in your browser to check the status of a number of network-specific features that have a heavy impact on browser page load performance.

Here’s a look at the current breakdown:

We can see Firefox 3.1 taking the lead, fixing 9 of the 11 issues tested for. Firefox 3, Chrome, and Safari 4 follow with 8 fixed; Firefox 2, Safari 3.1, and IE 8 are next at 7. Those numbers give you an overall feel for the page load performance you'll see in a browser. (Naturally these tests don't take rendering or JavaScript performance into account, but network time generally dominates their total runtime anyway.)

Information about network performance is important for two reasons:

  1. It informs browser vendors as to the quality of their browser. Fixing any of the points specified by the tests will yield faster page loads.
  2. It informs web site developers as to the problems they should take into consideration when developing a site. For example, if a browser they support doesn't handle simultaneous stylesheet downloading, perhaps the page should be reworked.

The tests themselves can be broken down into a few categories (Steve explains them all in detail in the FAQ):

Network Connections

Two big things are tested here: the number of simultaneous connections that can be opened to a single hostname (sub-domains count as different hostnames) and how many connections can be opened across any number of hostnames at once. These numbers are a good indicator of how many parallel downloads can occur (most commonly seen when downloading multiple images simultaneously).
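As a rough illustration (a back-of-the-envelope model, not an actual browser API), if a browser opens at most C connections per hostname, then N resources on one hostname download in ceil(N / C) sequential "waves" — which is why splitting resources across sub-domains can speed things up:

```javascript
// Back-of-the-envelope model of per-hostname connection limits: with at
// most `connectionsPerHost` parallel connections, `resourceCount` resources
// on a single hostname download in this many sequential "waves".
function downloadWaves(resourceCount, connectionsPerHost) {
  return Math.ceil(resourceCount / connectionsPerHost);
}

// 20 images served from one hostname:
console.log(downloadWaves(20, 2)); // 10 waves at 2 connections per host
console.log(downloadWaves(20, 6)); // 4 waves at 6 connections per host

// The same 20 images split evenly across two hostnames (2 connections each):
console.log(downloadWaves(10, 2)); // 5 waves per hostname, run in parallel
```

This ignores connection setup cost and bandwidth contention, but it captures why both numbers the profiler measures matter.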

Additionally, there's a check to see if the browser supports Gzip compression. The results aren't too exciting here, as all modern browsers support it at this point.

Parallel Downloads

All browsers are capable of downloading images in parallel (multiple images downloading simultaneously) but what about other resources (like scripts or stylesheets)?

Unfortunately it's much harder to get scripts and stylesheets to load in parallel, since their contents may dramatically change the rest of the page. The loading of these resources occurs in three steps:

  1. Downloading (can be parallelized)
  2. Parsing
  3. Execution

The load order breaks down like so (sort of an advanced game of rock-paper-scissors): scripts prevent other scripts from parsing and executing; stylesheets prevent scripts from parsing and executing.

It's been hard for browsers to parallelize script downloading since scripts are capable of changing the contents of the page (possibly removing, or adding, new scripts or stylesheets). Because of this, browsers are starting to get better at opportunistically looking ahead in the document and pre-loading stylesheets and scripts, even if their actual use may be delayed.
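One workaround page authors have used (and which script loaders of this era are built on) is injecting the script element from JavaScript, so its download doesn't block the HTML parser. A minimal sketch, with the document passed in as a parameter so the helper can be exercised outside a browser (the script URL is a placeholder):

```javascript
// Dynamic script injection: a script element created from JavaScript
// downloads without blocking the parser, at the cost of losing guaranteed
// execution order in some browsers.
function loadScriptAsync(doc, src) {
  var script = doc.createElement("script");
  script.src = src;
  doc.getElementsByTagName("head")[0].appendChild(script);
  return script;
}

// In a browser: loadScriptAsync(document, "/js/non-critical.js");
```

The trade-off is exactly the one described above: the browser can no longer assume the script's output affects the markup that follows it, so it's only safe for scripts that don't document.write into the page.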

Going forward, changes in this area will yield some of the largest gains in browser page load performance, as it's still one of the most untapped areas of improvement.


Redirect Caching

While all modern browsers support caching of resources, caching of page redirects is much less common. For example, consider the case where a user types in "google.com": Google redirects the user to "www.google.com", but only a couple of browsers cache that redirect so they don't have to repeat it later.

A similar case of redirect caching occurs for resources such as stylesheets, images, and scripts. Since these redirects occur much more frequently, it becomes that much more important for browsers to cache every one that they can.
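Note that a browser can only cache a redirect the server permits it to. A sketch of what a cache-friendly permanent redirect's response headers might look like (the URL and lifetime here are hypothetical):

```http
HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/
Cache-Control: public, max-age=86400
```

A 301 with explicit caching headers like these gives the browser everything it needs to skip the round-trip next time.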


Link Prefetching

This is part of the HTML 5 specification and allows a page to specify resources that should be opportunistically downloaded in case they're needed in the future (image rollovers being the classic example).

There's a full page describing how to use it on the Mozilla developer wiki, but it isn't hard to get started. It's as simple as including a new link element at the top of your page:

<link rel="prefetch" href="/images/big.jpeg">

And that resource will be downloaded preemptively.
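The same hint can also be added from script, for instance after the current page finishes loading, so the prefetch never competes with resources the page actually needs. A sketch, with the document passed in as a parameter so it can be exercised outside a browser (the helper name is my own):

```javascript
// Append a <link rel="prefetch"> element at runtime.
function addPrefetchHint(doc, href) {
  var link = doc.createElement("link");
  link.rel = "prefetch";
  link.href = href;
  doc.getElementsByTagName("head")[0].appendChild(link);
  return link;
}

// In a browser: addPrefetchHint(document, "/images/big.jpeg");
```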

Inline Images

The final case the profiler tests for is whether a browser supports inline images via a data: URI. Data URIs give developers the ability to include image data directly within the page itself. While this saves an extra HTTP request, it's important to note that the resource won't be cached (at least not as an external resource; it may be cached as part of the complete page). The usefulness of this technique will vary on a case-by-case basis, but having browser support for it is absolutely important.
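For the curious, a data: URI is just the resource's MIME type plus its base64-encoded bytes. A minimal sketch of assembling one (the payload below is the well-known 1×1 transparent GIF):

```javascript
// Build a data: URI from a MIME type and base64-encoded resource bytes.
function toDataURI(mimeType, base64Data) {
  return "data:" + mimeType + ";base64," + base64Data;
}

// 1x1 transparent GIF, base64-encoded:
var pixel = "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7";
var uri = toDataURI("image/gif", pixel);
// Used inline: <img src="data:image/gif;base64,R0lGOD...">
```

Because the bytes live in the HTML, the image's size is paid on every page load where it appears, which is part of why the technique is case-by-case.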

Going forward it will become increasingly important to have publicly-visible tests like UA Profiler that encourage browser vendors to act more quickly in implementing critical browser functionality. Anything that's able to, even indirectly, improve the performance of the browsing experience for users of the web is absolutely critical, in my book.

Posted: November 24th, 2008

