Steve Souders is currently doing more to improve the performance of web pages and web browsers than anyone else out there. When he worked at Yahoo! he was responsible for YSlow (a great tool for measuring ways to improve the performance of your site) and he wrote the book on improving page performance: High Performance Web Sites. Now he works for Google, but much of what he's up to is the same: making web pages load faster.
I've been really excited about one of his recent project releases: UA Profiler. UA Profiler is a tool that you run in your browser to test a number of network-specific features that heavily influence how quickly a browser loads pages.
Information about network performance is important for two reasons:
- It informs browser vendors as to the quality of their browser. Fixing any of the points tested will yield faster page loads in that browser.
- It informs web site developers as to the problems they should take into consideration when developing a site. For example, if a browser they support doesn't handle simultaneous stylesheet downloading, perhaps their page should be reworked.
The tests themselves can be broken down into a couple of categories (Steve explains them all in detail in the FAQ):
Two big things are tested here: the number of simultaneous connections that can be opened to a single hostname (sub-domains count as different hostnames) and how many connections can be opened, simultaneously, across any number of hostnames. These numbers give you a good indicator of how many parallel downloads can occur (most commonly seen when downloading multiple images simultaneously).
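Because sub-domains count as separate hostnames, one common workaround is to shard static resources across hostnames so that each one gets its own connection limit. A minimal sketch (the `static1`/`static2` hostnames are hypothetical):

```html
<!-- Each hostname counts against its own per-hostname connection limit,
     so these images can download in parallel even in browsers with a
     low per-hostname cap -->
<img src="http://static1.example.com/images/photo-1.jpg" alt="">
<img src="http://static2.example.com/images/photo-2.jpg" alt="">
```

The trade-off is an extra DNS lookup per hostname, so sharding only pays off past a handful of resources.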
Additionally, there is a check to see if the browser supports Gzip compression. The results aren't too exciting here, as all modern browsers support Gzip compression at this point.
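Gzip support is negotiated with standard HTTP headers: the browser advertises the encodings it accepts and the server compresses the response accordingly. A sketch of the exchange (headers trimmed for brevity):

```
GET /page.html HTTP/1.1
Host: example.com
Accept-Encoding: gzip

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
```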
All browsers are capable of downloading images in parallel (multiple images downloading simultaneously) but what about other resources (like scripts or stylesheets)?
Unfortunately it's much harder to get scripts and stylesheets to load in parallel since their contents may dramatically change the rest of the page. The loading of these resources occurs in three steps:
- Downloading (can be parallelized)
- Parsing
- Executing
The load order breaks down like so (sort of an advanced game of rock-paper-scissors): scripts prevent other scripts from parsing and executing; stylesheets prevent scripts from parsing and executing.
It's been hard for browsers to implement the parallelization of script downloading since scripts are capable of changing the contents of the page, possibly removing or adding new scripts or stylesheets. Because of this, browsers are starting to get better at opportunistically looking ahead in the document and pre-loading stylesheets and scripts, even if their actual use may be delayed.
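One practical consequence of these blocking rules: put stylesheets in the head and scripts at the end of the body, so the page content can render before any script blocks it. A sketch (the file names are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Stylesheets up front: the page can't render correctly without them -->
  <link rel="stylesheet" href="/css/site.css">
</head>
<body>
  <p>This content renders before the script below downloads and executes.</p>
  <!-- Scripts last: nothing after them is left waiting while they block -->
  <script src="/js/site.js"></script>
</body>
</html>
```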
Changes in this area will yield some of the largest benefits to browser page load performance, going forward, as it’s still one of the most untapped areas of improvement.
While all modern browsers support caching of resources, caching of page redirects is much less common. For example, consider the case where a user types in "http://google.com/" – Google redirects the user to "http://www.google.com/", but only a couple of browsers cache that redirect so as not to repeat it later.
A similar case of redirect caching occurs for resources such as stylesheets, images, or scripts. Since these occur much more frequently, it becomes that much more important for browsers to cache every action that they can.
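For browsers that do cache redirects, standard HTTP caching headers control how long the cached redirect is honored. A sketch of a cacheable redirect (the exact headers are illustrative):

```
GET / HTTP/1.1
Host: example.com

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/
Cache-Control: max-age=86400
```

A browser that caches this response can jump straight to the new location on subsequent visits instead of paying for the round trip again.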
This is part of the HTML 5 specification and allows pages to specify resources that should be opportunistically downloaded in case they're used in the future (image rollovers are the common example). Add a link like this:
<link rel="prefetch" href="/images/big.jpeg">
And that resource will be downloaded preemptively.
The final case that the profiler tests for is a browser's ability to support inline images using a data: URI. Data URIs give developers the ability to include image data directly within the page itself. While this saves an extra HTTP request, it's important to note that the resource will not be cached (at least not as an external resource – it may be cached as part of the complete page). The usefulness of this technique will vary on a case-by-case basis, but having browser support for it is important.
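As a sketch, here is what an inline image looks like, using the commonly seen base64-encoded 1×1 transparent GIF:

```html
<!-- The image bytes are embedded directly in the markup;
     no extra HTTP request is made to fetch them -->
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" alt="">
```

Since the data travels with the page, this works best for small images; large ones bloat the HTML and forfeit independent caching.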
Going forward it will become increasingly important to have publicly-visible tests, like the UA Profiler, that are able to encourage browser vendors to act more quickly in implementing critical browser functionality. Anything that's able to, even indirectly, improve the performance of the browsing experience for users of the web is absolutely critical, in my book.