Optimising web page access speed

Posted by Emmanuel in web development on March 26, 2009 – 18:00

Thanks to high-speed internet networks and efficient caching techniques, web pages can now be transmitted around the world in a fraction of a second. However, the spread of Web 2.0 technology over the past few years has generalised the extensive use of large CSS sheets and AJAX code, and web pages that rely on several hundred kilobytes of JavaScript code are not uncommon these days. Without care, a web site with rich, dynamic content can easily generate waiting times of several seconds for every page accessed, even over a high-speed connection. Despite all its bells and whistles, browsing such a web site may be a rather frustrating experience.

Fortunately, there are ways to optimise web-page access efficiency. A good start is to use a network performance measurement tool; several browser plug-ins exist for this purpose. On Firefox, the combination of Firebug and YSlow will give you a wealth of statistics each time you access a web page, and provide hints about which parts of the served content require the most attention. Leaving aside server-side optimisation of dynamic content generation and cache management, one way to significantly reduce access time is to compress web-page content, especially CSS sheets and JavaScript programs. Both are plain-text files and may benefit greatly from compression. Compression can be achieved through

  • “minification”, which consists of making the code more compact by stripping unnecessary characters and shortening local symbol names. Popular offline minification tools include JSMin and Yahoo’s YUI Compressor. Both do a great job of reducing file sizes (gains of 30% are not uncommon). The processed files are almost unreadable, but otherwise perform exactly the same. To avoid maintaining both uncompressed and compressed versions of the same code, on-the-fly minification tools have been written in PHP: see for instance JSMin-php or Minify.
  • generic file compression. The ability to send compressed content has been part of the HTTP protocol since 1999 (HTTP/1.1). Web servers can compress content on the fly before sending it to browsers that support it. For content consisting mostly of ASCII text ((X)HTML, CSS or JavaScript), the reduction in file size, and therefore the effective gain in bandwidth, can be large: a compression ratio of 5× is pretty typical. Compression and decompression put extra load on servers and clients, but with modern processors compression is generally beneficial, at least for web sites under moderate load. Problems with compressed content seem to occur with some older browsers and proxies. Fortunately, server compression modules such as Apache2’s mod_deflate can be configured precisely to compress specific file types for specific clients.
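As an illustration of the mod_deflate configuration mentioned above, a minimal Apache 2 setup along these lines enables compression for text content while excluding the older browsers known to mishandle it (the directives are standard mod_deflate/BrowserMatch ones; the exact list of MIME types to compress is a per-site choice):

```apache
# Compress common text formats on the fly
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript

# Work around known problems with some old Netscape 4.x releases
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape but handles gzip fine
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```

Images (JPEG, PNG, …) are deliberately left out: they are already compressed, and running them through DEFLATE wastes CPU for no gain.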
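To get a feel for the compression ratios quoted above, a quick sketch using Python’s standard gzip module on some repetitive, CSS-like text (the style rules below are made up for the illustration; real style sheets compress similarly well thanks to repeated property names and selectors):

```python
import gzip

# Made-up, repetitive ASCII content standing in for a CSS sheet.
css = ("body { margin: 0; padding: 0; font-family: sans-serif; }\n"
       ".menu a { color: #336699; text-decoration: none; }\n") * 200

raw = css.encode("ascii")        # bytes as they would be sent uncompressed
packed = gzip.compress(raw)      # bytes after on-the-fly gzip compression

print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes, "
      f"ratio: {len(raw) / len(packed):.1f}x")
```

Highly repetitive text like this compresses far better than the 5× typical of real pages, but the principle is the same: the server trades a little CPU time for a much smaller payload on the wire.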

Applied to the AstrOmatic website, both compression techniques reduce the total number of bytes sent per page (before caching!) from an average of 800 kB to a more reasonable 400 kB (including 310 kB of image data 😆).
