Monday, November 15, 2010

Strategy: Biggest Performance Impact is to Reduce the Number of HTTP Requests

Low Cost, High Performance, Strong Security: Pick Any Three by Chris Palmer is a funny and informative presentation whose main message is: reduce the size and frequency of network communications, which will make your pages load faster, which will improve performance enough that you can use HTTPS all the time, which will make you safe and secure on-line, which is a good thing.

The security benefits of HTTPS are overwhelming, but people are afraid of the performance hit. Palmer makes a convincing case that the overhead of HTTPS is low enough that you can afford it if you do some basic optimization, and reducing the number of HTTP requests is a good source of low-hanging fruit.

From the Yahoo UI Blog:

Reducing the number of HTTP requests has the biggest impact on reducing response time and is often the easiest performance improvement to make.

From the Experience of Gmail:

…we found that there were between fourteen and twenty-four HTTP requests required to load an inbox… it now takes as few as four requests from the click of the “Sign in” button to the display of your inbox.

So, design coarser-grained services where more of the functionality is on the server side than the client side. This reduces the latency associated with network traffic and increases performance. More service, less REST?
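
A rough sketch of the idea (the endpoint, helper names, and data below are hypothetical, not Gmail's actual API): one coarse-grained /inbox endpoint returns everything the page needs in a single round trip, instead of the client issuing separate fine-grained requests for messages, labels, and contacts.

```python
# Hypothetical sketch of a coarse-grained endpoint, using only the
# Python standard library. Endpoint and helper names are made up.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def load_messages():   # stand-ins for whatever backend calls already exist
    return [{"id": 1, "subject": "Hello"}]

def load_labels():
    return ["inbox", "starred"]

def load_contacts():
    return [{"name": "Alice"}]

class InboxHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/inbox":
            # One response carries everything the page needs to render,
            # instead of separate /messages + /labels + /contacts requests.
            payload = json.dumps({
                "messages": load_messages(),
                "labels": load_labels(),
                "contacts": load_contacts(),
            }).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), InboxHandler).serve_forever()
```

The client now pays for one round trip of latency instead of three, which is exactly the kind of reduction the Gmail team describes.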

Other Suggestions for Reducing Network Traffic:

  • DON’T have giant cookies, giant request parameters (e.g. .NET ViewState). 
  • DO compress responses (gzip, deflate); a small sketch of this and the caching item follows the list. 
  • DO minify HTML, CSS, and JS. 
  • DO use sprites. DO compress images at the right compression level, and DO use the right compression algorithm for the job. 
  • DO maximize caching.
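
To make the compression and caching items concrete, here is a minimal sketch using only Python's standard library. The file name, content, and far-future max-age are made up for illustration; in production this work is usually delegated to nginx/Apache or your framework.

```python
# Minimal illustrative sketch: gzip a response when the client accepts it,
# and send a long-lived Cache-Control header so repeat visits skip the
# request entirely. Not a production server.
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

CSS = b"body { margin: 0; } /* imagine a large, minified stylesheet here */"

class StaticHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/site.css":
            body = CSS
            headers = [("Content-Type", "text/css"),
                       # Far-future caching: the browser won't re-request
                       # this file until the URL changes (e.g. /site.v2.css).
                       ("Cache-Control", "public, max-age=31536000")]
            # Only compress if the client said it understands gzip.
            if "gzip" in self.headers.get("Accept-Encoding", ""):
                body = gzip.compress(body)
                headers.append(("Content-Encoding", "gzip"))
            self.send_response(200)
            for name, value in headers:
                self.send_header(name, value)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), StaticHandler).serve_forever()
```

Compression shrinks the bytes on the wire; the cache header removes the repeat request altogether, which is the bigger win.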

Reader Comments (3)

[...] Today Todd Hoff from highscalability.com posted an entry in his journal that has some really awesome tips for improving performance. [...]
increase performance by limiting http requests

Awesome read. I am really enjoying your blog. Keep up the good work!

November 15, 2010 | Unregistered Commenter tomfmason

And how do we reconcile content websites served via https to logged-in users with advertising, which is mostly served from non-https sources? Some browsers would throw a warning dialog for every ad on a page. Or third-party tracking such as ComScore or Nielsen?

What if you have an https version for logged-in users, but implement an automatic login based on a persistent cookie/token? If the user's first connection is to the http version of the site (the one accessible to non-logged-in users), wouldn't such a token be sent in the clear?

Lots of things to think about...

November 15, 2010 | Unregistered Commenter M Freitas

Steve Souders (a former Yahoo and current Googler) wrote a book about this:
http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309/
The YUI blog suggestions come out of the work Steve and his team did while he was at Yahoo.
YSlow and Google's Page Speed are also part of Steve's work.

November 16, 2010 | Unregistered Commenter Chris
