Download speed used to be one of the ways you could tell a real web pro from a graphic designer who knew how to make things pretty. One of our exercises used to be building pages in a single table (this predated CSS), carefully spanning rows and columns to move content around: a pain to develop, but fast to render, since Netscape’s browser used to be terrible at “embedded tables.” Obviously, this is archaic (along with worrying about 28.8k modem download speeds), but the concept of a page that is fast to download and fast to render never went away. When Google officially announced that it would take page load time into account, people finally started to pay attention. One of the best starting points is Yahoo’s Performance Guide.
That said, the process of making a website fast is pretty straightforward:
- Home page and other entry points: keep them VERY fast and simple
- Limit third-party items that might cause delays via DNS lookups or downloads
- Prevent anything that can get VERY slow from being on these pages
The first two get the most attention, but #3 is potentially the biggest impact and the most ignored. For example, adding gzip to your server cuts file transmission size; that saves time, it’s nice, and it can turn an 80 ms load into a 20 ms load, but #3 is where page loads can go from 100 ms to 10–15 seconds. If you query the database to build your navigation, your navigation is easier to manage, but one hiccup at the database level (say, a lock on that table) and your site hangs while loading. A solution like Memcached moves your “read only” data out of the database and into RAM. You can still manage it in the database, and the site will pick up changes relatively quickly, but there is no reason to consult the database on every request for information that changes infrequently.
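The idea above is usually called the cache-aside pattern. Here is a minimal sketch of it in Python, using an in-process dict with a time-to-live to stand in for Memcached; `load_nav_from_db` and the menu items are hypothetical placeholders for your real navigation query:

```python
import time

def load_nav_from_db():
    # Hypothetical stand-in for the real database query that builds
    # your navigation. On a live site this is the call that can hang
    # when the table is locked.
    return ["Home", "Products", "About"]

class TTLCache:
    """Cache-aside with a time-to-live: keep read-mostly data in RAM,
    refreshing from the slow source only after it expires."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key, loader):
        value, expires = self._store.get(key, (None, 0.0))
        if time.monotonic() < expires:
            return value  # fast path: no database round-trip
        value = loader()  # slow path: consult the database once
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

cache = TTLCache(ttl_seconds=300)
nav = cache.get("site_nav", load_nav_from_db)        # first call hits the "DB"
nav_again = cache.get("site_nav", load_nav_from_db)  # served from RAM
```

With Memcached the dict simply lives in a shared daemon instead of your process, so every web server sees the same cached copy; the pattern is the same, and stale data is bounded by the TTL you pick.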
Getting the 50%–200% improvements is great, but a real focus on the few things that can explode out of control will serve you better in the long run.