In April 2010, Google announced that it would begin using site speed as a metric in determining site ranking. While this no doubt upset a great number of people who have spent countless days tweaking their site’s wording and links for better SEO, it makes a great deal of sense. How many times have you found yourself hitting the “Back” button to go to an alternative site because a page wouldn’t load? Google is in the business of helping people find what they are looking for; if people are not finding what Google gave them as an answer, then that answer is useless. If Google were to persist in providing poor answers, soon Google wouldn’t exist. Its solution? Reduce the ranking (the value) of poorly performing sites.
Let’s say you’re not a big company and can’t afford all new hardware. How are you supposed to compete with those who can just throw money at the problem? Just like in most “David and Goliath” scenarios: you out-maneuver and out-think your opponents. Matt Cutts, the head of Google’s Webspam team, had this to say about this very question:
I want to pre-debunk another misconception, which is that this change will somehow help “big sites” who can afford to pay more for hosting. In my experience, small sites can often react and respond faster than large companies to changes on the web. Often even a little bit of work can make big differences for site speed. So I think the average smaller web site can really benefit from this change, because a smaller website can often implement the best practices that speed up a site more easily than a larger organization that might move slower or be hindered by bureaucracy.
The time spent generating the HTML document affects overall latency, but for most Web sites this back-end time is dwarfed by the amount of time spent on the front end. If the goal is to speed up the user experience, the place to focus is on the front end. Given this new focus, the next step is to identify best practices for improving front-end performance.
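You can see this split for yourself with a small measurement script. The sketch below (hypothetical code, not from the article; the URL is a placeholder) treats time-to-first-byte as a rough proxy for back-end time and the remainder of the fetch as a lower bound on front-end time; a real page spends far more on the front end once images, CSS, scripts, parsing, and rendering are counted.

```python
import time
import urllib.request


def timing_breakdown(url):
    """Fetch a URL and split latency into back-end time (time to first
    byte) and the remaining transfer time. A rough sketch: true
    front-end time also includes parsing, rendering, and every extra
    request for images, stylesheets, and scripts."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)                       # first byte arrives: back-end work is done
        ttfb = time.monotonic() - start
        resp.read()                        # download the rest of the document
        total = time.monotonic() - start
    return ttfb, total


# Example usage (placeholder URL):
# ttfb, total = timing_breakdown("https://example.com/")
# print(f"back end (TTFB): {ttfb:.3f}s, total: {total:.3f}s")
```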
What does that mean? It means that just by making minor adjustments to the programming of your website, you can drastically improve your site’s performance. Here is Steve Souders’ original list of 14 best practices (each is explored in detail in his book):
- Make Fewer HTTP Requests
- Use a Content Delivery Network (CDN)
- Add an Expires Header to outgoing files
- Gzip Web Page Components
- Put Stylesheets at the Top
- Put Scripts at the Bottom
- Avoid CSS Expressions
- Make JavaScript and CSS External
- Reduce DNS Lookups
- Minify JavaScript
- Avoid Redirects
- Remove Duplicate Scripts
- Configure ETags
- Make AJAX Cacheable
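Two of the practices above, gzipping components and adding an Expires header, are easy to verify from the outside. The sketch below (hypothetical helper, not from the article) requests a URL and reports whether the response was compressed and whether it carries far-future caching headers:

```python
import urllib.request


def check_headers(url):
    """Quick diagnostic for two of the best practices: gzip compression
    and an Expires (or Cache-Control max-age) header. A rough check of
    a single response, not a full audit of every page component."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        h = resp.headers
    return {
        "gzipped": h.get("Content-Encoding", "").lower() == "gzip",
        "cacheable": h.get("Expires") is not None
                     or "max-age" in h.get("Cache-Control", ""),
    }


# Example usage (placeholder URL):
# print(check_headers("https://example.com/"))
```

Running this against your own pages is a fast way to spot which of the list’s low-effort wins you are missing.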
Most of these practices require minimal effort to implement; a couple, such as item #2, are a little more involved; and every item you implement will play a part in making your site faster.
It should be noted that when you increase the performance of your website you won’t just help your ranking on Google; your site may also experience the following side effects: 1) increased traffic; 2) increased pageviews; 3) increased sales; 4) increased cash flow. And you might just experience one more side effect: a few fewer sleepless nights.
Tools and Resources
- Google Libraries API
- Google Page Speed Firefox Extension
- Google Webmaster Tools
- Yahoo!’s YSlow Firefox Add-on
- Best Practices for Speeding Up Your Web Site
- High Performance Web Sites - Steve Souders, O’Reilly Media
- Even Faster Web Sites - Steve Souders, O’Reilly Media
- Steve Souders’ Website
- Google Incorporating Site Speed in Search Rankings - Matt Cutts