One of the hardest parts of SEO is knowing what to measure. How do you know if your latest round of changes is working? How do you separate all the noise and interrelated activity? Most would say look at your rankings.
That’s easier said than done. Search results are not fixed: they vary from one visitor to the next, and they are affected by tests search engines may run in regions outside your geographic location. Those tests feed into overall rankings and can help determine whether you land on the first page. So what other statistics should you monitor?
1. Crawl rates for your SEO
Despite the contrarians, I still contend that the amount of time search engine bots spend on your site reflects how your site is viewed. Do bots visit your site because you added links, developed fresh content, or because social indicators signal popularity? It doesn’t matter. Page visits by bots show they’re finding something considered important enough to visit a percentage of your pages on a daily basis. If that percentage increases, your SEO efforts are succeeding. From the first week of November 2010 to the same week a year later, we saw weekly pages crawled increase by 10.5 million. Over the same period, organic traffic increased by 191,633 weekly visitors.
Crawl rates also serve as an early warning for site issues. Introduce bad code that slows response time, and bot crawl rates will reveal the problem before your rankings take a hit. We introduced an error on our site in June 2011. At the time, we didn’t have the Bing bot in our internal reporting, so we missed the error for several weeks. Looking back in Bing Webmaster Tools, however, you can see the drop.
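To make this concrete, here is a minimal Python sketch of how you might count bot page fetches from web server access logs. The log lines, bot names, and log format below are hypothetical examples, not Angie’s List’s actual reporting pipeline:

```python
import re
from collections import Counter

# Hypothetical access-log sample; in practice these lines would be read from
# your web server's log files for a given day or week.
LOG_LINES = [
    '66.249.66.1 - - [05/Nov/2011:10:00:01] "GET /plumbing HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [05/Nov/2011:10:00:05] "GET /roofing HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '157.55.39.1 - - [05/Nov/2011:10:01:02] "GET /plumbing HTTP/1.1" 200 "Mozilla/5.0 (compatible; bingbot/2.0)"',
    '10.0.0.9 - - [05/Nov/2011:10:02:00] "GET /plumbing HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 6.1)"',
]

# User-agent fragments that identify the major search-engine bots.
BOT_PATTERNS = {
    "Googlebot": re.compile(r"Googlebot"),
    "bingbot": re.compile(r"bingbot", re.IGNORECASE),
}

def bot_crawl_counts(lines):
    """Count pages fetched per search-engine bot, keyed off the user-agent string."""
    counts = Counter()
    for line in lines:
        for bot, pattern in BOT_PATTERNS.items():
            if pattern.search(line):
                counts[bot] += 1
    return counts

print(bot_crawl_counts(LOG_LINES))  # Counter({'Googlebot': 2, 'bingbot': 1})
```

Tracked daily, a sudden dip in these counts is exactly the kind of early warning described above.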
2. Keywords driving traffic and SEO
I like to look at the number of different keywords driving traffic to Angieslist.com. It indicates how we rank for long-tail searches and sheds light on overall rankings. An important part of this measurement is removing brand searches: if a user appends your brand name to a query, they were already looking for your site, and that doesn’t qualify as an organic search lead.
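One way to sketch the brand-search filter in Python, using a hypothetical keyword export (the terms, visit counts, and brand tokens below are made up for illustration):

```python
# Hypothetical export of organic search terms (keyword, weekly visits); in
# practice this would come from your analytics package.
SEARCH_TERMS = [
    ("plumber indianapolis", 120),
    ("angie's list plumber", 45),    # brand search: excluded
    ("roof repair reviews", 80),
    ("angieslist", 300),             # brand search: excluded
    ("hvac contractor ratings", 15),
]

# Assumed spellings of the brand to strip out before counting.
BRAND_TOKENS = ("angie's list", "angieslist")

def non_brand_keywords(terms):
    """Return the keywords that do not contain the brand name."""
    return [kw for kw, _ in terms
            if not any(b in kw.lower() for b in BRAND_TOKENS)]

keywords = non_brand_keywords(SEARCH_TERMS)
print(len(keywords))  # 3 distinct non-brand keywords
```

The number worth trending week over week is that final count of distinct non-brand keywords.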
3. Number of entry pages from search
This metric works hand in hand with keyword variety. The more pages that serve as entry points from search engines, the more top-ranked pages you likely have for topic-targeted keywords.
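A minimal sketch of counting search entry pages, assuming a hypothetical list of (landing page, referrer) visit records and a simple referrer check:

```python
# Hypothetical visit records (landing page, referrer URL); real data would come
# from your analytics package or access logs.
VISITS = [
    ("/plumbing/indianapolis", "https://www.google.com/search?q=plumber"),
    ("/roofing/chicago", "https://www.bing.com/search?q=roofer"),
    ("/plumbing/indianapolis", "https://www.google.com/search?q=best+plumber"),
    ("/about", "https://twitter.com/somelink"),
]

# Assumed referrer fragments that mark a visit as coming from search.
SEARCH_DOMAINS = ("google.", "bing.", "yahoo.")

def search_entry_pages(visits):
    """Distinct landing pages whose referrer is a search engine."""
    return {page for page, ref in visits
            if any(d in ref for d in SEARCH_DOMAINS)}

print(len(search_entry_pages(VISITS)))  # 2 distinct entry pages
```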
4. Average page load time
We know page load time matters for both rankings and user experience. At Angie’s List, we have noticed that as page load time increases, crawl rates decline, so we have added a policy of load testing every release to maintain site performance. We are also working through common recommendations to improve overall site performance. Even where the gain is only perceived, we are reordering content to better use the client’s ability to download components simultaneously.
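A release-gate check of the kind described could be sketched like this; the sample timings and the 800 ms budget are assumptions for illustration, not Angie’s List’s actual thresholds:

```python
# Hypothetical (url, response_time_ms) samples, e.g. pulled from server logs
# or a load-testing run against a release candidate.
SAMPLES = [("/", 420), ("/plumbing", 610), ("/roofing", 380), ("/search", 905)]

LOAD_TIME_BUDGET_MS = 800  # assumed per-release performance budget

def average_load_time(samples):
    """Mean response time across the sampled pages, in milliseconds."""
    return sum(t for _, t in samples) / len(samples)

def pages_over_budget(samples, budget=LOAD_TIME_BUDGET_MS):
    """Pages that would fail the release load test."""
    return [url for url, t in samples if t > budget]

print(average_load_time(SAMPLES))  # 578.75
print(pages_over_budget(SAMPLES))  # ['/search']
```

Failing the build when any page exceeds the budget keeps a slow release from ever reaching the bots.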
5. Sitemap status
If the bots are reporting a problem with your sitemap, it’s very likely you have a problem with your site or bad sitemap files. Either situation sends out a bad signal and should be corrected. You can view the status of your sitemap files by logging into Google and Bing Webmaster Tools.
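Beyond checking the webmaster-tools dashboards, you can sanity check the sitemap files themselves. Here is a minimal sketch that flags malformed URLs in a sitemap; the sitemap content is a hypothetical example, and a real check would also fetch each URL and verify it returns HTTP 200:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical sitemap file for illustration.
SITEMAP_XML = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.angieslist.com/</loc></url>
  <url><loc>https://www.angieslist.com/plumbing</loc></url>
  <url><loc>not-a-valid-url</loc></url>
</urlset>"""

# The sitemaps.org schema namespace used by standard sitemap files.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def invalid_sitemap_urls(xml_text):
    """Return <loc> entries that are not absolute http(s) URLs."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.findall("sm:url/sm:loc", NS)]
    return [u for u in locs if not u.startswith(("http://", "https://"))]

print(invalid_sitemap_urls(SITEMAP_XML))  # ['not-a-valid-url']
```

Catching a bad entry in-house means you fix it before the bots report it as a problem.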
6. 404-page count
If the bots are detecting a large quantity of 404 errors, you need to investigate the source. Did you make a recent site structure change? Do you have a mod rewrite error? Do you have an error in your sitemap files with links to bad pages?
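To track down the source, it helps to rank 404s by requested path. A rough Python sketch against hypothetical common-log-format lines (the layout and paths are assumed for illustration):

```python
from collections import Counter

# Hypothetical access-log fragments: request, status code, referrer.
LOG_LINES = [
    '"GET /old-page HTTP/1.1" 404 "https://www.google.com/search?q=x"',
    '"GET /old-page HTTP/1.1" 404 "https://example.com/link"',
    '"GET /plumbing HTTP/1.1" 200 "-"',
    '"GET /missing.css HTTP/1.1" 404 "-"',
]

def top_404s(lines):
    """Count 404 responses per requested path, worst offenders first."""
    counts = Counter()
    for line in lines:
        parts = line.split('"')
        request, status = parts[1], parts[2].strip().split()[0]
        if status == "404":
            counts[request.split()[1]] += 1
    return counts.most_common()

print(top_404s(LOG_LINES))  # [('/old-page', 2), ('/missing.css', 1)]
```

If one path dominates the list, you likely have a single broken rewrite rule or sitemap entry rather than a structural problem.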
You might be asking: what about rankings? We track rankings for our major keyword phrases and brand name searches to make sure we notice any major shifts. However, keyword rankings are a long-term indicator of success, not something that tells you whether there’s a problem with the code you released two days ago. Rankings are important to know, and your C-suite will be most interested in them, but as a developer, they are not a stat I monitor on a daily or even weekly basis.
Chris Carrel is a Senior Software Engineer at Angie’s List. With over 15 years of experience, he has spent the last two focused on SEO.