Why Google SERP CTR studies are a waste of time

June 20, 2011 – 8:09 pm

Historically, one of the most quoted, used and abused SEO metrics was ranking. It is a cornerstone of reports sent to clients, one of the first things we check for clients’ sites and usually the cause of alarmed phone calls when the ranking drops. Over time, SEOs have learned to steer away from rankings as the main reported metric, although I have yet to find someone in the industry who doesn’t check them.


Another exercise we like to engage in is predicting how much traffic ranking in the top 10 will bring to a website. It quickly became obvious that not every position in the top 10 brings the same amount of traffic. It is quite self-evident that positions 1-3 will bring more traffic than positions 7-10, but how much more? The effort involved in pushing a site from #5 to #2 can be significantly larger than the effort required to bring it from #50 to #10, so are the additional traffic and conversions worth it?

To answer that question, people started investigating how traffic divides across the top 10 positions. Several methods were used in these investigations, the two most prominent being eye-tracking and averaging click-by-position data that comes from analytics or from (mistakenly) released search engine/ISP logs. I have reviewed a number of such studies and their predictions varied: the first 3 positions get somewhere in the neighbourhood of 60-80% of clicks, while the rest is divided over positions #4-#10. You can already see the high degree of variability in these studies – 60-80% is a pretty wide range.
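To make it concrete, the back-of-the-envelope projection these percentages invite looks roughly like the sketch below. The per-position CTR values and the little helper function are made up for illustration: hypothetical placeholders picked to sit inside the 60-80%-for-the-top-3 range mentioned above, not figures from any particular study.

    # Naive projection: search volume times an assumed per-position CTR.
    # The CTR figures below are hypothetical placeholders, NOT from any single study.
    ASSUMED_CTR_BY_POSITION = {
        1: 0.40, 2: 0.20, 3: 0.10,   # top 3 together: ~70% of clicks
        4: 0.07, 5: 0.06, 6: 0.05,
        7: 0.04, 8: 0.03, 9: 0.03, 10: 0.02,
    }

    def projected_monthly_visits(monthly_searches: int, position: int) -> int:
        """Project visits for a position from an assumed CTR curve."""
        return round(monthly_searches * ASSUMED_CTR_BY_POSITION[position])

    if __name__ == "__main__":
        volume = 110_000  # e.g. exact-match monthly searches for a keyword
        for pos in (1, 3, 5, 10):
            print(f"#{pos}: ~{projected_monthly_visits(volume, pos):,} visits/month")

Feed it a search volume and a target position and out comes a tidy visits-per-month number, which is exactly the kind of promise discussed next.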

All of that sounds well and good on paper, and the numbers can be turned into pretty charts. More importantly, we have numbers that we can translate into business goals, and all clients like to hear that: “if we reach #3, we will get X visitors, and if we put in more effort (AKA increase the SEO budget), we can get to the #1 position, which will mean X+Y visitors, conversions, whatnot”. However, when one starts looking at the numbers out in the wild, the picture that emerges is significantly different. The numbers do anything but fall into precalculated percentage brackets, and the distribution fluctuates so wildly that pretty soon you find yourself tossing your predictions into the bin; all you can say is that improving your rankings will get you more traffic by some unknown factor. This has not escaped the original researchers of CTR distribution: they are constantly trying to improve the accuracy of their findings by focusing the research on similar types of queries (informational, transactional, brand, etc.) or similar types of searchers, but there is a much greater number of parameters that influence the spread of clicks over the different positions. I will try to outline a number of such parameters, as I encountered them with some of our own or our clients’ websites:

Universal search (local, shopping, images, video, etc.)

The introduction of different types of results into SERPs was Google’s attempt to cover as many user intentions as possible. This is particularly true of short search queries: when someone searches for [flowers], it is hard to tell whether they want pictures of flowers, instructions on how to care for potted flowers, places to buy flowers or time-lapse videos of blooming flowers. Therefore, Google tries to include as many types of results as possible, hoping to provide something for everyone. iProspect’s 2008 study shows that 36%, 31% and 17% of users click on Image, News and Video results respectively, which means these results will seriously skew the CTR distribution across the SERP. This is especially pronounced for queries with local intent, where Google Places results are featured prominently in SERPs. Compare the following three SERPs:

[Screenshots: three example SERPs]

There is no chance that the #3 position gets the same percentage of clicks on each of these SERPs, and the same holds true for every other position. Since the serving of universal search results depends on geo-specific (IP geolocation) and personalized (location, search history) parameters, there is no way to accurately predict how the clicks will be distributed over the different positions in SERPs.

Search intent

Different people mean different things when searching, and the shorter the query, the wider the scope of possible intentions. We can be pretty sure that someone searching for [online pizza order NY] is looking to order dinner tonight, but a person searching for [pizza] may be looking for a recipe, in which case a pizza joint with an online ordering website at #1 will not get the recipe-seeking clicks; those searchers will either perform a second search or browse deeper into the SERP and on to the 2nd and 3rd pages of results. If the majority of searchers are looking for a recipe, then the pizza vendor at #1 will most definitely not get 40-50% of the clicks. We have seen this happen with one of our clients who was ranked #1 for a single-word keyphrase. The Google AdWords Keyword Tool predicts 110,000 exact-match monthly searches in the US. During the month of May, while ranked #1 on Google.com for that keyword, the site received about 7,000 US organic visits. That is a 6.36% CTR at position #1. Google Webmaster Tools shows a 10% CTR at #1 for that keyword. Compare that to the 50.9% CTR that the 2008 eye-tracking study predicts for the #1 position, which would mean the site should be receiving around 56K monthly visits, and you get a feel for the gap between prediction and reality.
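Plugging the figures from this example into the same arithmetic makes the gap explicit; the small sketch below uses only the numbers quoted above:

    # Figures quoted in the example above.
    monthly_searches = 110_000       # AdWords exact-match estimate, US
    actual_visits = 7_000            # US organic visits in May at position #1
    study_ctr_at_1 = 0.509           # CTR the 2008 eye-tracking study assigns to #1

    actual_ctr = actual_visits / monthly_searches
    predicted_visits = monthly_searches * study_ctr_at_1

    print(f"Actual CTR at #1:  {actual_ctr:.2%}")          # ~6.36%
    print(f"Predicted visits:  {predicted_visits:,.0f}")    # ~55,990 (~56K)
    print(f"Prediction is {predicted_visits / actual_visits:.1f}x the observed traffic")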


The problem with guessing the intent behind single-word queries is that it cannot be done before reaching positions that generate significant traffic, and getting there can be an enormous task. Low CTRs at the first position also call into question the perceived market value of single-keyword exact-match domains – the amount of traffic expected from reaching top positions for the EMD keyword should take into account how difficult it is to predict searcher intent.

One of the possible benefits of ranking for single-word keywords is exposure in SERPs even if your result is not clicked. If your Title is unique and memorable, there is a higher chance that people will click through from the SERP of a second, refined query after having seen your site for the first one. For example, if you rank for [widgets] and your title says “The Cheapest Blue Widgets in the World!”, then visitors who perform a second search for [blue widgets] are more likely to click on your site, even if it sits in a lower position (provided your title is not pushed below the fold by Local, Video, Image or other Onebox results). Yahoo has a nice keyword research tool that shows the most common previous and next queries around your keyword of choice (I think it is only relevant for US searches).


Brands outranking you

According to one of the CTR studies, dropping from #3 to #4 should cost you about 30% of your traffic (if 3rd place gets 8.5% and 4th place gets 6.06% of total traffic), but is that always the case? Let’s see how a drop from #3 to #4 and then a return to #3 played out on one of the SERPs:

[Traffic chart: the drop from #3 to #4 and the later return to #3]

The SERP for this keyword has no local results or universal search (images, videos, news). Pay attention to the difference in numbers between the original #3 traffic and the later #3 traffic. The initial drop to #4 cost this site 76% of its original traffic (instead of the 30% the CTR studies predict), and the return to #3 only got it back to about 47% of the traffic it received when it originally sat at #3. So what happened? The site got outranked by a famous brand. Seeing the brand name in the title and URL of the #1 listing enticed searchers to click through that listing in much higher numbers than the 42% the study predicts. Again, it is not only where you are ranked, it is also who is ranked with you that influences your CTR percentages. My guess is that even when a big, well-known brand doesn’t outrank you but is located just below your listing, your CTR percentages will also be negatively affected. It seems Google’s preference for brands is grounded in actual user behaviour. It would be interesting to see whether there are any significant differences in bounce rates from brand listings too.
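Putting the study’s figures next to what was actually observed, using only the values quoted in this example:

    # CTR-study figures cited above: #3 gets 8.5% of clicks, #4 gets 6.06%.
    study_ctr = {3: 0.085, 4: 0.0606}
    predicted_drop = 1 - study_ctr[4] / study_ctr[3]
    print(f"Predicted traffic loss when dropping #3 -> #4: {predicted_drop:.0%}")  # ~29%

    # Observed on this SERP once a well-known brand took the top spot:
    observed_drop = 0.76       # 76% of the original #3 traffic was lost
    recovered_share = 0.47     # back at #3, only ~47% of the original traffic returned
    print(f"Observed loss: {observed_drop:.0%}, traffic after regaining #3: {recovered_share:.0%}")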

Optimized Listing

There are many ways to make your listing on the SERP more attractive to clicks. The linked article outlines the most important ones; however, I would like to add some points about three of those tactics:

  • Meta Description – In an eye-tracking study performed by a team including, among others, Google’s Laura Granka (published in the book Passive Eye Tracking, in the chapter “Eye Monitoring in Online Search”), it was found that the snippet captured the most attention from searchers (43% of eye fixations), followed by the Title and the URL (30% and 21% of fixations respectively); the remaining fixations were distributed over the other elements of the listing, such as the Cached Page link. This finding shows the importance of controlling the content and appearance of your snippet for your main targeted keywords, as an eye-catching snippet can tip the scales of CTR in your favour and increase the number of clicks you get relative to your site’s ranking.

  • Rich Snippets – the above figure of 43% of attention dedicated to the snippet probably becomes even higher when we talk about rich snippets. Google and the other search engines have been investing a lot of effort into encouraging webmasters to add a layer of classification to the data presented on their pages; if the information is labelled in an accepted way, it is displayed directly in the SERP snippet. Take, for example, a SERP where one of the listings shows review stars:
    The review stars shown in the snippet itself make that listing more attractive than its neighbours, and probably push its CTR far beyond the few percent that position #6 would get according to the CTR studies. A similar effect can be observed with other microformatted data, such as event dates, recipes, airplane tickets, etc. Rich snippets will become even more widespread if and when Google starts accepting websites that implement Schema.org tagging.

  • Optimized Titles – adding a call to action to the Title, or making it stand out in some other way, can increase a listing’s CTR beyond what its ranking alone would usually earn. This strategy, however, has become less influential since Google started deciding what the best Title for your page should be, based on, among other things, the anchor text of incoming links. As these changes are applied at Google’s discretion, it becomes very hard to predict how many people will click through to your site.

These are only some of the reasons that CTR can differ greatly from the values predicted by eye-tracking or other studies. It is important to remember that as search engines index and incorporate more and more types of results into SERPs, it will only get harder to predict the CTR per position. Even averaging the figures over a large number of results gives a number that is not at all useful for predicting the traffic volume each position will receive.


Posted in General, Google, SEO | 32 Comments »

SEO Spider Review – Xenu on SEO steroids

February 24, 2011 – 6:35 pm

One of the first tasks when getting into a brand new SEO project is to assess the state of on-site optimization. That is where we usually take the first measurement of “how bad things really are” or “how badly the previous SEO company screwed up”. One of the most basic tools I have always used for this task is Xenu Link Sleuth. It is great for spotting possible crawling problems, getting server headers, checking alt tags on images, etc.

However, the problem with Xenu is that it is not an SEO tool. As on-site SEO changed, so did the requirements for a crawling tool. All of a sudden, I wanted to see which pages have a canonical tag, which pages have duplicate meta issues and whether any pages have robots metatag problems, and Xenu just doesn’t do this.

Enter Screaming Frog SEO Spider. If Xenu had a brother that did SEO, that’s what it would look like. It is a great tool, and one of the hallmarks of a great tool is the constant discovery of new ways it can be useful. It has saved me numerous hours, mostly at the initial site audit stage.

I will not go into the nuts and bolts of how the tool works. The interface is very intuitive, and if you have used Xenu in the past, you will feel at home with SEO Spider. Furthermore, there is a helpful video that explains how it works (British accent alert!). Instead of a comprehensive review, here are a few ways SEO Spider has helped me in my SEO tasks recently:

(DISCLAIMER: SEO Spider has a free version that will index up to 500 pages. The full version – an individual licence – costs £99 per year. I am paying for the full version and am not being reimbursed in any way, shape or form for writing this review.)

  • Investigating distribution of on-site links – When large websites want to promote a certain section of their site, one way to do it is to increase the number of links to that page/section from other pages on the site. If the promoted page sits deep within the navigational structure, this brings it closer to the root and increases the linkjuice flowing to it. An overview of internal category/product pages that have a larger than usual number of on-site links pointing to them can help us identify a competitor’s SEO campaign targets. Furthermore, if we cross-reference these promoted pages with the number of off-site links to internal pages, we can gain some nice insights into promoted products, targeted keywords, linking strategies and other aspects of their SEO campaigns.

     

  • Identification of images without ALTs – this is especially important for images that serve as links. In addition to filtering images without ALT text, there are other options, such as filtering images larger than 100kb or images with an overly long ALT attribute.

  • Searching link prospects for broken links you can replace – as part of the link building process, you can run the spider on a linking prospect’s site and get a list of all outgoing links that result in a 404. Go to the linking pages, find out what the purpose and context of each broken link was, and then either create a similar resource on one of your sites or tweak your existing content to fit the linking intention. Finally, contact the prospect, alerting them to the broken link and pointing out the similar helpful information on your site (a rough script that approximates this check is sketched after this list).

  • Identifying “bait and switch” links on your site – webmasters sometimes create useful content that you want to link out to. After a while, some of those websites may be taken over by other companies that want to leverage the existing links and authority and redirect them to their own resources. Thus, unwittingly, you may find your site linking to content you never intended to link to. In the mild case, your visitors will click through and not find the useful content you pointed them to; in worse cases, your site may be flagged as linking to adult or even illegal content. With SEO Spider you can get a list of all the external links that result in a redirect and check them out. In many cases these will be simple canonicalization changes the target webmaster took care of (adding a trailing slash, pointing to the www version of the site, reflecting a change in directory structure, etc.).

  • Discover noindex/nofollow metas – a great way to get an overview and identify screw-ups left behind by your developers. Leaving a noindex meta tag in place sounds like a beginner’s mistake, but you would be surprised how many experienced SEOs stumble here, either through their own oversight or because of forgetful developers.

  • List all the canonicals – canonical tags are a great way to get rid of duplicate content issues on a site (and across sites), but they can be used for other, more sinister purposes too (that’s for another post). Conveniently, SEO Spider will spit out all the canonicals across the site.

  • Search inside the HTML source of pages – this is one of the recent additions to SEO Spider. It can be very useful if you want to find all the pages linking to a target page with specific anchor text, or to look for tracking code within pages. Up to 5 custom searches can be defined in the Configuration section.
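As promised in the broken-link item above, here is a minimal Python sketch of the same check for a single page of a prospect’s site. It assumes the requests and beautifulsoup4 packages are installed, the example URL and the function name are purely illustrative, and it is only a rough approximation of the list SEO Spider gives you:

    # Minimal broken-outgoing-link check for one page of a link prospect's site.
    # Assumes `pip install requests beautifulsoup4`; a rough approximation of what
    # SEO Spider reports, not a replacement for a full crawl.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    def broken_external_links(page_url: str, timeout: int = 10) -> list[tuple[str, int]]:
        """Return (url, status_code) for external links on page_url that answer 4xx/5xx."""
        html = requests.get(page_url, timeout=timeout).text
        soup = BeautifulSoup(html, "html.parser")
        page_host = urlparse(page_url).netloc
        broken = []
        for a in soup.find_all("a", href=True):
            target = urljoin(page_url, a["href"])
            if urlparse(target).scheme not in ("http", "https"):
                continue                  # skip mailto:, javascript:, etc.
            if urlparse(target).netloc == page_host:
                continue                  # only interested in outgoing links
            try:
                status = requests.head(target, allow_redirects=True, timeout=timeout).status_code
            except requests.RequestException:
                status = 0                # unreachable counts as broken too
            if status >= 400 or status == 0:
                broken.append((target, status))
        return broken

    if __name__ == "__main__":
        for url, code in broken_external_links("http://example.com/resources.html"):
            print(code, url)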

 

These are just a few uses I have found in the last few weeks. I am sure this list will keep growing, so check back at a later stage; I may add more great ways to use SEO Spider.

 

Posted in General, SEO | 24 Comments »

SEO is Dead

January 7, 2010 – 4:41 pm

In the past few years, we have seen a flurry of articles celebrating the forthcoming death of SEO (AKA linkbait) and an even larger flurry of angry rebuttal articles from the SEO community (AKA retweetbait).

So, not wanting to be left out of that link party, I decided to weigh in with our clients’ opinion on the matter:

[Two images]

Have a nice day.

Posted in General, Google, Linking, SEO | 31 Comments »

Reading List #4

December 1, 2009 – 10:53 pm

In between having an awesome time at Pubcon and a not-so-awesome time catching pneumonia, quite a few articles have gathered in my Reading List bookmarks folder. So here are the articles that caught my eye over the last few weeks.

If you think I missed something groundbreakingly important or that your article has covered one of the discussed topics in a better way, please let me know in the comments and I will make sure to review it and post it in the next Reading List.

Enjoy

www.seomoz.org/blog/web-analytics-and-segmentation-for-better-conversion-optimization

I like articles that explain different tools and techniques through concrete examples. This one explains 4 different metrics that can be analyzed through creating segments in Google Analytics

www.seo.com/blog/the-value-of-fresh-content/

A little bit of circumstantial evidence supporting the claim that Google will boost your rankings if you feed it fresh content. This is probably true under very specific conditions, so I wouldn’t treat it as an article outlining a controlled experiment, but rather as an interesting direction to investigate.

www.johnon.com/716/google-breadcrumbs-seo.html

A commentary on Google’s move of replacing the URLs in SERPs with breadcrumbs. Interesting insights.

www.seroundtable.com/archives/021217.html

Post on Bing link building guidelines, with links to the original Bing post and WMW discussion thread on the topic.

www.searchenginepeople.com/blog/discover-what-google-really-thinks-of-your-pages.html

A great post on measuring and analyzing how frequently Google visits your site. With visible PageRank being either outdated or an unimportant parameter, crawl frequency seems a likely contender for the title of the parameter that defines the importance of your site. This article describes a way to measure and compare that frequency between different pages of your site.

www.blogstorm.co.uk/5-seo-focussed-web-analytics-tools/

A list of 5 not-so-well-known analytics tools that (seem to) have been created with SEO in mind.

www.searchengineguide.com/stoney-degeyter/your-seo-kungfu-is-strong.php

An outline of a site-wide Title tag tweaking process for achieving maximal rankings and increasing the overall CTR for those rankings. I don’t get the kung-fu metaphors; for some reason SEOs seem to be infatuated with them.

www.jonathanstewart.co.uk/a-comprehensive-guide-to-the-vince-update/

OK, this is probably going to be my last mention of the Vince (or Brand) update in these lists. Here is a comprehensive guide to everything meaningful that has been published on the subject.

esseoo.com/forum/on-page/why-404-pages-should-return-404-codes-for-successful-on-page-seo/#p11

A nice case study explaining why you shouldn’t automatically 301 all your 404 pages back to your homepage. I do not necessarily agree with the way they solved the problem – they could have done it without losing the links to the non-existent pages – but that is another issue.

seogadget.co.uk/google-keyword-tool-external-vs-beta/

A nice comparison of the old and the new beta Google AdWords keyword tools, and the differences in the numbers they output for keywords with different search volumes.

searchengineland.com/6-ways-local-domains-crush-dot-coms-in-international-seo-29898

A piece describing some reasons why country code TLDs have advantage over .com, .net, .org, etc. in local results

www.hongkiat.com/blog/breadcrumb-navigation-examined-best-practices-examples/

Examination of breadcrumbs from the designer and user experience point of view

www.searchenginejournal.com/breadcrumbs/15022/

Examination of breadcrumbs from the SEO point of view. A very nice overview of the pros and cons of different ways to use breadcrumbs on your site. One point the article does not mention is that breadcrumbs duplicate links back to the homepage or to inner pages. Remember those “second link doesn’t count” issues? Make sure that the link you want to be counted comes before the breadcrumbs in the code.

www.seomoz.org/blog/30-seo-problems-the-tools-to-solve-them-part-1-of-2

Good overview of all the SEOMoz tools. I like the way that the tools are presented – through a specific SEO problem that each of them solves. Still waiting for part 2 of the post.
