
Google Webmaster Tool Addition A Flop?


There’s a lively discussion going on about the recent addition to the Top Search Queries section of Google Webmaster Tools that provides impression, clickthrough, and ranking data for organic listings. The idea of getting data on these aspects from Google, presented in a way that shows keyword associations, is a sweet dream for SEOs. However, the data has to be accurate to be of any use. The tool has only been officially available for a few days and there’s already a ton of controversy over it and questions about its level of accuracy. And we’re not just talking about the data being off by a few percentage points — examples range from pretty much spot on to 70% off or higher.

To read some great comments about its perceived inaccuracies, as well as what might be behind the problems, head over to Tom Critchlow’s Distilled blog. Tom titles his post “New Google Webmaster Tools Keyphrase Data Is 70% Useless.” Where does he get the 70% figure from? Nowhere. His point is that if data is inaccurate, it’s useless, and you might just as well make numbers up out of nowhere. But don’t let the sarcasm of the 70% lead to doubt about the value of Tom’s post. He has analyzed data for the Distilled website, for example, and isn’t happy with the results.

The Impressions and Clickthroughs columns come from the Google Webmaster Tool, while the Visits column comes from his Google Analytics account for the same timeframe and the keywords indicated. 880 impressions and 58 clickthroughs for Distilled, but 122 visits? Little correlation there, that’s for sure.
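To make the mismatch concrete, here is a minimal sketch of the arithmetic, using the figures quoted above. The helper name and the way the gap is expressed are illustrative, not any official tolerance from Google:

```python
# Figures Tom reports for the Distilled site: 880 impressions and
# 58 clickthroughs in Webmaster Tools vs. 122 visits in Analytics
# over the same period.

def relative_gap(wmt_clicks: int, analytics_visits: int) -> float:
    """Fraction of Analytics visits unaccounted for by WMT clicks."""
    return abs(analytics_visits - wmt_clicks) / analytics_visits

impressions, wmt_clicks, visits = 880, 58, 122

ctr = wmt_clicks / impressions        # CTR implied by the WMT numbers
gap = relative_gap(wmt_clicks, visits)

print(f"CTR per WMT: {ctr:.1%}")        # ~6.6%
print(f"Click/visit gap: {gap:.1%}")    # ~52.5% -- WMT shows barely half the visits
```

In other words, Webmaster Tools is reporting fewer than half the clicks that Analytics records as visits for the same keywords — far beyond any rounding or sampling noise.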

Now, it may be that things just need time to settle out. After all, when something new debuts, it does need time to be tweaked and optimized to ensure it’s tracking the right parameters and is consistent with the data you already have about related activity. But these discrepancies are huge. Worse still, a large number of other people have posted examples of their own in the comments, and most are out of line with data from similar measurements by other tools, including Google’s own other sources.

For the most part, when Google comes out with a new tool or feature, we trust that the due diligence has been done and it is working accurately and as it should. People start using the tool or feature and begin making marketing decisions based on what it shows. If indeed there is some problem and all the data right now is suspect, Google needs to let everyone know before individuals spend time and money adjusting campaigns based on what might be faulty data. This has the potential to cause everyone to start questioning future announcements from Google and the quality of its new releases.

Why is this such a big deal? Precisely because the potential use of this new data is huge. The ability to look at how your keywords are performing on the organic side relative to each other and — perhaps most important of all — compared to your PPC program keywords makes it possible to judge the effect of changes in your PPC program on organic rankings, potentially resulting in savings in advertising costs and increased sales. The effect of changes made to title tags and snippets can be observed directly, and not just for position in the SERPs, but for effectiveness insofar as marketing is concerned — e.g., does changing the title tag bring more clickthroughs on the SERP?

It will take a while before a solid judgment on the effectiveness of the new feature can be made. It’s disappointing that the initial look seems to be so inaccurate, and it means that Google will have to prove it has fixed whatever caused the disconnect in comparative stats. Until that happens, it’s not wise to use the data to judge the effectiveness or potential of keywords when so many practitioners are questioning its accuracy. Google needs to explain what the problem is and how it plans to fix it, or prove that the discrepancies are themselves not valid. Google should keep in mind that it may be lonely at the top, but it’s still the top, and they still want to stay there. And once lost, trust is hard to regain.

About the Author

Frances Krug has worked in market research since graduating from UCLA with an MA and CPhil in Latin American history. An editor and online content provider for the last 7 years, she is currently Associate Editor at iNET Interactive, where she also directs Search Marketing Standard's email marketing program.


3 Comments

  1. Frances — great post. I was one who initially loved it... I mean, who wasn't stoked to see impression and clickthrough data in the top keyword searches report?! But even though my excitement has dwindled, and I won't be using the data for anything but testing, I still applaud Google for at least trying. Now if I find out it's only to distort the actual data, I'll be pretty pissed. But I'll give them the benefit of the doubt and hope it'll improve. I mean, look at the AdWords keyword tool — it's certainly improved since the beginning, but it still isn't really accurate.

  2. Yes, that's true — much of the information in Webmaster Tools is imperfect; the keyword rankings it shows, for instance, are totally wrong. Normally I use Webmaster Tools for checking backlinks and finding crawl errors. Beyond that, its information is wrong.

  3. noname

    Hi. My background is offline marketing, but I have recently tinkered with websites. I'm afraid both Analytics and Webmaster Tools are ridiculously inaccurate and absolutely useless. Then Google says it's against their guidelines to use third-party software to check the very data you were trying to check with Google. What a joke. Plus, I've seen lots of sites ranking at the top of page 1 that appear to break just about every one of Google's guidelines.