Webmaster Tools vs. Google Trends data doesn't add up
-
I am investigating a 25% drop in organic traffic from Google to a client's site over the past two months. When I turned to the Webmaster Tools data for the site, there was a clear, gradual drop in both impressions and clicks over the course of those months. The drop occurred across many pages and a large number of queries; there wasn't a core group of keywords or pages that declined, it was sitewide. Yet the average rankings reported by WMT for the top 100 or so landing pages were not significantly different.
The site hosts information about medical conditions, so I wouldn't expect any seasonal variation in search volume, and this was confirmed by looking at Google Trends data for a number of the top keywords. I then looked at the data query by query for all the top keywords (all ranked in the top 10) and saw the same general pattern: impressions were down, rankings stayed in the top 10, and Google Trends showed flat or rising volumes.
So I am trying to make sense of that. If search volume did not decline and rankings held inside the top 10, how could the number of impressions drop significantly? Am I trusting the WMT data too much? But the reality is that the volume of Google organic traffic measured by Google Analytics did indeed drop the way Webmaster Tools shows it.
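To make the puzzle concrete, here's a minimal sketch with made-up numbers (not the actual site's data): because "avg position" is only computed over the searches where the page actually appeared, impressions can fall 25% while the reported average position doesn't move at all.

```python
# Hypothetical WMT-style data for one query: each entry is the position
# the page held in a search where it actually appeared. Searches where
# the page did not appear at all contribute nothing to "avg position".
month_1 = [5, 4, 6, 5, 5] * 20   # page appeared in 100 searches
month_2 = [5, 4, 6, 5, 5] * 15   # page appeared in only 75 searches

def avg_position(appearances):
    """Average position over the searches where the page showed up."""
    return sum(appearances) / len(appearances)

impressions_1, impressions_2 = len(month_1), len(month_2)
impression_drop = (impressions_1 - impressions_2) / impressions_1

print(avg_position(month_1))   # 5.0
print(avg_position(month_2))   # 5.0 -- average position unchanged
print(impression_drop)         # 0.25 -- impressions down 25%
```

One way this happens in practice is the page simply appearing for fewer query variations or fewer users (e.g. personalization or a competing result filling some slots), so the page earns fewer impressions without its position changing on the impressions it does earn.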
-
You're welcome! One last suggestion: if you have the budget, augment the Moz rankings with daily rank tracking. Weekly might be enough, but if you want to turn it up a notch, try Authority Labs. I find them accurate and robust for daily rank tracking.
-
We have monthly WMT data, and we're setting up a Moz campaign for the site so that we'll have weekly de-personalized data moving forward. Will also take a look at the SEMrush data. Thanks for the suggestions.
-
Also, you can check whether SEMrush has any historical ranking data; depending on the volumes, you might find something: http://www.semrush.com/
-
I wouldn't call it inaccurate; it just operates a bit differently. For example:
- It only counts when a page ranks - the "avg position" is calculated only when someone actually searched and the page appeared somewhere in the results; it does not calculate a hypothetical ranking.
- It's averaged over time periods - when you look at average ranking, the time period can muddy the numbers. A page whose actual ranking shifted over 2-3 months might show an "average" position of 5, while the actual positions over time were 5, 2, 7, 15, 1, 6, etc.
- It averages logged-in, logged-out, Search Plus Your World, etc. - rank checkers give you a constant number by trying to de-personalize. WMT, by contrast, averages personalized rankings, Google+ "Search Plus Your World" results, localized rankings, and so on, which muddies the number as well.
In short, tracked rankings give you a steady number by eliminating changing variables. I'm sure WMT is accurate, but there are a lot more moving parts, so it's susceptible to these effects and shouldn't be read as "rankings" so much as a record of where you happened to rank whenever a page actually showed up in someone's search.
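A quick illustration of the averaging point above, using the made-up positions from the second bullet: one period-averaged figure can look perfectly stable while the daily positions it summarizes swing wildly.

```python
# Daily positions for one keyword over a period (the made-up numbers
# from the example above). WMT reports one averaged figure for the
# period; a daily rank tracker would show you each individual swing.
daily_positions = [5, 2, 7, 15, 1, 6]

average = sum(daily_positions) / len(daily_positions)

print(average)               # 6.0 -- reads as a stable ~position 6
print(min(daily_positions))  # 1
print(max(daily_positions))  # 15 -- the actual range is far wider
```

That spread between 1 and 15 is exactly what a single averaged number hides, which is why a second, daily source of rankings data is worth having.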
-
For that time period we don't have a second source of rankings data. Is there reason to believe that the Webmaster Tools ranking data is not accurate?
-
I think your first step should be to verify rankings with another tool. Hopefully you've tracked rankings somewhere else? Do you see any changes there?