Webmaster Tools Search Queries Data Drop
-
Hi
I'm seeing a significant drop in the search queries reported for a client in GWT, starting on the 7th of Feb. I have seen a few articles on Search Engine Roundtable and elsewhere saying that many people are reporting problems such as delays with GWT updating its data, for example these:
https://www.seroundtable.com/google-webmaster-tools-data-stalled-19854.html
https://www.seroundtable.com/google-webmaster-tools-analytics-data-19870.html
However, these seem to suggest the problem is simply a delay in the displayed data being updated, whereas in the case I'm looking at the data is up to date but shows a steadily increasing decline. When I look at the Analytics data, though, it is completely different. For example, GWT says that on the 21st of Feb there were 23 impressions with zero clicks, but Analytics says there were 6 clicks/sessions from organic search. I take it this means there is likely a problem with the GWT data and I shouldn't worry?
All Best
Dan
-
I agree. You will need to add the HTTPS version of your site to GWT as a separate property; Google sees the HTTP and HTTPS versions as two different URLs, so the HTTPS version needs its own profile.
-
Just looked into this further and the HTTP to HTTPS migration totally explains it IMO
-
Hi Monica
I'm noticing even fewer pages in the sitemap index in GWT now, so I'm thinking this could still be a genuine problem rather than a data issue with GWT.
I've been looking a bit more in depth at the ranking pages and noticed that the site went HTTPS in early Feb, and the impressions started to drop almost straight after. Some previously well-ranked HTTP pages suddenly dropped to the 50s-60s between the 4th and 11th of Feb, to be replaced in the high rankings soon after by their HTTPS equivalents. This time frame is exactly when the GWT impressions drop starts.

The HTTP pages are still ranking, though, so could the fact that pages are falling out of the index, combined with HTTP pages still ranking, signal a possible duplicate content issue causing the drop in impressions? Should I tell the dev to delete or redirect the old HTTP pages now that the HTTPS versions are live? If this is the cause, could it be a permanent problem, or should deleting/redirecting solve it? Just a thought!

Edit, additional info: I've just seen this article: https://www.seroundtable.com/google-webmaster-tools-http-https-19965.html This makes me think I need to set up a whole new Webmaster Tools profile for the HTTPS version, and it also explains the loss of impressions etc. for HTTP, so the loss is nothing to worry about since the impressions are now being achieved by the HTTPS version of the site?
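If redirecting does turn out to be the right fix, a minimal sketch of a site-wide HTTP-to-HTTPS 301 might look like the following. This assumes the site runs Apache with mod_rewrite and an .htaccess file, which is only a guess; the dev would need to adapt it to the actual server setup.

```
# Sketch only - assumes Apache with mod_rewrite enabled; adjust for the real server config.
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...send a permanent (301) redirect to the same path on the HTTPS version.
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Redirecting rather than deleting would let the old HTTP URLs pass their signals to the HTTPS equivalents instead of competing with them.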
-
OK, thanks Monica.
Yes, it does seem very strange, and I'm very worried about that sharp fall in search impressions, although I can't explain it since rankings haven't fallen and neither has traffic from organic search, yet impressions look like they are falling and consistently getting worse.
-
You don't need to add a secondary dimension. I just look at my traffic. For example, I had 1627 Google Organic visits last week. 60% were new sessions, or unique visits. That tells me that I should see about 975 clicks for the week in GWT.
I am not sure what is causing the discrepancy. In my opinion the issue isn't with what is in GWT. The update they rolled out didn't affect the accuracy of the data; it merely added the ability to compare data sets over a period of time.
-
Server error - I'm not sure; I've marked it as fixed to check whether it's an old error and will see if it comes back, but it had been reported since December and these GWT issues only started on the 9th of Feb.
Similarly, the sitemap warning was first observed in October, so I didn't think it would have anything to do with it. It's a 404 relating to a 'page-sitemap.xml', with the following message:
When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.
But when I click on the example URL it actually resolves to the correct page OK, so again I didn't think this would have anything to do with the recent GWT issues.
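One quick way to double-check what the sitemap URL actually returns to a crawler, rather than what a browser ends up displaying, is a small script along these lines. This is only a sketch: the domain is a placeholder and it assumes the Python requests library is available.

```python
# Sketch: check the status code the sitemap URL returns to a Googlebot-style
# user agent (www.example.com is a placeholder for the real domain).
import requests

resp = requests.get(
    "http://www.example.com/page-sitemap.xml",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    allow_redirects=False,  # a browser follows redirects silently; show what the crawler sees first
)
print(resp.status_code, resp.headers.get("Location", ""))
```

If that prints a 301/302 rather than a 200, the browser would still land on the correct page after following the redirect, which could explain why the URL looks fine when clicked even though GWT reports an HTTP status error.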
Panguin screenshot attached - it doesn't look like anything, since the GA data is showing traffic and a slight increase over the mentioned period/after the final yellow line, which I presume is the Google update you refer to?
Regarding "Acquisition - Source/Medium and check the amount of unique traffic (not the total sessions, just the new sessions) from Google Organic and compare that to what you see in GWT": here I've gone to Source/Medium > Google Organic, then clicked 'Other...' > 'Users' > 'User Type' (is this the best route to the data you mean?). In here, New Visitor and Returning Visitor are listed, with New Visitor showing 406 visits (the same period in GWT is 393, so very close).
-
What is your server response error? How about the sitemap warning? I would be interested in seeing if the data in Panguin Tool showed anything weird going on with the update in the middle of February. What Google has indexed vs crawled shouldn't matter to your impressions and clicks.
Usually, I have more data in GWT than in analytics. I would go to Acquisition - Source/Medium and check the amount of unique traffic (not the total sessions, just the new sessions) from Google Organic and compare that to what you see in GWT. They should be more closely related to each other. My numbers never match exactly.
-
To be honest, I've never really looked to see if they actually match. It's not so much about them not matching exactly, just that GA is reporting visits from organic search (specifically from Google) as normal/as it has done for the last couple of months, while GWT is reporting a sharp decline in impressions and almost zero click-throughs (I've attached a screenshot of the GWT dashboard).
Also, a weird anomaly in GWT I've just seen today: crawl stats show an average of 55 pages crawled per day (last data point the 22nd of Feb), whereas the sitemap index section is showing 65 pages submitted but only 16 indexed (which I'm sure is a drop, as most were previously indexed). And Index Status is showing a sudden drop in indexed pages from the 15th to the 22nd, going from 64 to 30 indexed pages.
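To see whether the gap between submitted and indexed pages is down to the URLs themselves, one option is to pull the sitemap and print the status code of every URL it lists. Again just a sketch with a placeholder domain, assuming the requests library:

```python
# Sketch: fetch a sitemap and print the first status code returned by each
# listed URL (www.example.com/sitemap.xml is a placeholder).
import re
import requests

sitemap_xml = requests.get("http://www.example.com/sitemap.xml").text
for url in re.findall(r"<loc>(.*?)</loc>", sitemap_xml):
    status = requests.get(url, allow_redirects=False).status_code
    print(status, url)
```

If many of the submitted URLs now return redirects or errors rather than a plain 200, that alone would account for a shrinking indexed count in this property.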
-
My opinion is that there should be more data in GWT than in GA because "not provided" is not accounted for in GWT. There is no data loss with the updates in GWT in Feb, only a longer lag time between reports.
Has your data always matched before?
-
Why not?
Surely if the GWT data is out of sync with everything else, then it seems most likely the problem is with the GWT data, doesn't it? (Especially given all the other recent articles on problems with GWT in Feb?)
-
I don't believe so.
-
I can't really, as it would show who my most important client is, and I can't risk the chance that an unethical SEO might see the comment, identify the client, and try to steal the business. Although I'm confident that would never happen from a Mozzer, I just can't risk it since it's my most important client.
In GA I'm looking at organic search traffic, then at Google specifically, and traffic is reporting fine right up to date; GWT shows no clicks over the same period and ever-reducing impressions (manual searches also contradict the GWT data).
So surely this is a GWT data issue?
All Best
Dan
-
Do you have a screenshot you can add, by chance? Are you looking at Source/Medium or are you looking at Channels in Analytics? If you are looking at overall organic traffic in Analytics you will see a discrepancy, because it will include information from Bing and Yahoo as well.
Remember also that GWT doesn't account for "not provided" searches and usually only counts unique clicks.
-
OK, thanks Monica,
but it's definitely looking like a discrepancy between GA and GWT, since GA is reporting much higher visits than GWT. If the site had been hit, then GA would be reporting a drop in search engine traffic too, which it's not.
Also, since there has been all this recent talk of GWT missing data, that seems to confirm it, although it seems strange that I'm still getting some data when the recent talk of data loss refers to halts in reported data on a certain day, which then seems to catch up (whereas I'm looking at up-to-date graphs, just with much reduced search queries/impressions and clicks).
The site is not e-commerce, and I have looked at Panguin (thanks for that great-looking tool, and from a very well-respected agency), but in this particular case it just verifies the GA data; I have a problem with the GWT data, not GA.
Many thanks,
Dan
-
GWT did update its reporting features and has been slow to release the change to everyone. There have also been some rumors surfacing about an update in the second week of February that really impacted e-commerce websites; it had a lot to do with responsive websites. Do you use the Panguin tool at all? I would start there to see if your drop in traffic has anything to do with some of the updates. If it is just misinformation in GWT, then you should see things leveling out over the next two weeks. The update isn't complete yet, but lots of people lost days of information.