Apparent Bot Queries and Impressions in Webmaster Tools
-
I've been noticing some strange stats in Google Webmaster Tools for my forum: spam queries that rack up impressions but no clicks.
See the queries in the attached images. There must be some motive for the spammers or scrapers behind this.
I set the date range to just 22 Aug - 22 Nov and it's very obvious the spike is driven by impressions.
Questions:
What should/can I do?
Is Google doing something about this?
How can I avoid this?
-
Ah, my mistake. I'll ask around on our content team and see if they have anything.
-
That's all well and good for Analytics data, but the OP is referring to Webmaster Tools spam.
Are there any blog posts or solutions on Webmaster Tools / Search Console spam in Search Analytics?
-
This is definitely a frustrating issue. We ran a couple of blog posts this year that may help:
https://mza.bundledseo.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data
https://mza.bundledseo.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
-
This is becoming disgustingly common.
The only workaround I can suggest is to filter queries not containing "privacy: your email address will not be shared" (or whatever the spam phrase is) to temporarily remove those search queries from your Search Analytics report. But it's a nuisance to have to do it every time...
As for preventing it in the first place, I'm not sure what you can do. The spam isn't targeting your website; it's targeting a search query, and I don't know how you could really prevent being indexed for a particular query. What's your average position for these weird queries, anyway?
Really interested to see if anyone can provide an actual solution to this.
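As a stopgap, the same exclusion can be applied to a CSV export of the Search Analytics queries instead of re-creating the filter in the UI each time. A minimal sketch, assuming a hypothetical export with a `Query` column and the spam marker phrase mentioned above:

```python
import csv
import io

# Hypothetical marker phrase(s) seen in the spam queries; extend as new ones appear.
SPAM_MARKERS = ["privacy: your email address will not be shared"]

def clean_queries(csv_text):
    """Drop rows whose query contains any known spam marker (case-insensitive)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if not any(m in row["Query"].lower() for m in SPAM_MARKERS)]

sample = (
    "Query,Impressions,Clicks\n"
    "buy widgets,120,8\n"
    "widgets privacy: your email address will not be shared,5000,0\n"
)
print([r["Query"] for r in clean_queries(sample)])  # → ['buy widgets']
```

This doesn't stop the spam from appearing, of course; it only keeps your reporting readable while it does.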
Related Questions
-
How to stop google bot from crawling spammy injected pages by hacker?
Hello, please help me. One of our websites is under attack by a hacker once again. They have injected spammy URLs which Google is indexing, but we cannot find these pages on our website; they all return 404. Our website is not secured (no HTTPS) and uses the WordPress CMS. Thanks
White Hat / Black Hat SEO | ShahzadAhmed
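One common cleanup step for a case like this: once the injected URL pattern is known, returning 410 Gone signals that the pages are permanently removed, which Google generally drops from the index faster than a plain 404. A sketch for Apache/.htaccess, assuming a hypothetical `/cheap-meds/` prefix shared by the injected paths:

```apache
# Hypothetical pattern -- replace with the actual injected URL prefix.
<IfModule mod_rewrite.c>
  RewriteEngine On
  # [G] sends "410 Gone"; [L] stops processing further rules.
  RewriteRule ^cheap-meds/ - [G,L]
</IfModule>
```

This only addresses the symptom; the injection vector (outdated WordPress plugins, weak credentials, no HTTPS) still needs to be closed.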
How authentic is a dynamic footer from bots' perspective?
I have a very meta-level question. I've been working on a dynamic footer for the website http://www.askme.com/; you can check it in the footer. If you refresh the page and check the content, you'll see a different combination of links in every section. I'm calling it a dynamic footer because the values are fully dynamic in this case.

**Why are we doing this?** For every section in the footer we have X links, but we can show only 25 links per section, and X can be greater than 25 (say X = 50). So I randomise the list of entries for a section and pick 25 elements from it, i.e. a random 25 elements from the list every time the page is refreshed.

**Benefits from an SEO perspective?** This exposes all the URLs to bots (across multiple crawls) and adds a page-freshness element as well.

**What's the problem, if there is one?** I'm wondering how bots will treat this, as at any given time a bot might see us showing different content to bots than to users. Will a bot consider this cloaking (a black-hat technique)? Or will it not, since I'm refreshing the data every single time, even if it's the bot hitting me twice in a row to check what I'm doing?
White Hat / Black Hat SEO | _nitman
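The randomisation described in that question is essentially a one-line sampling operation. A minimal sketch (link names hypothetical), picking 25 of X = 50 candidate links per page render:

```python
import random

# Hypothetical candidate pool: X = 50 links for one footer section.
ALL_FOOTER_LINKS = [f"/category/page-{i}" for i in range(50)]
SLOTS = 25  # links shown per section per render

def footer_links():
    """Return a random 25-link subset; each page load exposes a different sample."""
    return random.sample(ALL_FOOTER_LINKS, SLOTS)

links = footer_links()
print(len(links), len(set(links)))  # → 25 25
```

`random.sample` draws without replacement, so no link repeats within one render, and over many crawls every URL in the pool eventually gets exposed. Whether that variation between crawls reads as cloaking is exactly the open question above; the code itself is trivially neutral.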
Why should I reach out to webmasters before disavowing links?
Almost all the blogs, and Google themselves, tell us to reach out to webmasters and request that the offending links be removed before using Google's Disavow tool. None of the blogs, nor Google, explain why you "must" do this; it's time-consuming, and many webmasters don't care and don't act. Why is this a "required" thing to do?
White Hat / Black Hat SEO | RealSelf
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, We are considering using separate servers for when a Bot vs. a Human lands on our site to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same to both the Bot & Human, but on different servers. And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings? Thanks
White Hat / Black Hat SEO | Desiree-CP
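For what it's worth, the routing itself is straightforward at the proxy layer. A hypothetical nginx sketch using `map` on the User-Agent header (upstream addresses are placeholders); the crucial point, as the question notes, is that both pools must serve identical content:

```nginx
# Hypothetical upstreams -- both pools must serve byte-identical content.
upstream humans { server 10.0.0.10; }
upstream bots   { server 10.0.0.20; }

map $http_user_agent $backend {
    default                 humans;
    "~*(googlebot|bingbot)" bots;   # case-insensitive regex match on UA string
}

server {
    listen 80;
    location / {
        proxy_pass http://$backend;
    }
}
```

Note that user-agent strings can be spoofed; verifying crawler IPs via reverse DNS is the more robust check if the split matters for capacity planning.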
Negative SEO and when to use the Disavow tool?
Hi guys, I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've had it about six months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche, so our backlink profile is very specific with few spammy links.

Over the last week, however, we have received a huge influx of backlinks which has almost doubled our total of linking domains. I've checked the links in Webmaster Tools and they are mainly directories or web-stat websites like these: siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com, stuffgate.com.

We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links point to that page, which looks very suspicious to me. Does this look like negative SEO to anyone?

I've read a lot about the disavow tool, and opinions seem split on when to use it, so I was wondering if anyone has advice on whether to use it or not. It's easy for me to identify these new links, yet some of them have decent DA, so will they do any harm anyway?

I've also checked the referring anchors on Ahrefs: over 50% of my anchor term cloud is now terms totally unrelated to my site, and this has happened over the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile.

So would it be wise to disavow these links as they come through, or wait to see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they will be ignored or whether I will be penalised. Any ideas? Thanks in advance, Richard
White Hat / Black Hat SEO | Rich_995
Disavow tool for blocking 4 to 5 sites for Article Republishing
I'm finding some very low-authority sites that recently picked up our articles (written over a year ago) from Ezine and other article sites and pasted them onto their own sites. The number of copied articles is not one or two, but more than 10-12 across these domains. This has also given us anchor-based URL backlinks from them (as part of each article). I've written to them asking for my author profile and articles to be removed, but there has been no response from the webmasters of these sites. Is disavow the right approach? The number of such sites is 4 or 5!
White Hat / Black Hat SEO | Modi
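If disavow is the route taken, the file format itself is simple plain text: one full URL or one `domain:` directive per line, with `#` for comments. A sketch with placeholder domains only (the `domain:` form covers every page on a linking site at once, which suits the 4-5-site case above):

```text
# Disavow file -- placeholder domains for illustration only
# Whole sites republishing the articles
domain:scraper-one.example
domain:scraper-two.example
# A single offending page rather than the whole site
http://blog.example.net/republished-article.html
```

The file is then uploaded through Google's Disavow Links tool; it does not remove the links, it only asks Google to ignore them when assessing the site.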
URL Structure - forward slashes, hyphen-separated, query parameters
I am having difficulty evaluating the pros and cons of various URL structures with respect to SEO benefits. I can have the following:

1. /for-sale-in-<city>-<someOtherText>-<uniqueId>.php - here a term like 'for sale in San Francisco' is directly part of the URL.
2. /for-sale/<city>/<someOtherText>/<uniqueId> - here 'for sale in San Francisco' is not so direct in the URL, I think. I've also 'heard' that forward-slash URLs are somehow considered as being 'lower down' in the directory structure.
3. /for-sale/<city>/<someOtherText>/?pid=<uniqueId> - where someOtherText contains keywords we are targeting.

Questions: 1. Is there a preference for one format over the other? 2. Does it even matter? 3. Does it make sense to put keywords in someOtherText just for SEO purposes? I do not per se need someOtherText for functionality.
White Hat / Black Hat SEO | proptiger
Webmaster Tools Showing Bad Links Removed Over 60 Days Ago
Hello, one of my clients received the notorious message from Google about unnatural links late last March. We've removed several hundred (if not thousands) of links and resubmitted for reconsideration several times, only to keep receiving responses stating that we still have unnatural links. Looking through "Links to your site" in Google Webmaster Tools, there are several hundred sites/pages listed from which we removed our link over 60 days ago. If you visit each listed site/page, it contains nothing, visible or hidden, regarding our website or address. I was wondering if this (outdated/inaccurate) list is the same one Google's employees use to analyse the current status of bad links, and if so, how long it will take to reflect up-to-date information. In other words, even though we've removed the bad links, how long do we need to wait before we can expect a clean resubmission for reconsideration? Any help or advice would be greatly appreciated.
White Hat / Black Hat SEO | Bromtec