What is the best way to eliminate ghost traffic from Google Analytics?
-
Hey Mozzers,
I just wanted to see how you all deal with eliminating ghost traffic sources from Google Analytics. I tried setting up a RegEx 'include' list before, but it seemed as though I was blocking legitimate traffic sources when I did (I'm probably missing something here).
Anyway, I'm interested to read how you all have dealt with this issue in the past. Thanks for reading!
-
Hi J.P.,
I have personally found this article and the solution it proposes to be the most helpful: http://help.analyticsedge.com/spam-filter/definitive-guide-to-removing-google-analytics-spam/. Hope it helps.
-
I would recommend taking a look at the following if you are struggling with ghost referral-type traffic. It's a great post that was passed around last year when spam referral traffic spiked for a lot of people. Hopefully this is what you are after.
https://mza.bundledseo.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
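For anyone who finds this thread later: the core of the "one filter" approach in that post is a Custom > Include filter on the Hostname field, rather than an ever-growing exclude list of spam referrers, because ghost hits are sent straight through the Measurement Protocol and report a fake or empty hostname. Below is a rough sketch of the idea, with hypothetical hostnames (www.example.com and checkout.example.com) and a few lines of Python to sanity-check the pattern before saving the filter in GA:

    import re

    # Hypothetical include pattern: list every hostname that legitimately
    # runs your tracking code (your domain, subdomains, checkout pages, etc.)
    VALID_HOSTNAME_PATTERN = r"^(www\.)?example\.com$|^checkout\.example\.com$"

    # Sample values as they might appear in the GA Hostname dimension
    hostnames = [
        "www.example.com",       # real traffic - should pass
        "example.com",           # real traffic - should pass
        "checkout.example.com",  # real traffic - should pass
        "",                      # ghost hit with no hostname - should be dropped
        "darodar.com",           # known spam hostname - should be dropped
    ]

    for host in hostnames:
        verdict = "keep" if re.search(VALID_HOSTNAME_PATTERN, host) else "filter out"
        print(f"{host or '(not set)'}: {verdict}")

As with any include filter, it keeps only what matches and silently drops everything else, which sounds like exactly what happened with your earlier RegEx include list. It's worth keeping an unfiltered view as a backup and double-checking that the pattern covers every hostname you legitimately use.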
Related Questions
-
Can scripts be placed outside the HTML tag? If not, how does Google treat the page?
Hi, We have recently received the "deceptive content" warning from Google about some of our website pages, and we couldn't find the exact reason behind it. However, we have placed some scripts outside the HTML tag on some pages (not the same pages that received the warning). We wonder whether this could have caused Google to flag our pages. Please help. Thanks
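For readers unsure what "outside the HTML tag" means in practice, here is a minimal, hypothetical illustration of the two placements being compared:

    <!DOCTYPE html>
    <html>
      <head>
        <title>Example page</title>
      </head>
      <body>
        <p>Page content...</p>
        <!-- conventional placement: inside <head>, or just before </body> -->
        <script src="/js/app.js"></script>
      </body>
    </html>
    <!-- a script placed here, after the closing </html> tag, is invalid markup;
         browsers (and Google's renderer) generally recover by treating it as if
         it were inside <body>, but it is still worth moving -->
    <script src="/js/analytics.js"></script>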
White Hat / Black Hat SEO | vtmoz0 -
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as Google requests, and I am not sure which option is better:

A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. In this case I am not sure what will happen if we link to these articles from other parts of the website; we will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex. Is this OK for Google and Google News?

B) Set a NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right?

So... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
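For context, the two mechanisms described in options A and B look like this (paths are hypothetical):

    # robots.txt (the blocking half of option A)
    User-agent: *
    Disallow: /syndicated/

    <!-- meta tag on each syndicated article (option B) -->
    <meta name="robots" content="noindex, follow">

The catch, as the question already hints, is that the two don't combine: if /syndicated/ is blocked in robots.txt, Googlebot never fetches those pages and therefore never sees the noindex tag on them. So if the goal is keeping the pages out of the index, the meta tag has to be used without the robots.txt block.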
White Hat / Black Hat SEO | Lukas_TheCurious1 -
Do Ghost Traffic/Spam Referrals factor into rankings, or do they just affect the CTR and Bounce Rate in Analytics?
So, by now I'm sure everyone who pays attention to their Analytics/GWT (or Search Console, now) has seen spam referral traffic and ghost traffic showing up (Ilovevitaly.com, simple-share-buttons.com, semalt.com, etc.). Here is my question(s)... Does this factor into rankings in any way? We all know that click-through rate and bounce rate (might) send signals to the algorithm that indicate a low-quality site, which could affect rankings. I guess what I'm asking is: is any of that data coming from Analytics?

Since ghost referral traffic never actually visits my site, how could it affect the CTR or Bounce Rate that the algorithm is seeing? I'm hoping that it only affects my Bounce Rate/CTR in Analytics, and that I can just filter that stuff out with filters in Analytics and it won't ever affect my rankings. But... since we don't know exactly where the algorithm pulls its data on CTR and bounce rate from, I'm worried that the large amount of spam/ghost traffic I see in Analytics could be harming my rankings.

Sorry, long-winded way of saying... Should I pay attention to this traffic? Should I care about it? Will it harm my site or my rankings at all? And finally... when is Google going to shut these open back doors in Analytics so that Vitaly and his ilk are shut down forever?
White Hat / Black Hat SEO | seequs2 -
Bounce Rate Manipulated with Direct Traffic Spikes - Thoughts?
Hi Mozzers, we're hoping to get some insight from the technical folks out there on what seems to be malicious targeting of a client's website. We recently discovered enormous spikes in direct traffic to the website, with 90% originating from the USA and the rest coming from dozens of other countries. Nearly 100% of visits are new sessions and each lasts only a few seconds, thereby driving the bounce rate over 95%! There are other possible identifiers worth noting, including:

Browser - 99% use Internet Explorer
Browser Version - 89% use IE 7.0
Flash Version - 80% use 14.0 r0
Operating System - 99% use Windows

See the attached "Screenshot - Traffic Spikes & Inflated Bounce Rate". Whether this is a negative SEO attack or something else, we're really hoping to get the community's input and (hopefully) possible solutions. Thanks!
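One way to act on a fingerprint like that, before building a GA segment or exclude filter, is to quantify how much of the traffic actually matches it, so you know real users aren't being cut out. A rough sketch in Python, working from a hypothetical CSV export of session data; the column names are assumptions and would need to match whatever your export actually contains:

    import csv

    # Fingerprint taken from the pattern described above
    FINGERPRINT = {
        "browser": "Internet Explorer",
        "browser_version": "7.0",
        "flash_version": "14.0 r0",
    }

    def looks_like_bot(row):
        """Flag brand-new sessions of only a few seconds that match the
        IE 7.0 / Flash 14.0 profile described in the question."""
        return (
            row["browser"] == FINGERPRINT["browser"]
            and row["browser_version"] == FINGERPRINT["browser_version"]
            and row["flash_version"] == FINGERPRINT["flash_version"]
            and row["user_type"] == "New Visitor"
            and int(row["session_duration_seconds"]) <= 5
        )

    with open("sessions_export.csv", newline="") as f:
        flagged = [row for row in csv.DictReader(f) if looks_like_bot(row)]

    print(f"{len(flagged)} sessions match the suspected bot fingerprint")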
White Hat / Black Hat SEO | ByteLaunch0 -
Google says 404s don't cause ranking drops, but what about a lot of them
Hello, According to Google here, 404s don't cause rankings to go down. Our rankings are going down, and we have about 50 or so 404s (though some may have been deindexed by now). We have about 300 main products and 9,000 pages in general on this ecommerce site. There's no link equity to be gained by 301 redirecting the 404s, since there's nothing linking to the pages that 404. A custom 404 page has been set up that links to the home page. Provided that no more 404s are created, can I just ignore them and look for the real reason our rankings are going down?
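One thing worth verifying before ignoring them is that those URLs really do return a 404 status code rather than a "soft 404" (a 200 response serving the not-found page), since the reassurance from Google only applies to genuine 404s. A small sketch in Python, assuming a hypothetical urls.txt containing the ~50 affected URLs:

    import requests

    # Hypothetical list of the affected URLs, one per line
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        # Anything other than 404 (or 410) here, e.g. a 200 serving the
        # custom not-found page, is a soft 404 and worth fixing
        print(resp.status_code, url)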
White Hat / Black Hat SEO | BobGW0 -
Closing down site and redirecting its traffic to another
OK, so we currently own two websites that are in the same industry. Site A is our main site, which hosts real estate listings and rentals in Canada and the US. Site B hosts rentals in Canada only. We are shutting down Site B to concentrate solely on Site A, and will be looking to redirect all traffic from Site B to Site A, i.e. a user lands on the Toronto Rentals page on Site B and we forward them to the Toronto Rentals page on Site A, and so on. Site A has all the same locations and property types as Site B.

On to the question: we are trying to figure out the best method of doing this that will appease both users and the Google machine. Here's what we've come up with (2 options). When a user hits Site B via Google/bookmark/whatever, do we:

1. Automatically/instantly (301) redirect them to the applicable page on Site A?
2. Present them with a splash page of sorts ("This page has been moved to Site A. Please click the following link [insert anchor-text-rich URL here] to visit the new page.")?

We're worried that option #1 might confuse some users, and we are not sure how crawlers might react to thousands of instant redirects like that. Option #2 would be most beneficial to the end user (we're thinking) as they're being notified, on the page, of what's going on. Crawlers would still be able to follow the URL presented within the splash write-up.

Thoughts? We've never done this before. It's basically like one site acquiring another site; however, in this case we already owned both sites. We just don't have time to take care of Site B any longer due to the massive growth of Site A. Thanks for any/all help. Marc
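For reference, option #1 is the standard pattern for this kind of consolidation, and when both sites share a URL structure the page-to-page mapping can often be expressed in a couple of rewrite rules rather than thousands of individual redirects. A rough sketch in Apache .htaccess terms, with hypothetical domain names and a hypothetical /rentals/{city} structure:

    # .htaccess on Site B (hypothetical domains and paths)
    RewriteEngine On

    # One-to-one mapping for pages that exist on both sites
    RewriteRule ^rentals/(.*)$ https://www.site-a.example/rentals/$1 [R=301,L]

    # Fallback for anything without a direct equivalent
    RewriteRule ^.*$ https://www.site-a.example/ [R=301,L]

One-to-one redirects are preferable wherever an equivalent page exists; a blanket fallback to the home page is only a last resort for leftover URLs.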
White Hat / Black Hat SEO | THB0 -
Need advice on best strategy for removing these bad links.
Here's the scenario... We recently took on a new client whose previous SEO company had partaken in some dodgy link building tactics; they appear to have done some blog comment spam, very poorly. The situation we are now in is this: we have a site with an internal page deemed more important than the homepage (the homepage has 60 linking root domains and the internal page 879).

It looks as though the previous SEO company submitted a disavow request; there's a message in Webmaster Tools from a few weeks back saying it had been received, but no further correspondence. I have doubts as to whether this disavow request was done correctly. Plus, I'm not sure that Google has issued the site a warning yet, as it is ranking position one for the keyword on the internal page.

Our client wants us to handle this in the correct manner, whether that be to simply ignore it and wait for Google to send a warning about the links, remove the offending internal page and leave a 404, or try to disavow the links that Google doesn't know about yet from 800+ websites. Suggestions for the best practice for dealing with this situation? Any advice is much appreciated. Thanks, Hayley.
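If the disavow route does turn out to be the right one, the file itself is just a plain-text list uploaded through Google's Disavow Links tool, which also makes it straightforward to review what may have been submitted previously. A minimal sketch of the format, with hypothetical domains:

    # Lines starting with "#" are comments
    # Disavow entire domains where the comment spam lives
    domain:spammy-blog-network.example
    domain:another-spam-blog.example

    # Or disavow individual URLs when only specific pages link to the site
    http://some-blog.example/post-with-comment-spam/

Most advice, including Google's, is to attempt link removal first and keep the disavow file for whatever can't be removed.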
White Hat / Black Hat SEO | Silkstream0 -
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site, Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both the 50 and the 425 started out much smaller and grew over time, but have been around those levels for the last year or so as I've focused my time elsewhere.)

On Friday, June 15 we made a pretty big change to the site: we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116

Obviously we did this as an SEO play. We figured that since our content was getting ~1,000 search entries a day for such a wide variety of queries, we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower every day, as if Google realized it was repurposed content from elsewhere on our site...)

Here's the problem: for the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop.

As search is ~80% of my traffic, I'm VERY eager to solve this problem... So:

1. Do you think the drop is related to upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one author (which would be the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content?

Really, it's useful content, and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one of her books among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉