A campaign ghost keeps returning to my Google Analytics - Help!
-
A couple of campaign tracking links were created on my homepage (leading to internal pages); these were removed a few weeks ago (completely removed from the site).
I understand there is a six-month window, and as long as a user returns (no matter from which source) they will be counted as a session against that campaign.
Since these campaign links were set up in error, I hoped creating a fresh new view within Google Analytics would stop them from appearing.
However they are still showing as sessions even in the new view (created after removing the campaign links in question).
Is there any way to stop this from happening? I want to be able to report on sessions correctly.
Thanks,
Sam
-
Thanks Kristina,
I set up the filter in the following way:
Filter type - Custom > Search and Replace
Filter field - Campaign name
Search string - 'nameofmycampaignimremoving'
Replace string - (left blank)
I figured that by matching the campaign name and replacing it with nothing, Google Analytics should now pick this traffic up as direct.
The campaign is no longer showing in my traffic, so I assume it's worked.
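For anyone sanity-checking the logic, here's a rough Python sketch of what that Search and Replace filter effectively does; the session records and campaign name are made-up placeholders, not real GA internals:

```python
def apply_search_and_replace(sessions, search, replace=""):
    """Mimic a GA Search and Replace filter on the Campaign field.

    A session whose campaign ends up empty no longer reports under the
    old campaign name (GA shows it as "(not set)" instead).
    """
    out = []
    for s in sessions:
        campaign = s["campaign"].replace(search, replace)
        out.append({**s, "campaign": campaign or "(not set)"})
    return out

# Hypothetical sessions, one still carrying the removed campaign name
sessions = [
    {"page": "/login", "campaign": "nameofmycampaignimremoving"},
    {"page": "/pricing", "campaign": "summer-sale"},
]
cleaned = apply_search_and_replace(sessions, "nameofmycampaignimremoving")
```

Note this only changes how the sessions are labeled in reports; it doesn't change the underlying attribution.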
Is there a better way of doing this?
Sam
-
Good luck!
-
Very interesting Kristina.
I think I've figured out what's going on.
Our product is a browser-based CRM hosted on a secure (HTTPS) server. I believe someone previously visited our site, clicked one of the old campaign links, and is now either returning directly to our /login page or clicking the "?" buttons within the CRM, which lead to the support articles on our website. Since the CRM sits on a secure server, those clicks arrive without a referrer and count as direct traffic, not referral.
So, as you mentioned, I will set up a filter to have these campaigns show up as direct.
Hallelujah!
Sam
-
Hi Sam,
First, to be clear, campaigns will be overwritten if visitors come from any other source; it's only direct traffic that the campaign parameter holds on to. Here's a direct quote from Google's article on campaigns and traffic sources:
Existing campaign or traffic source data for a particular user will be overwritten by new campaign or traffic source data, regardless of the configured timeout period.
If you look at the flowchart a bit below that quote, you'll see that Google starts by looking for new campaign data, then looks for traffic source data, then, if it doesn't find either of those, uses existing campaign data.
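That precedence rule is simple enough to sketch in a few lines of Python; this is just an illustration of the flowchart's logic, not GA's actual internals, and the field names are made up:

```python
def attribute_session(new_campaign, new_source, existing_campaign):
    """Rough sketch of GA's campaign-attribution precedence:
    1. new campaign data on this visit wins;
    2. otherwise a new, non-direct traffic source wins;
    3. otherwise (a direct visit) the stored campaign persists.
    """
    if new_campaign:
        return new_campaign
    if new_source and new_source != "(direct)":
        return new_source
    return existing_campaign or "(direct)"
```

So a direct return visit keeps reporting the old campaign, but any visit from search, referral, or a new tagged link overwrites it.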
That means your theory could still be correct, but only if all of the visits you're still seeing come from direct visits. You can check this theory with the % New Sessions column: if it's 0%, you're right, these are just returning visitors, and the best I can recommend is that you set up filters to make these show up as "direct." If it's not (and I suspect it's not, because I doubt this would be a large enough number for you to be concerned and reach out for help), you've still got some of those campaign URLs floating around in public.
Here's how I'd go looking for them:
- Use a third-party tool like Screaming Frog or DeepCrawl to triple-check that there are no internal links on your site with those old campaign parameters. CMSs can easily miss things like this, so using an outside tool that just tries to find everything helps.
- Search for the original URLs + parameters in Google to see if any affiliates or coupon sites are using those links.
- Check your old emails - did you ever send out these URLs? It's possible that people are still accessing old emails.
- Was this a campaign that could have been shared in any other way? I know that my company often shares shortened URLs, which redirect to URLs with parameters appended. Have you shared any bit.ly or other aliased URLs that are appending those parameters you've tried to get rid of?
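Alongside those tools, a small script can scan fetched pages for leftover links carrying the old campaign parameter. A hedged sketch using only the standard library; the HTML and campaign name are placeholders:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse, parse_qs

class CampaignLinkFinder(HTMLParser):
    """Collect hrefs whose query string names a given utm_campaign."""
    def __init__(self, campaign):
        super().__init__()
        self.campaign = campaign
        self.matches = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        params = parse_qs(urlparse(href).query)
        if self.campaign in params.get("utm_campaign", []):
            self.matches.append(href)

# Placeholder page source for illustration; in practice, feed it
# the HTML of each crawled page
html = ('<a href="/promo?utm_source=home&utm_campaign=oldcampaign">Promo</a>'
        '<a href="/about">About</a>')
finder = CampaignLinkFinder("oldcampaign")
finder.feed(html)
```

Run it over a crawl export (or fetched pages) and `finder.matches` lists every link still carrying the campaign.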
I hope this helps! Let me know if you still have any questions, or if anything stumps you along the way.
Best,
Kristina
-
I know it's too obvious, but what about just creating a segment, filtering out that campaign traffic?
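A segment hides the campaign from reports rather than fixing the data, but it works. If you pull numbers programmatically, the same exclusion can be expressed as a filter on a Core Reporting API (v3) query; a sketch, where the view ID and campaign name are placeholders:

```python
def build_report_query(view_id, excluded_campaign):
    """Build Core Reporting API (v3) query params that exclude one
    campaign. 'ga:campaign!=<name>' is the API's "not equal" filter,
    the programmatic equivalent of segmenting that campaign out.
    """
    return {
        "ids": "ga:" + view_id,
        "metrics": "ga:sessions",
        "dimensions": "ga:sourceMedium",
        "filters": "ga:campaign!=" + excluded_campaign,
    }

query = build_report_query("12345678", "nameofmycampaignimremoving")
```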
-
Thanks. The bounce rate is in the low 90s, so high but not 100%, and the exit pages also differ.
Interesting note on the tracking code.
Since any revisit by a visitor who originally clicked one of the campaign links counts as a session against the old campaign, I don't think it's as complicated as people visiting through bookmarks or browser history.
Is there really nothing I can do about these old campaigns coming back to haunt me?
-
Hi there.
A new view wouldn't help anyway, because it's tied to the same tracking code. My guess is that either users are getting to those pages through bookmarks or browser history, or those links were indexed somewhere and you're now getting hit by bots and crawlers.
Go to the Campaigns report and see what source/medium those sessions are coming from; also check how long those sessions last and what the bounce rate is. If it looks like it might be crawlers, look into ghost and referral spam filtering. Here is a link on how to implement it: https://mza.seotoolninja.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
Hope this helps