Will Google penalize a site that has many links pointing to it with UTM codes?
-
I want to track conversions using UTM parameters from guest blog posts on sites other than my own.
Will Google penalize my site for having a bunch of external articles pointing to one page, each with unique anchor text but with a UTM code appended?
-
I agree with Kristina 100%. She nailed it with the two solutions you ought to be looking into.
-
No, Google likes unique anchor text, since it seems more organic, and supports tracking codes as well. My only hesitation is that adding those tracking codes is going to identify your links as marketing efforts rather than completely organic links. But I think you should be okay.
The problem with tagging is that mysite.com/seo-text?utm_campaign=guest-blogs could technically be indexed by Google and become a duplicate of mysite.com/seo-text. Make sure that Google knows you only want it to index mysite.com/seo-text by:
- Adding a canonical tag on the page that points to the non-parameter version: `<link rel="canonical" href="https://mysite.com/seo-text">`
- Telling Google to disregard the utm_campaign parameter in Google Webmaster Tools (under URL Parameters).
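To make the duplicate-URL point concrete, here is a minimal Python sketch (a hypothetical helper, not part of any Moz or Google tooling) that maps a UTM-tagged URL back to the clean version — the same mapping the canonical tag declares to Google:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url: str) -> str:
    """Return the URL with all utm_* tracking parameters removed."""
    parts = urlparse(url)
    # Keep only query parameters that are not UTM tracking codes
    kept = [(k, v) for k, v in parse_qsl(parts.query) if not k.startswith("utm_")]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://mysite.com/seo-text?utm_campaign=guest-blogs"))
# -> https://mysite.com/seo-text
```

Server-side, the same normalization is sometimes used when emitting the canonical tag, so that every tagged variant declares the same clean URL.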
Best,
Kristina
Related Questions
-
If I use links in "div" tags instead of "a href" tags, can Google read links inside a div tag?
Hi all, I need a suggestion on this. For buttons, I am using links in "div" tags instead of "a href" tags. Can Google read links inside a "div" tag, and does it pass link juice? It would be great if you could provide a reference.
Intermediate & Advanced SEO | pujan.bikroy
-
How long will old pages stay in Google's cache index? We have a new site that is two months old, but we are still seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache index. How long should I tell the client it will take for them all to be removed from search?
Intermediate & Advanced SEO | Liamis
-
My WordPress site is generating bad links
Hello everyone, I have a WordPress site which for the last 20 days has been generating links like, for example, http://www.domainname.com/game/965/wiki/キャラクター図鑑_レアリティ(★★★)_【ID:675】ワッツ・ステップニー and http://www.domainname.com/nkpghfu_13356_gvgjq_tfjhnkt_jsj_296_82566_673_567_245. Here is a screenshot from Webmaster Tools: http://prnt.sc/ccwh0e. Could an expert please check and tell me how I am getting these links, and what steps I need to take to remove these errors? It is harming my site's crawl flow as well as its rankings. Thanks in advance.
Intermediate & Advanced SEO | innovativekrishna1
-
Google AdWords destination link issue
In Google AdWords I set the destination link like this: http://www.abcd.com/ and like this: http/:www.abcd.com. How does Google AdWords react to each of these, and how will it show the data in AdWords? Thanks! Dev
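As a side note, the difference between those two destination links can be checked programmatically. Here is a small Python sketch (hypothetical, unrelated to the AdWords API itself) showing why the second form is malformed — the scheme never parses, so there is no host to resolve:

```python
from urllib.parse import urlparse

def is_valid_http_url(url: str) -> bool:
    """A destination URL needs a proper http(s) scheme and a host."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)

print(is_valid_http_url("http://www.abcd.com/"))  # True
print(is_valid_http_url("http/:www.abcd.com"))    # False: "http/" is not a valid scheme, so the whole string parses as a bare path
```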
Intermediate & Advanced SEO | devdan
-
New site: now links disappearing in Open Site Explorer and GWT
We launched a new site at the beginning of December 2012 and carefully 301'd all URLs from the old site to the new (custom CMS on the old site, WordPress on the new). Our rankings have slipped quite badly, but the most worrying thing is that we used to have about 1,200 backlinks according to GWT/OSE before the new site launched, and now we're down to about 30. Can anyone help shed some light on this please? The site is www.littleoneslondon.co.uk. A few things that might help:
1. We were getting a lot of links through our job feeds (it's a nanny recruitment site) on indeed and trovitt; for some reason no new ones from these have appeared in Open Site Explorer, and all the old jobs are gone completely.
2. We had thousands of not-found errors in Google Webmaster Tools, and once these were redirected and marked as fixed, this is when the links disappeared.
3. We are getting quite a few 504 errors on the site due to an old proxy redirect (/blog was hosted on a different server on the old site and has not been removed yet). This will be fixed tomorrow, but could it be a factor?
4. The developer seems to have redirected all the links through WordPress directly somehow (I don't see any redirect plugins, but there are lots of pages called 'redirect'). There are no references in the .htaccess file to any redirects other than from the /blog folder that the WordPress instance sits in.
Sorry for the long post; I hope I've given all the details you'd need, and I really appreciate any help anyone can give. Thanks, Karl
Intermediate & Advanced SEO | Bdig
-
Please help with this penalized site!
OK, this is slowly frying my brain and I would like some clarification from someone in the know. We have posted multiple reconsideration requests since the regular "site violates Google's quality guidelines" / "look for unnatural links" email back in March 2012. I came aboard the business in August 2012 to overcome bad SEO companies' work. So far I have filed several disavow requests by domain and cleared over 90% of our backlink profile, which was all directory links, forum spam links, etc. identified via WMT, OSE and Ahrefs, and submitted this to the disavow tool, as well as sending a Google Docs shared file in our reconsideration request listing all the links we have been able to remove. Since most were built in 2009/2010, a lot were impossible to remove, but we managed to shift about 12-15% of our backlink profile by working very hard at it. The only links left were quality links, forum posts created by genuine users, and relevant non-spam links.
As well as this, we now have a high-quality link profile, which has also counteracted a lot of the bad "SEO" work done by those previous companies. I have explained this fully in our reconsideration request, along with a massive apology on behalf of the work those companies did, and we are STILL getting generic "site violates" messages. So far we have spent in excess of 150 hours trying to get this penalty removed, and Google hasn't even batted an eyelid. We have worked SO hard to combat this issue that it almost feels personal; if Google read the reconsideration request, they would see how much work we have done to resolve it.
If anyone can give any updates or help on anything we have missed, I would appreciate it; I feel like we have covered every base! Chris www.palicomp.co.uk
Intermediate & Advanced SEO | palicomp
-
Getting a site to rank in both google.com and google.co.uk
I have a client who runs a yacht delivery company. He gets business from the US and the UK, but due to the nature of his business he isn't really based anywhere except in the middle of the ocean somewhere! His site is hosted in the US, and it's a .com. I haven't set any geographic targeting in Webmaster Tools either. We're starting to get some rankings in Google US, but very little in Google UK. It's a small site anyway, and he'd prefer not to have too much content on the site saying he's UK-based, as he's not really based anywhere. Any ideas on how best to approach this?
Intermediate & Advanced SEO | PerchDigital
-
Working out exactly how Google is crawling my site if I have loooots of pages
I am trying to work out exactly how Google is crawling my site, including entry points and its path from there. The site has millions of pages and hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot was on. Unfortunately there are hundreds of thousands of entries even for one day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply? Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page --> category page --> subcategory page (expecting the same timestamp), but this does not appear to be the case. Does the bot follow a path right through to the deepest level it can (or is allowed to) for that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated. Cheers
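One simple approach to the log-sifting question above: a short Python sketch (assuming a hypothetical two-column log of timestamp and URL, one Googlebot hit per line) that sorts hits chronologically and tags each URL with a rough depth, which makes it easy to see whether the bot crawls strictly level by level:

```python
from datetime import datetime

# Hypothetical log format: ISO timestamp, a space, then the URL path hit by Googlebot
log_lines = [
    "2013-02-01T10:00:05 /",
    "2013-02-01T10:00:09 /category/shoes",
    "2013-02-01T10:00:07 /category/shoes/item-42",
]

def crawl_path(lines):
    """Sort hits chronologically and annotate each URL with a rough depth."""
    hits = []
    for line in lines:
        stamp, url = line.split(" ", 1)
        depth = url.rstrip("/").count("/")  # rough depth: number of path segments
        hits.append((datetime.fromisoformat(stamp), url, depth))
    return sorted(hits)  # tuples sort by timestamp first

for when, url, depth in crawl_path(log_lines):
    print(when.isoformat(), f"depth={depth}", url)
```

In this toy data the deep item page is hit before its parent category page, which matches the observation in the question: Googlebot does not necessarily finish one level before descending to the next.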
Intermediate & Advanced SEO | soeren.hofmayer