Google Webmaster Tools Parameters
-
We have several large ecommerce websites, and we've added some tracking parameters to GWT for Google to ignore. All pages are correctly canonicalized.
Google has been ignoring the parameter settings and the canonicals, and still ranks many parameterized pages for us. Has anyone run into this?
-
Building on Patrick's answer: using robots.txt was the fastest way for things to take effect, but if implemented wrong it can hurt you pretty badly.
In addition to adding the exclusions to robots.txt, I also put in a request to remove the pages with parameters from Google's index using the Google URL Removal tool. This, combined with the other options, helped clean up my results in the Google index.
Hope this helps.
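For anyone reading along, here is a sketch of what those robots.txt exclusions might look like, assuming a tracking parameter named clicksource (the name mentioned elsewhere in this thread; substitute your own). Note that the * wildcard is an extension honored by Googlebot and most major crawlers, not part of the original robots.txt standard:

```
User-agent: *
# Block any URL where the tracking parameter appears first or later in the query string
Disallow: /*?clicksource=
Disallow: /*&clicksource=
```

Test a ruleset like this against your important URLs before deploying it, since a pattern that is too broad can block pages you need indexed.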
-
Patrick:
I set up the "clicksource" parameter in Google Webmaster Tools in March, and we still have 2400 pages with parameters indexed.
-
Very often, parameter handling alone isn't enough to resolve this. As Patrick said, you can add sets of disallows in robots.txt and a noindex at the page level if you want pages de-indexed, but be careful how you do this: make sure Google can see and act on the noindex before you block access to the page, because a URL blocked in robots.txt will never be recrawled, so the noindex on it will never be seen.
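To make the ordering point concrete: the page-level signal described here is the robots meta tag, which only works while crawlers can still fetch the page. A minimal sketch:

```html
<!-- In the <head> of each parameterized page you want dropped from the index -->
<meta name="robots" content="noindex">
```

Only after the pages have been recrawled and have fallen out of the index is it safe to add a robots.txt disallow; blocking first means Googlebot can never fetch the page to see the noindex.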
-Andy
-
Hi there
When did you put these parameters in place? It can take Google some time to recrawl your site and pick up the settings you've made in GWT. So, depending on how recently you implemented these parameters, you may still see some of these URLs ranking.
Do you have these parameters in your sitemap or internal links where they are not needed? If so, make sure you correct them.
You can also look into robots.txt options with your web development team to block these parameters, although I would research this carefully so you don't accidentally block important URLs.
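One way to sanity-check a proposed robots.txt before it goes live is to run your important URLs through a parser. Here is a quick sketch using Python's standard library (the rules and URLs are placeholders; note also that urllib.robotparser does plain prefix matching and does not understand Googlebot's * and $ wildcard extensions, so wildcard parameter rules can't be validated this way):

```python
from urllib import robotparser

# Hypothetical rules -- paste in your real proposed robots.txt instead.
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Spot-check both the URLs you want blocked and the ones you must keep crawlable.
for url in (
    "https://example.com/checkout/cart",    # should be blocked
    "https://example.com/search?q=shoes",   # should be blocked (prefix match)
    "https://example.com/products/widget",  # must stay crawlable
):
    verdict = "blocked" if not rp.can_fetch("*", url) else "allowed"
    print(f"{verdict:7}  {url}")
```

Running a check like this against your full sitemap is a cheap way to catch an overly broad disallow before Google does.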
Hope this helps! Good luck!
Related Questions
-
Google Pagination Changes
With Google recently coming out and saying they're basically ignoring paginated pages, I'm considering the link structure of our new, soon-to-launch ecommerce site (moving from an old site to a new one with an identical URL structure, less a few 404s). Currently our new site shows 20 products per page, but with this change by Google it means that any products on pages 2, 3 and so on will suffer, because Google treats each one as an entirely separate page rather than an extension of the first. The way I see it, I have one option: show every product in each category on page 1. I have Lazy Load installed on our new website, so it will only load what a user can see and load more products as they scroll down, but how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page-load score because it doesn't know Lazy Load is in place? Or will it know and account for it? Is there anything I'm missing?
Intermediate & Advanced SEO | moon-boots
-
Would Google consider this the anchor text?
Hi guys, for a button-based link, can you define the anchor text Google will use? I have attached a screenshot of what I mean. Cheers. geavig
Intermediate & Advanced SEO | bridhard8
-
Why Is Google Webmaster Tools Pulling Zero Keyword Data?
I just linked a Google Webmaster Tools account to Google Analytics for a client, and Search Engine Optimization reports are showing up in Google Analytics as enabled, but there is zero keyword data, landing page data, etc., in the reports themselves. Has anyone encountered this?
Intermediate & Advanced SEO | yoursearchteam
-
Redirect to URL with parameter
I have a wiki (wiki 1) where many of the pages are well indexed in Google. Because of a product change I had to create a new wiki (wiki 2) for the new version of my product. Now that most of my customers are using the new version, I'd like to redirect users from wiki 1 to wiki 2. An example redirect would go from wiki1.website.com/how_to_build_kitchen to wiki2.website.com/how_to_build_kitchen. Because of a technical issue, the URL I redirect to needs to carry a parameter delimiter like "?", so the target becomes wiki2.website.com/how_to_build_kitchen? Will the search engines see it as two pages with the same content?
wiki2.website.com/how_to_build_kitchen
and
wiki2.website.com/how_to_build_kitchen?
And will the SEO juice from wiki1.website.com/how_to_build_kitchen be transferred to wiki2.website.com/how_to_build_kitchen?
Intermediate & Advanced SEO | Debitoor
-
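On the trailing-? question above: to a crawler those are two distinct URLs, one with an empty query string and one with none, so they can be treated as duplicates; a rel=canonical on the ?-version pointing at the clean URL is the usual fix. A small illustration with Python's standard library (the https:// scheme is an assumption added here):

```python
from urllib.parse import urlparse, urlunparse

def normalize(url: str) -> str:
    """Rebuild the URL; urlunparse only re-appends '?' when the query
    component is non-empty, so a bare trailing '?' is dropped."""
    p = urlparse(url)
    return urlunparse((p.scheme, p.netloc, p.path, p.params, p.query, p.fragment))

a = "https://wiki2.website.com/how_to_build_kitchen"
b = "https://wiki2.website.com/how_to_build_kitchen?"

print(a == b)             # -> False: as raw strings these are distinct URLs
print(normalize(b) == a)  # -> True: dropping the empty query makes them identical
```

Search engines generally do collapse a bare trailing "?" themselves, but a canonical tag removes any ambiguity and also consolidates the link equity from the wiki 1 redirects onto the clean URL.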
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, my homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions on what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword, I took the following steps to try and fix the issue: I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure we weren't getting an over-optimization penalty. Cleaned up some of my links, because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools). Changed the title tags/H tags on secondary pages so they don't feature the primary keyword as much. Made some pages 'noindex' to see if this would take away the emphasis on the secondary pages. Resubmitted my XML sitemaps to Google. Recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however, it's not a local business; it sells Australia-wide). Added some new backlinks from AU sites (only a handful though). The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation, to make sure the keyword isn't featured there.
Some other notes on the site: when I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I do have links from AU sites, but also from .com domains and elsewhere. Any tips or advice would be fantastic. Thanks
Intermediate & Advanced SEO | AdaptDigital
-
Why is my site not getting crawled by Google?
Hi Moz Community, I have an escort directory website that is built with AJAX. We basically followed all the recommendations, like implementing the escaped-fragment code so Google would be able to see the content. The problem is that whenever I submit my sitemap in Google Webmaster Tools, it always shows 700 URLs submitted but only 12 static pages indexed. I did a site: query and only a handful of pages were indexed. Does it have anything to do with my site being on HTTPS and not on HTTP? My site is under HTTPS and all my content is AJAX-based. Thanks
Intermediate & Advanced SEO | en-gageinc
-
Google Fetch Issue
I'm having some problems with what Google is fetching and what it isn't, and I'd like to know why. For example, Google IS fetching a non-existent page but listing it as an error: http://www.gaport.com/carports, while the actual URL is http://www.gaport.com/carports.htm. Google is NOT able to fetch http://www.gaport.com/aluminum/storage-buildings-10x12.htm. It says the page doesn't exist (even though it does), and when I click on the not-found link in Google Fetch it adds %E@%80%8E to the URL, causing the problem. One theory we have is that this may be some sort of server/hosting problem, but that's only really because we can't figure out what we could have done to cause it. Any insights would be greatly appreciated. Thanks and Happy Holidays! Ruben
Intermediate & Advanced SEO | KempRugeLawGroup
-
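On the appended characters in the Google Fetch question above: sequences like that are usually the percent-encoding of an invisible Unicode character that was pasted into a link somewhere. Assuming the intended sequence was %E2%80%8E (the version in the question looks slightly garbled), it decodes to the left-to-right mark, which would indeed break the URL while staying invisible in a browser. A quick way to check such a sequence:

```python
import unicodedata
from urllib.parse import unquote

seq = "%E2%80%8E"  # assumed: the sequence from the question, un-garbled
decoded = unquote(seq)

print(repr(decoded))              # -> '\u200e'
print(unicodedata.name(decoded))  # -> LEFT-TO-RIGHT MARK
```

If that is the culprit, the fix is to find the internal link or CMS field carrying the invisible character and retype it cleanly, rather than to change anything server-side.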
Google Places not appearing
Is it possible to be sandboxed for a Google Places page? One of our clinics has a Places page, and it was doing fine (http://www.google.com/maps/place?cid=5542269234389030356), but now whenever we set our location to Trinity, FL and try to search for weight loss, weight loss trinity, etc., it doesn't come up. It only comes up if we search medi weight loss trinity. Also, when we go into our Google Places dashboard and try to edit the pictures, it doesn't show the same pictures as on the actual location's page. For example, in our dashboard we have 5 pictures, but on the actual Places page, 3 pictures are showing (none of which are in our dashboard). Any ideas?
Intermediate & Advanced SEO | AustinBarton
-