Googlebot is indexing URLs with ? query strings in them. Is this Panda duplicate content?
-
I feel like I'm being damaged by Panda because of duplicate content, as I have seen Googlebot on my site indexing hundreds of URLs with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later. I'm at a loss as to what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and one has dropped over 100 positions.
-
Thanks for all that. Really valuable information. I have gone to Parameter handling and there were 54 parameters listed, in total generating over 20 million unnecessary URLs. I nearly died when I saw it. We have 6,000 genuine pages and 20 million shitty ones that don't need to be indexed. Thankfully, I'm upgrading next week; I have turned the feature off on the current site, and the new one won't have that feature. Phew.
I have changed the settings for these parameters that were already listed in Webmaster tools, and now I wait for the biggest re-index in history LOL!
I have submitted a sitemap now and as I rewrite page titles & meta descriptions, I'm using the Fetch as Google tool to ask for resubmission. It's been a really valuable lesson, and I'm just thankful that I wasn't hit worse than I was. Now, it's a waiting game.
Of the 6,000 URLs on the sitemap I submitted a couple of days ago, around a third have been indexed. When I first uploaded it, only 126 of them were.
-
The guys here are all correct - you can handle these in WMT with parameter handling, but as every piece of text about parameter handling states, handle with care. You can end up messing things up big-time if you block areas of the site you do want crawled.
You'll also have to wait days, or longer, for Google to acknowledge the changes and reflect them in its index and in WMT.
If it's an option, look at using a self-referencing canonical tag: this means that if the CMS serves the same content on multiple URLs, they'll all point back to the original URL.
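As a sketch, a self-referencing canonical looks like this (the domain and filename here are placeholders, not the poster's actual URLs):

```html
<!-- In the <head> of /widgets.html — also emitted unchanged when the
     page is requested as /widgets.html?fsdgsgs or any other variant -->
<link rel="canonical" href="https://www.example.com/widgets.html" />
```

Because the tag is identical on every URL variant, the parameterised copies all declare the clean URL as the original.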
-
"They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later."
Google will continue to index them until you tell them specifically not to. Go to GWT and resubmit a sitemap containing only the URLs you want indexed. Additionally, do a "Fetch as Google" on the same pages as your sitemap. This can help speed up the reindexing process.
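A minimal clean-URL-only sitemap follows the standard sitemaps.org format; something like this, with placeholder URLs standing in for your genuine pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/widgets.html</loc>
  </url>
  <!-- one <url> entry per genuine page; no ?parameter variants -->
</urlset>
```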
Also, hours? LMAO, it will take longer than that. Unless you are a huge site that gets crawled hourly, it can take days, if not weeks, for those URLs to disappear. I'm thinking longer, since it doesn't sound like you have redirected those links, just turned off the plugin that created them. Depending on how your store is set up and how many pages you have, it may be wise to 301 all the offending pages to their proper destination URLs.
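If the store happens to run on Apache, one way to sketch that 301 is a rewrite rule that strips the query string from any .html URL. This is an illustrative sketch only: test it on a staging copy first, and add conditions to exempt any query strings your store genuinely needs.

```apacheconf
# .htaccess — 301 any .html URL carrying a query string to the bare URL
RewriteEngine On
# Only fire when a non-empty query string is present
RewriteCond %{QUERY_STRING} .
# The trailing "?" in the substitution discards the query string
RewriteRule ^(.+\.html)$ /$1? [R=301,L]
```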
-
Check out the URL parameter exclusion options in Webmaster Tools. You can tell the search engines to ignore these appended parameters.
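A blunter complement to the Webmaster Tools setting (not mentioned above, so treat it as an optional extra) is blocking all query-string URLs in robots.txt. Note the caveat: this stops crawling, not the indexing of URLs Google already knows about, and it would also block any legitimate parameterised pages, so only use it if no query-string URL on the site should ever be crawled.

```
User-agent: *
Disallow: /*?
```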
-
Use a spidering tool, such as Screaming Frog, to check all of the links on your site.
Also check that your XML and HTML sitemaps don't contain old links.
Hope this helps