Googlebot is indexing URLs with ? query strings in them. Is this Panda duplicate content?
-
I feel like I'm being damaged by Panda because of duplicate content, as I've seen Googlebot on my site indexing hundreds of URLs with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off, but Googlebot is still indexing them hours later. I'm at a loss for what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and had one drop over 100 positions.
-
Thanks for all that. Really valuable information. I went to parameter handling and there were 54 parameters listed, generating over 20 million unnecessary URLs in total. I nearly died when I saw it. We have 6,000 genuine pages and 20 million shitty ones that don't need to be indexed. Thankfully, I'm upgrading next week; I have turned the feature off on the current site, and the new one won't have it. Phew.
I have changed the settings for these parameters that were already listed in Webmaster tools, and now I wait for the biggest re-index in history LOL!
I have submitted a sitemap now and as I rewrite page titles & meta descriptions, I'm using the Fetch as Google tool to ask for resubmission. It's been a really valuable lesson, and I'm just thankful that I wasn't hit worse than I was. Now, it's a waiting game.
Of the 6,000 URLs on the sitemap I submitted a couple of days ago, around a third have now been indexed. When I first uploaded it, only 126 of them were.
-
The guys here are all correct - you can handle these in WMT with parameter handling, but as every piece of text about parameter handling states, handle with care. You can end up messing things up big-time if you block areas of the site you do want crawled.
You'll also have to wait days or longer for Google to acknowledge the changes and reflect them in its index and in WMT.
If it's an option, look at using the canonical tag to self-reference: that way, if the CMS serves the same content on multiple URLs, the variants all point back to the original URL.
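A self-referencing canonical is a single line in the page's head (a minimal sketch; example.com and the page name stand in for your own domain and URLs):

```html
<!-- Served identically on /widgets.html and /widgets.html?fsdgsgs,
     so every parameterised variant points back to the clean URL -->
<link rel="canonical" href="http://www.example.com/widgets.html" />
```

Google then treats the parameterised variants as copies of the clean URL rather than as separate duplicate pages.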
-
"They were beign generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later."
Google will continue to index them until you tell it specifically not to. Go to GWT and resubmit a sitemap containing only the URLs you want indexed. Additionally, do a "Fetch as Google" on the same pages as your sitemap. This can help speed up the reindexing process.
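A clean sitemap lists only the canonical pages, nothing parameterised (a sketch; example.com and the product URL are placeholders for your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per genuine page; no ?fsdgsgs variants -->
  <url>
    <loc>http://www.example.com/product.html</loc>
  </url>
</urlset>
```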
Also, hours? LMAO, it will take longer than that. Unless you are a huge site that gets crawled hourly, it can take days, if not weeks, for those URLs to disappear. I suspect longer, since it doesn't sound like you have redirected those links, just turned off the plugin that created them. Depending on how your store is set up and how many pages you have, it may be wise to 301 all the offending pages to their proper destination URLs.
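If the store runs on Apache, one way to 301 every parameterised .html URL back to its clean version is a mod_rewrite rule (a sketch only, assuming Apache with mod_rewrite enabled; test it carefully before deploying, since it strips every query string on .html pages, including any you might actually need):

```apache
RewriteEngine On
# Only act when a query string is present
RewriteCond %{QUERY_STRING} .
# Redirect /page.html?anything to /page.html
# (the trailing ? on the target drops the query string)
RewriteRule ^(.+\.html)$ /$1? [R=301,L]
```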
-
Check out parameter exclusion options in Webmaster Tools. You can tell the search engines to ignore these appended parameters.
-
Use a spidering tool to check out all of the links from your site, such as Screaming Frog.
Also check that your XML and HTML sitemaps don't contain old links.
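If you export the crawled URL list from a tool like Screaming Frog, a quick script can tally which query parameters are generating the bloat before you set up parameter handling (a minimal sketch using Python's standard library; the URLs below are made up for illustration):

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

def count_parameters(urls):
    """Tally how often each query parameter key appears across crawled URLs."""
    counts = Counter()
    for url in urls:
        # parse_qs returns {} for URLs with no query string
        for key in parse_qs(urlparse(url).query):
            counts[key] += 1
    return counts

crawled = [
    "http://example.com/shoes.html?color=red&size=9",
    "http://example.com/shoes.html?color=blue",
    "http://example.com/shoes.html",
]
print(count_parameters(crawled))
```

Sorting the resulting counts shows at a glance which parameters account for most of the unnecessary URLs.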
Hope this helps