Googlebot indexing URLs with ? queries in them. Is this Panda duplicate content?
-
I feel like I'm being damaged by Panda because of duplicate content: I've seen Googlebot indexing hundreds of URLs on my site with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later, and I'm at a loss what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and had one page drop over 100 positions.
-
Thanks for all that. Really valuable information. I went to Parameter handling and there were 54 parameters listed, generating over 20 million unnecessary URLs in total. I nearly died when I saw it. We have 6,000 genuine pages and 20 million junk ones that don't need to be indexed. Thankfully, I'm upgrading next week: I've turned the feature off on the current site, and the new one won't have it. Phew.
I have changed the settings for the parameters that were already listed in Webmaster Tools, and now I wait for the biggest re-index in history, LOL!
I have submitted a sitemap now and as I rewrite page titles & meta descriptions, I'm using the Fetch as Google tool to ask for resubmission. It's been a really valuable lesson, and I'm just thankful that I wasn't hit worse than I was. Now, it's a waiting game.
Of the 6,000 URLs on the sitemap I submitted a couple of days ago, around a third have been indexed. When I first uploaded it, only 126 of them were.
-
The guys here are all correct: you can handle these in WMT with parameter handling, but as every guide to parameter handling states, handle with care. You can mess things up big-time if you block areas of the site you do want crawled.
You'll also have to wait days, or longer, for Google to acknowledge the changes and reflect them in its index and in WMT.
If it's an option, look at using a self-referencing canonical tag: if the CMS serves the same page on multiple URLs, they'll all point back to the original URL.
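For reference, a self-referencing canonical is one line in the page's head; the URL below is just a placeholder for your real product page:

```html
<!-- Placed in the <head> of the clean page; parameterized variants
     such as /widgets.html?fsdgsgs then point back to this URL. -->
<link rel="canonical" href="https://www.example.com/widgets.html" />
```

Most store platforms can emit this automatically, so check for a setting or plugin before hand-editing templates.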
-
"They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later."
Google will continue to index them until you tell it specifically not to. Go to GWT and resubmit a sitemap containing only the URLs you want indexed. Additionally, do a "Fetch as Google" on the same pages as your sitemap. This can help speed up the re-index process.
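If it helps, a minimal sitemap listing only the clean URLs looks something like this (the URLs here are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical, parameter-free URLs here -->
  <url>
    <loc>https://www.example.com/widgets.html</loc>
  </url>
  <url>
    <loc>https://www.example.com/gadgets.html</loc>
  </url>
</urlset>
```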
Also, hours? LMAO, it will take longer than that. Unless you are a huge site that gets crawled hourly, it can take days, if not weeks, for those URLs to disappear. I'm thinking longer, since it doesn't sound like you have redirected those links, just turned off the plugin that created them. Depending on how your store is set up, and how many pages you have, it may be wise to 301 all the offending pages to their proper destination URLs.
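On an Apache store, that 301 could be sketched in .htaccess roughly like this (a hedged example only; the right rule depends on your server and CMS, so test on a staging copy first):

```apache
RewriteEngine On
# If the request carries any query string at all...
RewriteCond %{QUERY_STRING} .
# ...301-redirect the .html page to itself with the query string dropped
# (the trailing "?" in the substitution discards the query string).
RewriteRule ^(.+\.html)$ /$1? [R=301,L]
```

This sends /widgets.html?fsdgsgs to /widgets.html with a permanent redirect, which also passes along whatever link equity the junk URLs have picked up.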
-
Check out the parameter exclusion options in Webmaster Tools. You can tell the search engines to ignore these appended parameters.
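Alongside the Webmaster Tools setting, a blunter option is robots.txt. For example, this would stop crawling of any URL containing a query string (use it with the same care as parameter handling: already-indexed URLs won't drop out immediately, and any parameterized URLs you do want crawled would be blocked too):

```text
User-agent: *
# Block crawling of every URL containing a "?" (i.e. any query string)
Disallow: /*?
```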
-
Use a spidering tool, such as Screaming Frog, to check all of the links from your site.
Also check that your XML and HTML sitemaps don't contain old links.
Hope this helps