Removing pages from index
-
Hello,
I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in the design, and I am working on fixing it (301 redirecting non-existent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal?
Also, should I 301, 404 or 410 these pages?
Any help would be appreciated.
Thanks,
Alex
-
Yes, the no-content page issue is a big problem. If you have a "view all" option and it's more than a dozen, fifteen, or maybe 20 products, that should be paginated, with full indexing. Maile Ohye even talked about that specific scenario of "view all" being good.
In my experience, all of the no-content pages should, ideally, be 301 redirected in a way that they point to the most relevant highest level category page on your site.
Since there are so many, there's no easy way to get them removed from the index other than doing the 301s, then being patient as Google recrawls and re-confirms.
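To illustrate the 301 approach: a minimal sketch of the server-side check in Python (the site in question is classic ASP, so treat this as pseudocode for the logic; `total_pages` and the category URL are illustrative names, not part of the original site):

```python
# Sketch: decide how to answer a ?page=N request. Page numbers past the
# real end of the category get a permanent redirect to the most relevant
# category page; valid pages are served normally.

def respond_to_page_request(requested_page: int, total_pages: int,
                            category_url: str):
    """Return an (http_status, redirect_target_or_none) tuple."""
    if requested_page < 1 or requested_page > total_pages:
        # No content exists for this page number: 301 to the category
        # page instead of serving an empty "no content" page.
        return (301, category_url)
    return (200, None)

print(respond_to_page_request(5000, 12, "/category-name.asp"))
# (301, '/category-name.asp')
print(respond_to_page_request(3, 12, "/category-name.asp"))
# (200, None)
```

Once every out-of-range request answers with a 301, the indexed junk pages resolve themselves as Google recrawls, which is the "be patient" part above.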
-
Ah - that's definitely better, if you don't go too wide. 2009 - 2010's concept of not having too many links went too far with too many people. Sites became too flat.
Categories and pagination are best served by having enough categories to cover the highest-level groups, with sub-categories as appropriate, but not to the point where there's only a few products in any single sub-category. So if you've got more than a dozen or fifteen products in a category or sub-category, pagination is perfectly valid.
Having more than six, eight, or maybe ten categories at most is also not good.
-
Alan, I think I misspoke. I meant to say that a categorically structured set of your products would be better to index than a paginated version. For example:
http://www.sunglasses.com/mens/black/productx
as opposed to http://www.sunglasses.com/products?page=233
Is it still considered wise to index both paginated results along with categorized results in this case?
-
Hi Alan,
Thanks for the info. I was going to set my page 2+ to "noindex,follow", however your reply makes sense. I will leave them indexable. I do see some competitors "rel=canonical" pagination to "view all" pages. I think I will keep my pages as is.
However, as my reply to Ryan stated, my issue is still the INDEX.
Google has thousands of "no content" pages indexed. They contain links to other "no content" pages making my site look thin. This may be the reason we lost so much ranking/traffic with Panda update.
How do I get these pages removed from the index? And do I return 301, 404 or 410 when Google comes back to reindex them?
Thanks for your help!
Alex
-
Hi Ryan,
I crawled the site and did not find links to these pages; however, it made me realize another HUGE issue. Since the paging is dynamically created, it generates "back" & "forward" links no matter what page you are on. So, if page #5000 is displayed, it will have links to page #4999 and #5001. Although my website has no links to pages that do not exist, all it takes is someone linking to my site with "page=10000" and Google indexing that page. From that point on, G will index all the PAGEs that do not exist.
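That open-ended link generation can be closed off by only emitting prev/next links that fall inside the valid range, regardless of what page number was requested. A rough Python sketch of the idea (the function and URL names are illustrative, not from the actual site):

```python
# Sketch: build "back"/"forward" pagination links only when the target
# page actually exists, so a request for ?page=10000 can't mint links
# to further nonexistent pages.

def pagination_links(current_page: int, total_pages: int, base_url: str):
    links = {}
    if 1 < current_page <= total_pages:
        links["prev"] = f"{base_url}?page={current_page - 1}"
    if 1 <= current_page < total_pages:
        links["next"] = f"{base_url}?page={current_page + 1}"
    return links

# A request far past the end generates no pagination links at all:
print(pagination_links(5000, 12, "/category-name.asp"))  # {}
# A valid middle page links both ways:
print(pagination_links(5, 12, "/category-name.asp"))
```

Combined with a 301 for out-of-range page numbers, this stops the chain of nonexistent pages from ever being discoverable.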
Thanks again for getting me a step closer to resolving my problem.
However, the problem is still the INDEX. Google has (now realizing that it's in the thousands) pages indexed with no content. These pages just contain links to other PAGING pages that have no content, plus my main menu/categories.
How do I get these pages removed from the index?
Thanks again!
Alex
-
For the record, the link that SSCDavis referenced includes Matt Cutts discussing faceted navigation, not pagination. Faceted navigation is different from pagination by leaps and bounds. So he (SSCDavis), with all due respect, is absolutely incorrect in his claim of what Matt said.
Maile Ohye, Senior support engineer at Google, definitely recommends allowing pagination to be indexed, if implemented properly. She even discussed this at length this week up at SMX Advanced in Seattle. Vanessa Fox, head of Nine by Blue, and former Googler (the creator of Google Webmaster Tools) agrees.
And so do I.
When performed properly, pagination (with quality optimization of paginated pages) can lead to dramatic increases in individual products indexed, higher quality visits from people further along in the buying process, and more people finding the site through an exponentially greater number of keyword phrases.
Consider this: in pagination (X products on the initial page, X additional DIFFERENT products on page 2, still more different products on page 3, etc.), by not wanting those pages indexed you're communicating to Google: "hey, we don't care about these other products enough to include them." Which means Google gets a false and negative understanding of how many products you have in your catalog. And THAT drives the overall strength of your catalog down.
Now, if, on the other hand, you already show ALL of your products on a top level page that is linked from the main navigation, then sure, pagination should be killed. But only if that's the case.
-
Alex,
I would highly recommend crawling your website and examining the crawl report. If Google is indexing these pages, then it got to them on your site at some point. I would proceed with the idea in mind that this is a web design issue, not someone trying to ruin your rankings, as you suggested.
The crawl report will show the referrer page which can help troubleshoot the issue. When you have pages generated by a CMS or other software, there can easily be issues like the one you are experiencing. In my experience this is the most likely cause of your issue.
You mentioned there are hundreds of these pages in the index. If you can determine a pattern they match, it is possible you can 301 all of them with a single rule, sending the user to your main category page or wherever you feel is best.
You can also set up parameter-specific instructions in Google WMT. I would avoid doing this until after you have reviewed your crawl report. From your Google WMT dashboard > Site Configuration > Settings > Parameter handling tab > find or add your parameter and adjust the settings as you deem fit.
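If the stray URLs all share the same `?page=` pattern, the "single rule" idea can be sketched as a regex match. In practice the rule would live in your server's rewrite configuration; the pattern, cutoff, and target below are assumptions for illustration:

```python
import re

# Sketch of a single pattern-based 301 rule: any category URL carrying a
# page parameter above the highest real page redirects to the bare
# category URL. MAX_REAL_PAGE and the URL pattern are assumptions.

MAX_REAL_PAGE = 12
PAGE_RE = re.compile(r"^(/[\w-]+\.asp)\?page=(\d+)$")

def redirect_target(url: str):
    """Return the 301 target for an out-of-range pagination URL, else None."""
    m = PAGE_RE.match(url)
    if m and int(m.group(2)) > MAX_REAL_PAGE:
        return m.group(1)  # strip the page parameter entirely
    return None

print(redirect_target("/category-name.asp?page=3434532"))  # /category-name.asp
print(redirect_target("/category-name.asp?page=3"))        # None
```

One rule like this covers every bad URL in the pattern at once, which is far more practical than filing hundreds of individual removal requests.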
-
Edit: Please see Alan's answer.