Removing pages from index
-
Hello,
I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in the design, and I am working on fixing it (301 redirecting non-existent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal?
Also, should I 301, 404 or 410 these pages?
Any help would be appreciated.
Thanks,
Alex
-
Yes, the no-content page issue is a big problem. If you have a "view all" option and the category holds more than a dozen, fifteen or maybe twenty products, it should be paginated, with full indexing. Maile Ohye even talked about that specific scenario of "view all" being good.
In my experience, all of the no-content pages should, ideally, be 301 redirected in a way that they point to the most relevant highest level category page on your site.
Since there are so many, there's no easy way to get them removed from the index other than implementing the 301s and then being patient while Google recrawls and re-confirms.
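The redirect logic described above can be sketched in a few lines. This is a minimal illustration only (the original site is classic ASP, and the function name, URL, and per-page count here are hypothetical): any `?page=` value beyond the category's real page count gets a 301 back to the most relevant category page, while real pages are served normally.

```python
import math

def pagination_response(category_url: str, requested_page: int,
                        product_count: int, per_page: int = 20):
    """Return (status_code, location) for a paginated category URL."""
    # How many pages actually exist for this category (at least one).
    total_pages = max(1, math.ceil(product_count / per_page))
    if 1 <= requested_page <= total_pages:
        return (200, None)          # real page: serve it, keep it indexable
    return (301, category_url)      # empty page: redirect up to the category

print(pagination_response("/mens-sunglasses/", 3, 45))        # (200, None)
print(pagination_response("/mens-sunglasses/", 3434532, 45))  # (301, '/mens-sunglasses/')
```

With 45 products at 20 per page there are only 3 real pages, so the absurd `page=3434532` request from the original question collapses to a single 301.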
-
Ah - that's definitely better, if you don't go too wide. The 2009-2010 concept of not having too many links went too far with too many people. Sites became too flat.
Categories and pagination are best served by having enough categories to cover the highest-level groups, with sub-categories as appropriate, but not to the point where there are only a few products in any single sub-category. So if you've got more than a dozen or fifteen products in a category or sub-category, pagination is perfectly valid.
Having more than six, eight, or maybe ten categories at most is also not good.
-
Alan, I think I misspoke. I meant to say that a categorically structured set of your products would be better to index than a paginated version. For example:
http://www.sunglasses.com/mens/black/productx
as opposed to http://www.sunglasses.com/products?page=233
Is it still considered wise to index both paginated results along with categorized results in this case?
-
Hi Alan,
Thanks for the info. I was going to set my page 2+ to "noindex,follow", however your reply makes sense. I will leave them indexable. I do see some competitors "rel=canonical" pagination to "view all" pages. I think I will keep my pages as is.
However, as my reply to Ryan stated, my issue is still the INDEX.
Google has thousands of "no content" pages indexed. They contain links to other "no content" pages, making my site look thin. This may be the reason we lost so much ranking/traffic with the Panda update.
How do I get these pages removed from the index? And do I return 301, 404 or 410 when Google comes back to reindex them?
Thanks for your help!
Alex
-
Hi Ryan,
I crawled the site and did not find links to these pages. However, it made me realize another HUGE issue. Since the paging is dynamically created, it has "back" and "forward" links no matter what page you are on. So, if page #5000 is displayed, it will have links to page #4999 and #5001. Although my website does not link to pages that do not exist, all it takes is for someone to link to my site with "page=10000" and for Google to index that page. From that point on, Google will index all the PAGEs that do not exist.
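The bug described above is that prev/next links are emitted unconditionally, so a crawler landing on a fabricated `?page=10000` URL finds links to 9999 and 10001 and keeps walking forever. A bounds-checked version might look like this (a hypothetical helper, not the site's actual code; names and the per-page count are illustrative):

```python
import math

def pagination_links(base_url: str, page: int, product_count: int,
                     per_page: int = 20):
    """Return only the prev/next links that point at real pages."""
    total_pages = max(1, math.ceil(product_count / per_page))
    links = {}
    if 1 < page <= total_pages:
        # A "back" link only exists when the current page is real and > 1.
        links["prev"] = f"{base_url}?page={page - 1}"
    if page < total_pages:
        # A "forward" link only exists when there is a real next page.
        links["next"] = f"{base_url}?page={page + 1}"
    return links

# Page 2 of 3 links both ways; a fabricated page 10000 links nowhere.
print(pagination_links("/category-name.asp", 2, 45))
print(pagination_links("/category-name.asp", 10000, 45))   # {}
```

Because an out-of-range page emits no pagination links at all, a single bad inbound link can no longer seed an endless chain of phantom pages for Googlebot to crawl.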
Thanks again for getting me a step closer to resolving my problem.
However, the problem is still the INDEX. Google has (I now realize it's in the thousands) pages indexed with no content. These pages just contain links to other PAGING pages that have no content, plus my main menu/categories.
How do I get these pages removed from the index?
Thanks again!
Alex
-
For the record, that link that SSCDavis referenced includes Matt Cutts discussing faceted navigation, not pagination. Faceted navigation is different from pagination by leaps and bounds. So he (SSCDavis), with all due respect, is absolutely incorrect in his claim of what Matt said.
Maile Ohye, a senior support engineer at Google, definitely recommends allowing pagination to be indexed, if implemented properly. She even discussed this at length this week at SMX Advanced in Seattle. Vanessa Fox, head of Nine by Blue and a former Googler (the creator of Google Webmaster Tools), agrees.
And so do I.
When performed properly, pagination (with quality optimization of the paginated pages) can lead to dramatic increases in the number of individual products indexed, higher-quality visits from people further along in the buying process, and more people finding the site through an exponentially greater number of keyword phrases.
Consider this: with pagination (X products on the initial page, X additional DIFFERENT products on page 2, X more still different products on page 3, etc.), by not wanting those pages indexed, you're communicating to Google: "Hey, we don't care about these other products enough to include them." That gives Google a false and negative understanding of how many products you have in your catalog. And THAT drives the overall strength of your catalog down.
Now, if, on the other hand, you already show ALL of your products on a top level page that is linked from the main navigation, then sure, pagination should be killed. But only if that's the case.
-
Alex,
I would highly recommend crawling your website and examining the crawl report. If Google is indexing these pages, then it got to them on your site at some point. I would proceed with the idea that this is a web design issue, not someone trying to ruin your rankings, as you suggested.
The crawl report will show the referrer page which can help troubleshoot the issue. When you have pages generated by a CMS or other software, there can easily be issues like the one you are experiencing. In my experience this is the most likely cause of your issue.
You mentioned there are hundreds of these pages in the index. If you can determine a pattern they match, it may be possible to 301 all of them with a single rule, sending the user to your main category page or wherever you feel is best.
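As a sketch of that "single rule" idea, the matching logic could look like the following. On a real ASP/IIS site this would live in a URL-rewrite rule rather than application code; the regex, URL shapes, and helper name here are all assumptions for illustration:

```python
import re

# Matches any category page of the form /<category-name>.asp
CATEGORY_PATTERN = re.compile(r"^/(?P<category>[\w-]+)\.asp$")

def redirect_target(path: str, query_page: int, total_pages: int):
    """Return the 301 target for an out-of-range paginated URL, else None."""
    m = CATEGORY_PATTERN.match(path)
    if m and query_page > total_pages:
        # Strip the bogus ?page= and send visitors (and Googlebot)
        # to the category's main page.
        return f"/{m.group('category')}.asp"
    return None

print(redirect_target("/category-name.asp", 3434532, 12))  # /category-name.asp
print(redirect_target("/category-name.asp", 3, 12))        # None (real page)
```

One pattern like this covers every phantom page at once, so there is no need to enumerate hundreds of individual redirects.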
You can also set up parameter-specific instructions in Google WMT. I would avoid doing this until after you have reviewed your crawl report. From your Google WMT dashboard: Site Configuration > Settings > Parameter handling tab > find or add your parameter and adjust the setting as you see fit.
-
Edit: Please see Alan's answer.