Huge Google index with irrelevant pages
-
Hi,
I run a site about sports matches. Every match has its own page, and the pages are generated automatically from the DB. The pages are not duplicated, but over time some of them look fairly similar. After a match finishes, its page has no internal links or sitemap entry, but it is still reachable by direct URL and remains in Google's index. Over time this has grown to more than 100,000 indexed pages.
Since past matches have no significance, their pages are unlinked, and a match-up can repeat (which may look like duplicate content), what do you suggest we do when a match is finished and its page, although unlinked, still appears in the index and the SERPs?
-
301 redirect the match page to the match category, which is a higher level in the hierarchy and is always relevant?
-
Use rel=canonical pointing to the match category?
-
Do nothing?
*A 301 redirect will shrink my index status, and some say a high indexed-page count is good...
*Is it safe to 301 redirect 100,000 pages at once? Wouldn't it look strange to Google?
*Would rel=canonical remove the past match pages from the index?
What do you think?
Thanks,
Assaf.
-
-
In terms of what you've written, blocking a page via robots.txt doesn't remove it from the index; it simply prevents crawlers from reaching the page. If you block a page via robots.txt, the page remains in the index, Google just can't revisit it to see whether anything has changed. So if you block a page via robots.txt and also add a noindex tag to it, Google won't be able to see the noindex tag, and therefore can't remove the page from the index, because the page is blocked.
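To make that conflict concrete, here's a sketch with hypothetical paths: the robots.txt rule below stops Googlebot from ever fetching the page, so the meta tag on the page is never seen and never acted on.

```text
# robots.txt -- this blocks crawling, NOT indexing
User-agent: *
Disallow: /old-matches/

<!-- on a page under /old-matches/ -- invisible to Google while blocked -->
<meta name="robots" content="noindex">
```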
If you moved all of your old content to a different folder and blocked that folder via robots.txt, Google still wouldn't remove those pages from the index. To remove them, you would have to go into Webmaster Tools and use the URL removal tool on that new folder. If Google sees the folder is blocked via robots.txt, then and only then will it remove the content from the index; the folder has to be blocked via robots.txt first for the URL removal tool to remove the whole folder.
I'm not sure, though, how this would play out going forward: if you removed a folder from the index and then later moved more previously indexed content into that folder, I'm not sure what would happen to it. Either way, Google has to come back and recrawl each page to see that it has moved to the new folder before removing it from the index, so the content will only disappear once Google recrawls the old URLs.
So I still think the better way to remove the content from the index is to add the noindex tag to the old pages. To help the search engines reach these old pages, make sure there is still a path they can follow to get to them.
Another good idea I saw in a forum post here a while ago: create a sitemap containing all of the old pages that are indexed and that you want removed. With the pages tagged noindex and the sitemap submitted through the Webmaster Tools sitemap interface, you can then monitor the progress of deindexation over time: Webmaster Tools reports how many of the sitemap's URLs are indexed, so watching that number fall is a good indicator of how the deindexation is progressing.
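A minimal monitoring sitemap might look like the sketch below, using the match URL pattern from this thread (the URLs are hypothetical; the noindex tag lives on the pages themselves, the sitemap only lets Webmaster Tools report how many of them are still indexed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- old, already-noindexed match pages you want to watch drop out -->
  <url><loc>http://www.domain.com/sport/match/T1vT2</loc></url>
  <url><loc>http://www.domain.com/sport/match/T3vT4</loc></url>
</urlset>
```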
-
Dear Mark,
*I've sent you a private message.
I'm starting to understand I have a much bigger problem.
*My index status contains 120k pages, while only 2,000 are currently relevant.
Your suggestion is: after a match finishes, programmatically add a noindex tag to the page, and Google will remove it from its index. That could work for relatively new pages, but since very old pages have no links or sitemap entries, it could take a very long time to clear the index, because they're rarely crawled, if at all.
- A more aggressive approach would be to change the site architecture and restrict, via robots.txt, the folder that holds all the past, irrelevant pages.
So if today a match URL looks like this: www.domain.com/sport/match/T1vT2
restrict www.domain.com/sport/match/ in robots.txt,
and from now on create all new matches in a different folder, like: www.domain.com/sport/new-match-dir/T1vT2
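Under that scheme, the robots.txt at www.domain.com would look something like this sketch (keeping in mind, per the answer above, that blocking alone doesn't deindex anything; the URL removal tool step would still be needed):

```text
# www.domain.com/robots.txt
User-agent: *
# old match pages, blocked prior to using the URL removal tool
Disallow: /sport/match/
# new matches under /sport/new-match-dir/ remain crawlable
```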
-
Is this a good solution?
-
Wouldn't Google penalize me for removing a directory with 100k pages?
-
If it's a good approach, how long will it take Google to clear all those pages from its index?
I know it's a long one, and I'll really appreciate your response.
Thanks a lot,
Assaf.
-
There are a bunch of articles out there, but each case is different; here are a few:
http://www.searchenginejournal.com/the-holy-grail-of-panda-recovery-a-1-year-case-study/45683/
You can contact me via private message here on the forum and I can try to take a more in depth look at your site if you can give me some more detailed info.
-
Yes. When the first Panda update was rolled out, I lost 50% of my traffic from Google and haven't really recovered since.
-
Before we talk about a Panda hit, are you sure you actually got hit by Panda?
-
Thanks Mark!
Any good articles on how to recover from Panda?
-
Exactly - I'd build a strategy more around promoting pages that will have long-lasting value.
If you use the noindex, follow tag, link juice will continue to flow throughout the site; only the individual page carrying the tag will be excluded from the search results and from the index. For the tag to work, Google first has to crawl the page and see it, so removal doesn't happen instantaneously. If these deeper pages are crawled only once every few weeks, once a month, or even less often, it may take a while for them to drop out of the index.
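For reference, the tag goes in the head of each expired match page:

```html
<head>
  <!-- keep crawlers following links, but drop this page from the index -->
  <meta name="robots" content="noindex, follow">
</head>
```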
-
Hi Mark
-
These pages are very important while they are relevant (before the match finishes); they are the source of most of our traffic, which comes from long-tail searches.
-
Some of these pages have inbound links, and it would be a shame to lose all that juice.
-
Would noindex remove the pages from Google's index? How long would it take? Wouldn't a huge wave of noindexing also look suspicious?
-
By "evergreen pages" do you mean pages that are always relevant, like a league page or a sport page?
Thanks,
Assaf.
-
-
Hi Assaf,
(I'm not stalking you, I just think you've raised another interesting question)
In terms of index status/size, you don't want to carry a massive index of empty or low-value pages; this is food for Google's Panda algorithm and will not be good for your site in the long run. It'll get a Panda smack if it hasn't already.
To remove these pages from the index, instead of doing hundreds of thousands of 301 redirects (which your server won't like either), I'd recommend adding the noindex meta tag to the pages.
I'd put a rule in your CMS so that, after a certain point in time, it noindexes those pages. Make sure you also have evergreen pages on your site that can serve as landing pages for the search engines and that won't need to be removed after a short period of time. These are the pages you'll want to focus your outreach and link-building efforts on.
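A minimal sketch of such a CMS rule, with hypothetical field names and a hypothetical 30-day grace period (long-tail traffic often trails off gradually after a match ends):

```python
from datetime import datetime, timedelta

# how long a finished match keeps ranking before being dropped (assumption)
GRACE_PERIOD = timedelta(days=30)

def robots_meta(match_end_time, now=None):
    """Return the robots meta tag content for a match page."""
    now = now or datetime.utcnow()
    if match_end_time is not None and now - match_end_time > GRACE_PERIOD:
        # expired match: drop from the index but keep passing link juice
        return "noindex, follow"
    return "index, follow"
```

The template would then emit `<meta name="robots" content="...">` with whatever this returns, so old pages switch to noindex automatically the next time they're crawled.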
Mark