Why is Google not deindexing pages with the meta noindex tag?
-
On our website www.keystonepetplace.com, we added the meta noindex tag to category pages created by the sorting function.
Google no longer seems to be adding more of these pages to the index, but the pages that were already indexed still appear when I check via site:keystonepetplace.com.
Here is an example page: http://www.keystonepetplace.com/dog/dog-food?limit=50
How long should it take for these pages to disappear from the index?
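Before waiting on Google, it's worth confirming the noindex directive is actually being served in the page source. Here's a minimal sketch of such a check using Python's standard library (the class name and sample HTML are illustrative; in practice you'd fetch the live page with `urllib.request`):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots_directives.append(a.get("content", "").lower())

# A self-contained sample document standing in for the live category page.
sample_html = """<html><head>
<meta name="robots" content="noindex, follow">
<title>Dog Food</title>
</head><body></body></html>"""

parser = RobotsMetaParser()
parser.feed(sample_html)
has_noindex = any("noindex" in d for d in parser.robots_directives)
print(has_noindex)  # True
```

If this prints False for a page you believe is tagged, the tag may be added client-side or only on some URL variants, and Google would never see it.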
-
Google might have already crawled the pages but not yet processed the noindex tag. Be patient; if you have enough links coming in and the pages are less than three levels deep, they will all be recrawled and deindexed in no time.
-
I guess it depends on the urgency of your situation. If you were just trying to clean things up, it's okay to wait for Google to re-crawl and solve the problem. But if you have been affected by Panda and your site is not ranking, I personally would consider that an urgent enough need to use the tool.
-
This link almost makes it seem like I shouldn't use the Webmaster Tools removal tool:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119
-
The crawlers have so many billions of webpages to get to. We have more than 50,000 on our site; there are about 8,000 that they check more regularly than the others. Some are just really deep on the site and hard to get to.
-
You can remove entire category directories from the index in one command using the tool, but the URLs won't be removed from the cache, just the index. To remove them from the cache you'll need to enter each URL individually. If you are trying to clean things up for Panda reasons, I think removing them from the index is enough; I'm currently trying to decide whether removing them from the cache as well will speed things up.
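Since each cached URL has to be submitted individually, it can help to generate the full list of parameterized variants up front. A quick sketch (the category slugs and `limit` values here are hypothetical assumptions, not pulled from the actual site):

```python
from urllib.parse import urlencode

BASE = "http://www.keystonepetplace.com"
categories = ["/dog/dog-food", "/cat/cat-food"]   # hypothetical category paths
limits = [25, 50, 100]                            # hypothetical sort/limit options

# Every combination of category path and limit parameter that the sorting
# function could have generated, ready to paste into the removal tool.
removal_urls = [
    f"{BASE}{path}?{urlencode({'limit': n})}"
    for path in categories
    for n in limits
]

for url in removal_urls:
    print(url)
```

In practice you'd pull the real slugs from your sitemap or a site crawl rather than hard-coding them.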
-
OK, that makes sense. I wonder why it takes so long? I'll start the long process of manual removal.
-
Streamline Metrics has got it right.
I've seen pages take MONTHS to drop out of the index after being noindexed. It's best to use the URL removal tool in WMT (not to be confused with the disavow tool) to tell Google not only to deindex the pages but also to remove them from the cache. I have found that when you do this, the pages are gone within 12 hours.
-
In your experience how long does this normally take?
-
Yes, it was around December 2nd or 3rd that we added the noindex tags. It just seemed like Google wasn't removing any pages from the index yet. It did stop Google from adding more of these pages, though.
-
It all depends on how long it takes Google to re-crawl those pages with the noindex tag on them.
If you are in a hurry, I would do this along with the steps you have already taken to help speed the process up:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663419
-
Do you know when you added the noindex tags? Google will need to recrawl the pages to see the noindex tags before removing them. I just looked at one of your category pages, and it was cached by Google on December 1st with no noindex tag on it. How big your site is and how often it's crawled will determine when the pages are removed from the index. Here's Google's official explanation:
"When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it. Other search engines, however, may interpret this directive differently. As a result, a link to the page can still appear in their search results.
Note that because we have to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect the noindex meta tag. If your page is still appearing in results, it's probably because we haven't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, we won't be able to see the tag either.)
If the content is currently in our index, we will remove it after the next time we crawl it. To expedite removal, use the URL removal request tool in Google Webmaster Tools."
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
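The robots.txt caveat Google mentions above is worth double-checking, since it silently defeats the noindex tag. A small sketch using Python's standard `urllib.robotparser` (the Disallow rule below is hypothetical, just to illustrate the pitfall):

```python
import urllib.robotparser

# If a rule like this were in the site's robots.txt, Googlebot would never
# fetch the page, so it would never see the noindex tag on it.
robots_txt = [
    "User-agent: *",
    "Disallow: /dog/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt)

url = "http://www.keystonepetplace.com/dog/dog-food?limit=50"
crawlable = rp.can_fetch("Googlebot", url)
print(crawlable)  # False -> the noindex tag on this page would never be seen
```

To let the noindex tag do its job, the blocked paths must be crawlable; robots.txt blocking and meta noindex work against each other here.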
-
You could also handle these pages with a canonical tag, or block them via robots.txt (keeping in mind that blocking a page in robots.txt prevents Google from seeing its noindex tag).