Trying to figure out why one of my popular pages was de-indexed from Google.
-
I wanted to share this with everyone for two reasons: (1) to try to figure out why this happened, and (2) to make everyone aware of it so you can check some of your own pages if needed.
Someone on Facebook asked me a question that I knew I had answered in this post. I couldn't remember what the URL was, so I googled some of the terms I knew were on the page, and the page didn't show up. I did some more searches and found that the entire page was missing from Google.
This page has a good number of shares, comments, Facebook likes, etc. (i.e. social signals), and there are certainly no black- or gray-hat techniques being used on my site. This page received a decent amount of organic traffic as well.
I'm not sure when the page was de-indexed, and I wouldn't have even known if I hadn't tried to search for it via Google, which makes me concerned that perhaps other pages are being de-indexed too. It also concerns me that I may have done something wrong (without knowing it) and that perhaps other pages on my site are going to be penalized as well.
Does anyone have any idea why this page would be de-indexed? It sure seems like all the signals are there to show Google this page is unique and valuable. Interested to hear some of your thoughts on this.
Thanks
-
I did the Fetch as Google and everything looks normal. I also do not have any errors that have anything to do with this page.
It's very strange. It seems to have happened all of a sudden.
-
Go into your Webmaster Tools and do a "Fetch as Google" on the page. Also check for any errors while you're there.
Is it possible that you had a noindex tag on the page at least temporarily in the past?
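One quick way to rule that out is to check both the meta robots tag and the X-Robots-Tag response header, since either one can block indexing. A rough sketch in Python (the function name and regex are my own, not from any SEO tool, and the regex only handles the common name-before-content attribute order):

```python
import re

def is_noindexed(html, headers):
    """Return True if the page is blocked from indexing via a
    meta robots tag or an X-Robots-Tag HTTP header."""
    # X-Robots-Tag: noindex can be sent as a header (name is case-insensitive).
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # <meta name="robots" content="noindex, ..."> in the document head.
    meta = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, flags=re.IGNORECASE)
    return any("noindex" in content.lower() for content in meta)

# Hypothetical usage against a live page:
# import urllib.request
# resp = urllib.request.urlopen("http://www.yourdomain.com/page/")
# print(is_noindexed(resp.read().decode(), dict(resp.headers)))
```

If this returns True for the page today, the mystery is solved; if not, it at least doesn't rule out that a noindex was there temporarily in the past.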
-
Thanks for your message. I've already checked that, and the page was in fact de-indexed.
-
Hello
I've had a few pages fall off the radar; however, those pages weren't de-indexed, they just didn't show up for the keywords they used to. Try site:www.yourdomain.com/page/ to see if the page has actually been de-indexed.
The only conclusion I could come to about why my pages fell off a cliff was the ratio of natural to commercial anchor text: more than 50% of the anchor text pointing to those pages was exact-match keyword anchors.
Inner pages need brand and natural anchor-text profiles too, not just the home page (e.g. naked-URL anchors such as yourdomain.com/page, www.yourdomain.com/page, http://www.yourdomain.com/page).
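To put a number on that ratio, you can run a backlink export for the page through something like this. A rough sketch in Python, assuming you have a list of anchor-text strings and a set of target keywords (both hypothetical, and the sample data below is made up):

```python
def exact_match_ratio(anchors, keywords):
    """Fraction of backlink anchors that are an exact keyword match.
    `anchors` is a list of anchor-text strings pointing at one page;
    `keywords` is the set of money keywords that page targets."""
    if not anchors:
        return 0.0
    normalized = {k.strip().lower() for k in keywords}
    exact = sum(1 for a in anchors if a.strip().lower() in normalized)
    return exact / len(anchors)

# Hypothetical example: 3 of 5 anchors are exact-match,
# above the ~50% level described above.
anchors = ["blue widgets", "Blue Widgets", "mysite.com", "click here", "blue widgets"]
print(exact_match_ratio(anchors, {"blue widgets"}))  # prints 0.6
```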
Hope this helps.
Related Questions
-
How long does it take Google to index a website after an algorithm update?
I've noticed that the page ranks for some queries improved unexpectedly, without any action on my side. Is it possible that this improvement is connected with the Google algorithm update of Sep 4-5?
Algorithm Updates | AurigaPR
-
404s in Google Search Console and javascript
The end of April, we made the switch from HTTP to HTTPS, and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he's now attempting to read more JavaScript, and will occasionally try to parse out and "read" a URL in a string of JavaScript code where no URL is actually present. So I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc.
But they are also returning hundreds of otherwise correct URLs with a .html extension, when our CMS generates URLs with a .uts extension. Like this:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Liz Micik
Algorithm Updates | LizMicik
-
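If every phantom .html URL maps to a real .uts page, one stopgap is a single pattern redirect instead of per-URL fixes. A sketch in Apache mod_rewrite terms (assuming the site runs Apache and that every .html variant really has a .uts twin; both assumptions need verifying before deploying anything like this):

```apache
RewriteEngine On
# Redirect any /path.html request to /path.uts (301) so each
# phantom URL resolves instead of 404ing in Search Console.
RewriteRule ^(.+)\.html$ /$1.uts [L,R=301]
```

This doesn't stop Googlebot from inventing the URLs, but it keeps them from piling up as 404s while the underlying JavaScript parsing settles down.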
Issue with Category Ranking on Page 1 vs. Homepage Ranking on Page 2
A client has a high-volume keyword that shows different results depending on whether it is on page one or page two of the Google SERPs. If the keyword is on page one, ONLY the category page ranks. When the keyword bumps off to page two, BOTH the category page AND the homepage rank. This is happening on our IP and theirs, in incognito and personalized searches. It has been happening since February. Any thoughts/insights would be greatly appreciated, thank you!
Algorithm Updates | accpar
-
Best Moz article on landing pages?
From what I understand, building landing pages to link back to sites is a thing of the past. I am looking for a good article that explains current best landing-page practices (post-Panda and post-Penguin). Any suggestions?
Algorithm Updates | cschwartzel
-
How do I separate 2 Google+ business listings?
Ever since Google Places started merging with Google+, my client's business listing has been showing up incorrectly in local search results under the name of another business that shares the same address. Has anyone else encountered this problem, or found a way to correct it?
Algorithm Updates | TheeDigital
-
Google Dropped 3,000+ Pages Due to a 301 Redirect!! Freaking Out!!
We may be the only people stupid enough to accidentally prevent Googlebot from indexing our site. In our htaccess file, someone recently wrote the following statement:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny, because it was a rewrite that redirects back to itself. We found in Webmaster Tools that the site was not able to be indexed by Googlebot because it could not detect the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded; we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it.

Not knowing what the deal was, we went into Webmaster Tools, then Health, and selected "Fetch as Google" on the website. This was our way of manually requesting the site be re-indexed so we could see what was happening. After doing so, we clicked on Status, and it provided the following:

HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mysite.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close

<title>301 Moved Permanently</title> Moved Permanently. The document has moved here.

We changed the screwed-up rewrite in the htaccess file, but now our issue is that all of our pages have been severely penalized in terms of where they rank compared to just before the incident. We are essentially freaking out, because we don't know the real-time consequences of this, or how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9 and 40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on the drop in traffic we anticipate. This sucks because these are people's lives, but then again a business must make money, and if you sell less you have to cut overhead, and the easiest cut is payroll. I'm on a team of four that works to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects, if Google doesn't restore matters, could be significant.

My question is: what would you guys do? Is there any way we can contact Google about such a matter? If there is, I've never seen it. I'm sure the pages that are missing from the index will make their way back in, but what will their rank look like next time? And with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected? We would love to see things bounce back quickly, but I don't know what to expect, and neither do my counterparts. Thanks for any speculation, suggestions or insights of any kind!
Algorithm Updates | David_C
-
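For what it's worth, the usual loop-proof way to write a www canonicalization is to negate the condition, so the rule can never fire once the host is already the canonical form, no matter how the request host is spelled. A sketch, with mysite.com standing in for the real domain:

```apache
RewriteEngine On
# Redirect only when the host is NOT already the canonical www form;
# the negated condition guarantees the rule cannot redirect to itself.
RewriteCond %{HTTP_HOST} !^www\.mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
```

Note the escaped dots in the host pattern; an unescaped `.` matches any character, which is one way a rule like the one quoted above can match more hosts than intended.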
The Google/Yahoo Connection
I have been telling myself and clients for a while that you do not need to do special SEO for different search engines. While I stand by this (staunchly), I can't help but notice how SLOW Yahoo is to pick up my SEO updates and rank them compared to Google. Sometimes I see rank increases in Google within a day or two (or sooner), but Yahoo is still well behind in its caching and calculations.
Algorithm Updates | TheGrid
-
Does Google have the worst site usability?
Google tells us to make our sites better for our readers, which we are doing, but do you think Google itself has horrible site usability? For example, in Webmaster Tools I'm always being confused by their changes and the way they just drop things.
In the HTML Suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check. In the URL Removals area, they used to show you the URLs they had removed; now that is gone, and the only way you can check is to try adding one.
We don't have any URL parameters, so any parameters are the result of some other site tacking stuff onto the end of our URLs, and there is no way to tell Google that we have no parameters and it should ignore them all. Also, they add newly found parameters to the end of the list, so the only way to check is to click through to the end of the list.
Algorithm Updates | loopyal