Site De-Indexed except for Homepage
-
Hi Mozzers,
Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the attached screenshot showing this:
- 7 Sept 2014 - 76 pages indexed in Google Webmaster Tools
- 28 Sept until current - 3-4 pages indexed in Google Webmaster Tools including homepage and sitemaps.
Site is: (removed)
As a result, all rankings for child pages have also disappeared in the Moz Pro Rankings Tracker. Only the homepage is still indexed and ranking.
It seems like a technical issue is blocking the site. I checked robots.txt, noindex and nofollow tags, and canonicals, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools.
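For anyone running the same checks, here is a minimal sketch (illustrative only, not the poster's actual process) of scanning an already-fetched page's headers and HTML for the common de-indexing signals mentioned above. The function name and the example.com URLs are placeholders:

```python
import re

def deindex_signals(headers, html, page_url):
    """Return a list of reasons a page might be excluded from Google's index.

    headers: dict of HTTP response headers; html: page source; page_url: URL fetched.
    """
    reasons = []
    # An X-Robots-Tag response header can carry noindex even when the HTML looks clean.
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        reasons.append("X-Robots-Tag header contains noindex")
    # A robots meta tag with noindex removes the page from the index.
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
                     html, re.I)
    if meta and "noindex" in meta.group(1).lower():
        reasons.append("robots meta tag contains noindex")
    # A canonical pointing at a different URL asks Google to index that URL instead.
    canon = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
                      html, re.I)
    if canon and canon.group(1).rstrip("/").lower() != page_url.rstrip("/").lower():
        reasons.append("canonical points elsewhere: " + canon.group(1))
    return reasons
```

Note this only covers on-page signals; robots.txt blocks and server-level redirects (like the www/non-www issue below) have to be checked separately.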
Some recent issues were that we moved from Shared to Dedicated Server around 7 Sept (using same host and location). Prior to the move our preferred domain was www.domain.com WITH www. However during the move, they set our domain as domain.tld WITHOUT the www. Running a site:domain.tld vs site:www.domain.tld command now finds pages indexed under non-www version, but no longer as www. version. Could this be a cause of de-indexing? Yesterday we had our host reset the domain to use www. again and we resubmitted our sitemap, but there is no change yet to the indexing.
What else could be wrong?
Any suggestions appreciated. Thanks.
-
Resolved!
Thanks for your replies everyone.
The strange thing was that even though the www version of pages did seem to 301 to the non-www version (I checked that the headers were indeed 301s), all our pages had disappeared from Google's index and rankings too (except the homepage).
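The header check described above amounts to confirming that the redirect is a permanent 301 and lands on the expected host. A small illustrative helper for classifying an observed redirect (function name and example.com host are placeholders, applied to status/Location values fetched with curl or similar):

```python
from urllib.parse import urlparse

def check_www_redirect(status, location, expected_host="www.example.com"):
    """Classify a redirect observed on the non-www URL (illustrative sketch)."""
    if status not in (301, 302, 307, 308):
        return "no redirect"
    host = urlparse(location).netloc
    if host != expected_host:
        return "redirects to unexpected host: " + host
    # Only a 301/308 signals a permanent move; a 302/307 is temporary
    # and can leave the old URL in the index.
    return "permanent" if status in (301, 308) else "temporary"
```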
The resolution came after we had our host reset the domain on the server to the original www version. Within days of that change, all our de-indexed pages (the whole site) jumped back to their original ranking positions in Google under the www version and were re-indexed as if nothing had happened.
Hope this helps someone else.
-
Hi Emerald
Add both the www and non-www versions to Webmaster Tools.
Pages like http://www.toursistanbul.com/bosphoruscruises.htm are set to noindex.
There are 30+ pages indexed in Google right now.
It's a mix of www and non-www.
Webmaster Tools treats these as different sites, so you will see a drop.
Go to your Webmaster Tools and set a preferred version (which is www.).
If you plan to move to https, then you also need to add both https versions (www and non-www).
BTW, be sure your analytics is using the new code as well.
Good luck!
-
There is nothing more you can do now. You made a mistake and fixed it. Since then you have submitted a sitemap, "fetched" the site, and redirected non-www traffic to www in your .htaccess... There is no other way to speed the process up. Just sit and wait for the Google crawler to fully re-crawl the site, and the number of indexed pages will come back to what it was.
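For reference, the non-www to www redirect mentioned above is typically a mod_rewrite rule like the following in .htaccess (a generic sketch; example.com stands in for the real domain):

```apache
# Illustrative sketch: 301 any non-www request to the www host.
# Replace example.com with the real domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag matters here: a permanent redirect tells Google to consolidate signals on the www version rather than keep both indexed.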
You said all rankings disappeared in Moz Tracker, but what about the actual rankings in Google search results? Have you checked that? What are WMT and GA saying about your rankings/traffic?
My gut feeling is that your pages are still ranking as they were, but since your WMT was still set to show data for the www domain, you weren't seeing any... am I correct?
-
Hi,
Without digging in detail, it is always awkward to suggest possible issues - simply because there are so many possibilities.
That said, what you mentioned about the move could cause a temporary drop while Google corrects the indexation of the site. The site has gone from www.site.com to http://site.com and now back to www.site.com again. That is a lot of movement for Google to try and make sense of.
I am also guessing that, because this was not a planned move from www to non-www, no 301s were implemented, which means Google would effectively see the site as having gone away and now returned.
I would assume that the site will recover, but this can take time.
Use Webmaster Tools to 'Fetch as Google' at the root of the site and check that the site is reachable again.
-Andy