Recovering from an index problem (Take two)
-
Hi all. This is my second pass at the problem. Thank you for your responses before, I think I'm narrowing it down!
Below is my original message. Afterwards, I've added some update info.
For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides.
Then disaster struck! The client added a link with a faulty parameter in the Joomla back end, which caused a bunch of duplicate content issues. Before this happened, all 19 of the site's pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13).
This shows the issue pretty clearly.
I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem remains.
UPDATE
OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks.
Any ideas would be much appreciated!
-
Thank you all, this is brilliant.
-
Your problem is with the robots.txt file. You are blocking the URL
thewilddeckcompany.co.uk/index.php?id=13
That URL 301 redirects to the correct URL of
http://thewilddeckcompany.co.uk/products/bird-hides
Google cannot "see" the 301 redirect from the old "bad" URLs to the new "good" URL.
You have to let Google crawl the old URLs and see the 301 redirects so that it knows how things need to forward.
I would do this for all the duplicate pages: make sure they 301 to the correct pages, and do not put the "bad" pages in robots.txt - otherwise the index will never be updated.
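To make that concrete, here is a minimal sketch of what the redirect might look like in an Apache .htaccess file. The URLs are the ones from this thread, but the rule itself is a generic example, not the site's actual Joomla configuration:

```apache
# Let Googlebot crawl the faulty URL and see the 301 -- do NOT also
# block it in robots.txt, or the redirect will never be discovered.
RewriteEngine On

# index.php?id=13  ->  /products/bird-hides (301, query string dropped)
RewriteCond %{QUERY_STRING} ^id=13$
RewriteRule ^index\.php$ /products/bird-hides? [R=301,L]
```

The trailing `?` on the target strips the old query string so the redirect lands cleanly on the canonical URL.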
Something separate to check: we have seen Google take a while to acknowledge some of our 301s. Go into your GWT and look at your duplicate title reports. You may see the old and new URLs showing as duplicates, even with the 301s in place. We had to set up a self-canonicalizing link on the "good" pages to help get that cleaned up.
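For reference, a self-canonicalizing link is just a tag in the head of the "good" page pointing at its own URL - a sketch:

```html
<!-- In the <head> of the canonical "good" page itself -->
<link rel="canonical" href="http://thewilddeckcompany.co.uk/products/bird-hides" />
```

This tells Google which URL is the preferred version whenever duplicates of the page get crawled.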
-
Blink-SEO
Jonathan is correct to try a Fetch as Google in WMT for the URLs you need re-indexed. (Note, that is not really the purpose of Fetch as Google, but sometimes it works.)
I would also resubmit the sitemap now that you have blocked the offending URL with robots.txt. It is likely the resubmission will help you the quickest, IMO.

Best,
Robert
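As a rough sketch, the sitemap entry for the page that needs re-indexing would look something like this (the lastmod value is a placeholder, not the site's real data):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://thewilddeckcompany.co.uk/products/bird-hides</loc>
    <lastmod>2013-03-01</lastmod>
  </url>
</urlset>
```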
-
It sounds like you just need to wait for Google to recrawl your robots.txt file. I saw this error in the SERPs:
www.thewilddeckcompany.co.uk/products/timber-water...
A description for this result is not available because of this site's robots.txt – learn more.

So it is clear that the robots.txt file has not been updated with the changes since the mistake was made. Try fetching as Googlebot within Webmaster Tools, but it may take a little time to update. At least it would seem that the robots.txt error is still the cause of the problem; you just need to wait a little longer.
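A quick way to sanity-check a robots.txt rule locally is Python's built-in urllib.robotparser. The Disallow line below is an assumption about what was added to block the faulty parameter, not the site's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- a rule like this would block the
# faulty ?id= URL but, as discussed above, also hide its 301 from Googlebot.
robots_txt = """\
User-agent: *
Disallow: /index.php?id=
"""

parser = RobotFileParser()
parser.modified()  # mark as "read" so can_fetch gives real answers
parser.parse(robots_txt.splitlines())

# The faulty duplicate URL is blocked from crawling...
print(parser.can_fetch("Googlebot", "/index.php?id=13"))      # False
# ...while the clean product URL remains crawlable.
print(parser.can_fetch("Googlebot", "/products/bird-hides"))  # True
```

Running this against your live file (via `parser.set_url(...)` and `parser.read()`) shows exactly which URLs Googlebot is allowed to reach once the cached copy refreshes.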
Related Questions
-
Two high ranking pages instantly dropped from index - no manual penalty notification
We are facing an issue where two of our major ranking pages have just completely disappeared from search results. This has happened in the last 24-48 hours and there have been no changes made to the site. From what we can tell, it has only impacted two pages (but two very important category pages). I have double- and triple-checked all standard indexing protocols - Search Console URL inspection says the pages are fine to crawl and index. The URLs have been requested for re-indexing but nothing has worked. This would lead me to believe it could be a manual action, yet there are no notifications in Search Console and we are listed as 'No issues detected' in all versions of our web property. Can anyone think what else could be the reason?
Intermediate & Advanced SEO | Vuly
-
Why is a canonicalized URL still in index?
Hi Mozers,

We recently canonicalized a few thousand URLs, but when I search for these pages using the site: operator I can see that they are all still in Google's index. Why is that? Is it reasonable to expect that they would be taken out of the index? Or should we only expect that they won't rank as high as the canonical URLs? Thanks!
Intermediate & Advanced SEO | yaelslater
-
Website Does not index in any page?
I created a website, www.astrologersktantrik.com, 4 days ago and fetched it with Google, but my website is still not indexed. The keywords I use have low competition, yet my website does not appear for any of them.
Intermediate & Advanced SEO | ramansaab
-
HTTPS pages - To meta no-index or not to meta no-index?
I am working on a client's site at the moment and I noticed that both HTTP and HTTPS versions of certain pages are indexed by Google, and both show in the SERPs when you search for the content of these pages. I just wanted to get various opinions on whether HTTPS pages should have a meta no-index tag added via an .htaccess rule, or whether they should be left as is.
Intermediate & Advanced SEO | Jamie.Stevens
-
Problem with Google reading https homepage?
Hi Moz Community,

In July, we changed our homepage to https via a 301 redirect from http (the only page on our site with https). Our homepage receives an A grade in the ‘On Page Grader’ by Moz for our desired keyword. We have increased our backlink efforts directly to our homepage since we switched to the SSL homepage. However, we still have not increased in search ranking for our specific keyword. Is there something we could have missed when doing the 301 redirect (submitting a new sitemap, changing robots.txt files, or anything else?) that has resulted in Google not correctly accessing the https version? (The https page has been indexed by Google.) Any help would be greatly appreciated.
Intermediate & Advanced SEO | G.Anderson
-
XML Sitemap Indexation Rate Decrease
On September 28th, 2013 I saw my indexation rate decrease on the XML sitemap that I've submitted through GWT. I've since scraped my sitemap and removed all 404 and 400 errors (which only made up ~5% of the entire sitemap). Any idea why Google randomly started indexing less of my XML sitemap on that date? I updated my sitemap 2 weeks before that date and had an indexation rate of ~85% - now I'm below 35%. Thoughts, ideas, experiences? Thanks!
Intermediate & Advanced SEO | RobbieWilliams
-
Does anyone have a clue about my search problem?
After three years of destruction, my site still has a problem - or maybe more than one. OK, I understand I had - and probably still have - a Panda problem. The question is - does anyone know how to fix it, without destroying eveything? If I had money, I'd gladly give it up to fix this, but all I have is me, a small dedicated promotions team, 120,000+ visitors per month and the ability to write, edit and proofread. This is not an easy problem to fix. After completing more than 100 projects, I still haven't got it right, in fact, what I've done over the past 2 months has only made things worse - and I never thought I could do that. Everything has been measured, so as not to destroy our remaining ability to generate income, because without that, its the end of the line. If you can help me fix this, I will do anything for you in return - as long as it is legal, ethical and won't destroy my reputation or hurt others. Unless you are a master jedi guru, and I hope you are, this will NOT be easy, but it will prove that you really are a master, jedi, guru and time lord, and I will tell the world and generate leads for you. I've been doing website and SEO stuff since 1996 and I've always been able to solve problems and fix anything I needed to work on. This has me beaten. So my question is: is there anyone here willing to take a shot at helping me fix this, without the usual response of "change domains" "Delete everything and start over" or "you're screwed" Of course, it is possible that there is a different problem, nothing to do with algorithms, a hard-coded bias or some penalizing setting, that I don't know about, a single needle in a haystack. This problem results in a few visible things. 1. Some pages are buried in supplemental results 2. Search bots pick up new stories within minutes, but they show up in search results many hours later Here is the site: http://shar.es/EGaAC On request, I can provide a list of all the things we've done or tried. 
(actually I have to finish writing it) Some Notes: There is no manual spam penalty. All outgoing links are nofollow, and have been for 2 years. We never paid for incoming links. We did sell text advertising links 3-4 years ago, using text-link-ads.com, but removed them all 2 1/2 years ago. We did receive payment for some stories, 3-4 years ago, but all have been removed. One more thing. I don't write much - I'm a better editor than a writer, but I wrote a story that had 1 million readers. the massive percentage of 0.0016% came from you-know-who. Yes, 16 visitors. And this was an exclusive, unique story. And there was a similar story, with half a million readers. same result. Seems like there might be a problem!
Intermediate & Advanced SEO | loopyal
-
Recovering from a site migration
Hi. I've been working on http://www.alwayshobbies.com/ for a number of months. All was fine, but then we had a site migration which involved a huge number of redirects. There have been a couple of similar moves in the past. As a result, rankings have plummeted. To resolve this, we're considering letting all the old pages 404 by turning off the redirects, and removing all links to them where we can. Some key pages could have canonicals added, but basically we're looking to purge as much as possible. Does this sound like a reasonable tactic?
Intermediate & Advanced SEO | neooptic