Recovering from index problem (Take two)
-
Hi all. This is my second pass at the problem. Thank you for your responses before; I think I'm narrowing it down!
Below is my original message. Afterwards, I've added some update info.
For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides.
Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate-content issues. Before this happened, all of the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13).
This shows the issue pretty clearly.
I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there.
UPDATE
OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks.
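For reference, the robots.txt block I added looks something like this (illustrative, not my exact file - the pattern blocks any index.php URL with an id parameter):

```
User-agent: *
Disallow: /index.php?id=
```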
Any ideas would be much appreciated!
-
Thank you all, this is brilliant.
-
Your problem is with the robots.txt file. You are blocking the URL
thewilddeckcompany.co.uk/index.php?id=13
That URL 301 redirects to the correct URL of
http://thewilddeckcompany.co.uk/products/bird-hides
Google cannot "see" the 301 redirect from the old "bad" URLs to the new "good" URL.
You have to let Google crawl the old URLs and see the 301 redirects so that it knows how things need to forward.
I would do this for all the duplicate pages: make sure they 301 to the correct pages, and do not put the "bad" pages in robots.txt; otherwise the indexing will not be updated.
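If the site is on Apache, the 301 from the faulty parameter URL to the clean URL can be set up along these lines (a sketch only; Joomla setups vary, and the id value is taken from the URL above):

```apache
RewriteEngine On
# Match index.php requests whose query string is exactly id=13
RewriteCond %{QUERY_STRING} ^id=13$
# 301 to the clean URL; the trailing ? drops the old query string
RewriteRule ^index\.php$ http://thewilddeckcompany.co.uk/products/bird-hides? [R=301,L]
```

The key point is that the rule must stay in place, and the old URL must remain crawlable, until Google has seen the redirect.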
Something separate to check: we have seen Google take a while to acknowledge some of our 301s. Go into your GWT and look at your duplicate title reports. You may see the old and new URLs showing as duplicates, even with the 301s in place. We had to set up a self-canonicalizing link on the "good" pages to help get that cleaned up.
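A self-canonicalizing link is just a canonical tag in the page's head pointing at the page's own preferred URL, e.g. for the bird hides page:

```html
<link rel="canonical" href="http://thewilddeckcompany.co.uk/products/bird-hides" />
```

This tells Google which URL is the preferred version even if it still has duplicate parameter URLs in its index.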
-
Blink-SEO
Jonathan is correct: try a Fetch as Google in WMT for the URLs you need re-indexed. (Note: that is not really the purpose of Fetch as Google, but sometimes it works.)
I would also resubmit the sitemap now that you have blocked the offending URL with robots.txt. It is likely the resubmission will help you the quickest, IMO.
Best,
Robert
-
It sounds like you just need to wait for Google to recrawl your robots.txt file. I saw this error in the SERPs:
www.thewilddeckcompany.co.uk/products/timber-water...
A description for this result is not available because of this site's robots.txt – learn more.
So it is clear that Google has not yet picked up the changes made to the robots.txt file after the mistake. Try fetching as Googlebot within Webmaster Tools, but it may take a little time to update. At least it would seem that the robots.txt error is still the cause of the problem; you just need to wait a little longer.