Over 1,000 pages de-indexed overnight
-
Hello,
On my site (www.bridgman.co.uk) we had a lot of duplicate page issues, as reported by the SEOmoz site crawl tool; this was due to database-driven URL strings. As a result, I sent an Excel file with all the duplicate pages to my web developer, who put rel=canonical tags on what I assumed would be all the correct pages.
I am not sure if this is a coincidence or a direct result of the canonical tags, but a few days afterwards (yesterday) the number of pages indexed by Google dropped from 1,200 to under 200.
The number is still declining, and other than the canonical tags I can't work out why Google would just start de-indexing most of our pages.
If you could offer any solutions that would be greatly appreciated.
Thanks, Robert.
-
Canonical tags and duplicate content are both interesting issues, thanks for your answers!
-
The first question I would ask myself as a website owner is: how many pages have duplicate content on them? Maybe try going the manual way and check for yourself; there will be pages you do not know about. Use a tool like Xenu's Link Sleuth to extract all the links on your website (i.e. every page reachable from at least one link on your site).
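If you'd rather script the link extraction than install a tool, the same idea can be sketched with Python's standard library (the HTML below is a made-up stand-in for a fetched page; a real crawl would download each URL and repeat this per page):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page markup standing in for a real fetch.
page = '<a href="/products.asp">Products</a> <a href="/about.asp">About</a>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/products.asp', '/about.asp']
```

A full crawler would keep a queue of not-yet-visited URLs and feed each fetched page through the parser, which is essentially what Link Sleuth does for you.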
It may also be that Google is adjusting its index across the board. It's worth keeping track of whether your competitors are losing indexed pages at the same rate, or more slowly.
-
I would first remove the canonicals (or, better, implement them correctly) and wait for a re-crawl of the pages. If that doesn't help, re-submit them.
Also, problems with canonicals do show up in the page grader app, so when your admin re-writes the canonicals, test the pages before Google sees them. :)
-
It appears so.
I've asked my webmaster to remove all of the canonical tags, but after reading that article I feel that a lot of the damage has already been done!
Looks like I'll have to submit a reconsideration request and beg Google for forgiveness.
-
Hmm, I'd remove them.
The fact that all your individual product pages have a rel=canonical of productsdetail.asp, and all category-level pages point to products.asp, is telling Google that all your product pages are the same page!
The problem is that there are a lot of parameters in your URLs; let me go and have a look at how to deal with that when using the canonical tag.
EDIT: Spent half my lunch on this but couldn't find much.
My guess is that you can get away with parameters in the canonical tag, so decide which is the official URL and put that in the header for that particular piece of content. The only thing I really found is this: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
-
Could you be in a canonical loop?
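To make the idea concrete: if you map each URL to its canonical target, you can walk the chain and see whether it ever comes back on itself. A small sketch with invented URLs:

```python
def find_canonical_loop(start, canonical_of):
    """Follow rel=canonical targets from `start`; return the cycle if one exists."""
    seen = []
    url = start
    while url in canonical_of and url not in seen:
        seen.append(url)
        url = canonical_of[url]
    if url in seen:
        # Chain revisited a URL: return the loop, closed back on itself.
        return seen[seen.index(url):] + [url]
    return None  # chain terminates cleanly, no loop

# Hypothetical site where two pages canonicalise to each other.
chain = {
    "/a.asp": "/b.asp",
    "/b.asp": "/a.asp",
}
print(find_canonical_loop("/a.asp", chain))  # ['/a.asp', '/b.asp', '/a.asp']
```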
-
I don't think this is the result of an algo change. I've just had a look at the crawl diagnostics, and it appears that he has set the tag value of the homepage (www.bridgman.co.uk/, where all the backlinks are pointed) to www.bridgman.co.uk/default.asp!!
This has also been done to all of our product pages!! I.e. all of the duplicated product pages now have the tag value www.bridgman.co.uk/productdetail.asp (a page which doesn't even exist on our site!!!)
Now I may be stating the obvious, but I guess this is the problem?
He did this 8 days ago. Where should I go from here? Ask him to remove all the rel=canonical tags and pray we bounce back...?
-
First question would be do you think this may have had an effect on you? - http://searchengineland.com/google-forecloses-on-content-farms-with-farmer-algorithm-update-66071
Is all your content unique? Do your links come from sites that may have been affected?
If not, then I'd give it a few days to see if they come back (as HR128 says, it may just be Google working out what's changed and what to do). Meanwhile, I'm doing a quick crawl of your site to see what I can see.
-
It could take a couple of days before you see results; most likely Google has to step back and reassess. Or your web developer has no clue what he's doing, haha.
Related Questions
-
Site Not Being Indexed
Hey Everyone - I have a site that is being treated strangely by Google (at least, strange to me). The site has 24 pages in the sitemap, submitted to WMT over 30 days ago. I've manually triggered Google to crawl the homepage and all connecting links, and submitted a couple individually. Google has parked the indexing at 14 of the 24 pages. None of the unindexed URLs have noindex or nofollow tags on them; they are clearly and easily linked to from other places on the site. The site is a brand-new domain, has no manual penalty history, and in my research has no reason to be considered spammy. 100% unique, handwritten content. I cannot figure out why Google isn't indexing these pages. Has anyone encountered this before? Know any solutions? Thanks in advance.
Technical SEO | CRO_first
-
Need to de-index certain pages fast
I need to de-index certain pages as fast as possible. These pages are already indexed. What is the fastest way to do this? I added the noindex meta tag and ran a few of the pages through Search Console/Webmaster Tools (Fetch as Google) earlier today, but nothing has changed yet. The Fetch as Google service does see the noindex tag, but it hasn't changed the SERPs yet. I know I should be patient, but if there is a faster way to get Google to de-index these pages, I want to try it. I am also considering the removal tool, but I'm unsure whether that is risky to do. And even if it's not, I understand it's not a permanent solution anyway. What should I do?
Technical SEO | WebGain
-
Why are only a few of our pages being indexed
Recently rebuilt a site for an auctioneers; however, it has a problem in that none of the lots and auctions are being indexed by Google on the new site, only pages like About, FAQ, home and contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so there's no crawling issue that stands out. I've set the "URL Parameters" too, to no effect. I also built a sitemap with all the lots in and pushed it to Google, which then crawled them all (massive spike in crawl rate for a couple of days), yet it is still indexing just a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | Blue-shark
-
How to create a sitemap for a large site (ecommerce type) that has 1,000s if not 100,000s of pages
I know this is kind of a newbie question but I am having an amazing amount of trouble creating a sitemap for our site Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a site map. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
Technical SEO | BestRide
-
Page Content
Our site is a home-to-home moving listing portal. Consumers who want to move their home fill in a form so that moving companies can quote prices. We were generating listing page URLs from the title submitted by the customer. Unfortunately, we have since realised that many customers entered exactly the same title for their listings, which left us with hundreds of similar page titles. We have corrected all the pages which had duplicate meta tags and page title tags, and we have also added controls to our software to prevent generating duplicate page titles or meta tags. But the page content quality is also not very good, because the page content is added by the customer (example: http://www.enakliyat.com.tr/detaylar/evden-eve--6001). What should I do? Please help me.
Technical SEO | iskq
-
Determining When to Break a Page Into Multiple Pages?
Suppose you have a page on your site that is a couple of thousand words long. How would you determine when to split the page into two, and are there any SEO advantages to doing this, like being more focused on a specific topic? I noticed the Beginner's Guide to SEO is split into several pages, although it would concentrate the link juice if it were all on one page. Suppose you have a lot of comments. Is it better to move comments to a second page at a certain point? Sometimes the comments are not as focused on the topic of the page as the main text.
Technical SEO | ProjectLabs
-
Duplicate Page Content and Title for product pages. Is there a way to fix it?
We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. Title: Masterpet | New Zealand Products (MasterPet product page 1, MasterPet product page 2). Because the list of products is displayed across several pages, the crawler detects that these two URLs have the same title. From 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or even bother fixing? Will our SERP ranking suffer because of this? Hoping someone could shed some light on this issue. Thanks.
Technical SEO | Peter.Huxley59
-
Getting Google to index new pages
I have a site, called SiteB that has 200 pages of new, unique content. I made a table of contents (TOC) page on SiteB that points to about 50 pages of SiteB content. I would like to get SiteB's TOC page crawled and indexed by Google, as well as all the pages it points to. I submitted the TOC to Pingler 24 hours ago and from the logs I see the Googlebot visited the TOC page but it did not crawl any of the 50 pages that are linked to from the TOC. I do not have a robots.txt file on SiteB. There are no robot meta tags (nofollow, noindex). There are no 'rel=nofollow' attributes on the links. Why would Google crawl the TOC (when I Pinglered it) but not crawl any of the links on that page? One other fact, and I don't know if this matters, but SiteB lives on a subdomain and the URLs contain numbers, like this: http://subdomain.domain.com/category/34404 Yes, I know that the number part is suboptimal from an SEO point of view. I'm working on that, too. But first wanted to figure out why Google isn't crawling the TOC. The site is new and so hasn't been penalized by Google. Thanks for any ideas...
Technical SEO | scanlin