Duplicate site (disaster recovery) being crawled and creating two indexed search results
-
I have a primary domain, toptable.co.uk, and a disaster recovery site for this primary domain named uk-www.gtm.opentable.com. In the event of a disaster, toptable.co.uk would get CNAMEd (DNS alias) to the .gtm site. Naturally the .gtm disaster recovery domain is an exact mirror of the toptable.co.uk domain.
Unfortunately, Google has crawled the uk-www.gtm.opentable.com site, and it's showing up in search results. In most cases the .gtm URLs don't get redirected to toptable; they appear to users as an entirely separate domain. The strong feeling is that this duplicate content is hurting toptable.co.uk, especially as the .gtm sub-domain is part of the .opentable.com domain, which has significant authority. So we need a way of stopping Google from crawling and indexing the .gtm site.
There seem to be two potential fixes. Which is best for this case?
1) Use robots.txt to block Google from crawling the .gtm site
2) Canonicalize the .gtm URLs to toptable.co.uk
In general Google seems to recommend the canonical approach, but in this special case it seems a robots.txt change could be best.
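For illustration, a minimal sketch of what the robots.txt option might look like - a blanket block served only from the uk-www.gtm.opentable.com host (the directives below are illustrative, not something already in place):

User-agent: *
Disallow: /

If both hostnames end up pointing at the same server, this file should only be returned when the Host header is the .gtm sub-domain, so a failover doesn't accidentally block toptable.co.uk as well.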
Thanks in advance to the SEOmoz community!
-
It's a little tricky. While Andrea is right that robots.txt isn't great for removal once pages/domains are already indexed, you can block the sub-domain with robots.txt and then request removal in Google Webmaster Tools (you'll need to create a separate account for the sub-domain itself). That's often the fastest way to remove something from the index, and if the sub-domain has no search value, I might go that route. Just proceed with caution - it's a delicate procedure.
Doing 1-to-1 canonicalization or adding 301 redirects may be the next strongest signal (NOINDEX is a bit weaker, IMO). However, Google will have to re-crawl the sub-domain to pick those signals up, so you'll need to keep the paths open rather than blocking them in robots.txt.
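As a rough sketch of the 301 option - assuming the backup site runs on Apache and that www.toptable.co.uk over http is the target (server type, hostname and protocol are all assumptions here):

RewriteEngine On
# Redirect only when the request arrives on the disaster-recovery hostname
RewriteCond %{HTTP_HOST} ^uk-www\.gtm\.opentable\.com$ [NC]
RewriteRule ^(.*)$ http://www.toptable.co.uk/$1 [R=301,L]

The host condition is what keeps this failover-safe: during a disaster, requests hit the same server with a toptable.co.uk Host header, the condition doesn't match, and the backup copy is served normally instead of redirecting in a loop.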
-
First, if the pages are already indexed then a robots.txt block won't make them go away. A meta noindex tag on the pages is the better solution. This allows search engines to "read" your page, see the noindex tag, and then work to remove the pages from the index. A robots.txt file doesn't necessarily accomplish the same result.
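For clarity, that tag is just the standard robots meta tag placed in the <head> of every page on the backup sub-domain - a minimal sketch:

<meta name="robots" content="noindex, follow" />

noindex asks engines to drop the page from the index once they re-crawl it; follow leaves the links on the page crawlable.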
-
If you can do a 1-to-1 page canonicalization (each page on the .gtm sub-domain is canonicalized to the equivalent page on toptable.co.uk) then I would do that.
Otherwise, I would noindex the backup site.
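A minimal sketch of the 1-to-1 canonical, using a hypothetical path, would be a link element in the <head> of each backup page:

<!-- on http://uk-www.gtm.opentable.com/london-restaurants -->
<link rel="canonical" href="http://www.toptable.co.uk/london-restaurants" />

As with 301s, this only works if Google can still crawl the sub-domain, so it can't be combined with a robots.txt block.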