Canonical meta tag or simply robots.txt for other domain names with the same content?
-
Hi,
I'm working with a new client who has a main product website.
This client has representatives who also sell the same products, but each of those reps has a copy of the same website on a different domain name.
The best option would probably be to shut down the duplicate websites and 301-redirect them to the main one, but in the client's mind that's impossible.
First choice: implement a canonical meta tag on every URL of all the other domain names.
Second choice: a robots.txt that disallows crawling of all the other websites.
Third choice: I'm really open to other suggestions.
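For reference, the first choice would look something like this in the `<head>` of every page on a rep's copy, pointing at the matching URL on the main site (domain names here are hypothetical):

```html
<!-- On reps-copy.com/product-x, declaring the main site as canonical -->
<link rel="canonical" href="https://main-site.com/product-x">
```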
Thank you very much!
-
Thanks Mike!
In your opinion, would a full set of 301 redirects be clearly better?
-
Okay, so no 301s... Well then, assuming all of those duplicate sites have been live for a while and have been crawled by the spiders, you might be better served with canonical tags.
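If you go the canonical-tag route across 50+ rep domains, it's worth auditing that each page actually declares the main site as canonical. A minimal sketch using only the Python standard library (the sample markup and URL are hypothetical):

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attr_map = dict(attrs)
            if attr_map.get("rel", "").lower() == "canonical":
                self.canonical = attr_map.get("href")


def extract_canonical(html: str):
    """Return the canonical URL declared in a page, or None if absent."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical


sample = '<html><head><link rel="canonical" href="https://mainsite.com/widget"></head></html>'
print(extract_canonical(sample))  # https://mainsite.com/widget
```

You could fetch each rep page, run it through `extract_canonical`, and flag any page whose canonical does not point at the main domain.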
Related Questions
-
When creating a sub-domain, does that sub-domain automatically start with the DA of the main domain?
We have a website with a high DA and we are considering a sub-folder vs. a sub-domain. One of the great benefits of a sub-folder is that we know we get to keep the high DA; is this also the case for sub-domains? Also, if you could provide any sources of information that confirm this, that would be great. I can't seem to find anything!
Intermediate & Advanced SEO | Saba.Elahi.M.0
-
Self referencing canonicals and paginated content - advice needed
Hi, I help manage a large site that uses a lot of params for tracking, testing and to help deal with paginated content, e.g. abc.com/productreview?page=2. The paginated review content correctly uses rel next and rel prev tags to ensure we get the value of all of the paginated review content that we have.

The volume of param exclusions I need to maintain in Google & Bing Webmaster Tools is getting clunky and frustrating, so I would like to use self-referencing canonicals, which would make life a lot easier. Here's my issue: if I use canonicals on the review pages, the paginated content URLs would also use the same canonical, e.g. /productreview?page=2 pointing to /productreview. I believe I am going to lose the value of those reviews, even though they use the rel next / rel prev tags.

BTW, Airbnb do this (see http://d.pr/i/14mPU). Do they know something I don't, do they not care about the paginated reviews, or are they doing it incorrectly?

Is my assertion above correct that I would lose the value of the paginated reviews if I use self-referencing canonicals? Any thoughts on a solution to clearing up the param problem, or do I have to live with it? Thanks in advance, Andy
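One way to frame the usual answer: keep the canonical self-referencing *including* the page param (so /productreview?page=2 canonicalises to itself, not to page 1), alongside the rel=prev/next tags. A sketch of the head tags that pattern would produce; the function name and URLs are illustrative, not from the question:

```python
def pagination_links(base_url: str, page: int, last_page: int) -> list[str]:
    """Head tags for one page of paginated content: a self-referencing
    canonical that keeps the page param, plus rel=prev/next pointers,
    so each page stays indexable in its own right."""
    self_url = base_url if page == 1 else f"{base_url}?page={page}"
    links = [f'<link rel="canonical" href="{self_url}">']
    if page > 1:
        # Page 2's "prev" is the clean base URL, not ?page=1.
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        links.append(f'<link rel="prev" href="{prev_url}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return links


for tag in pagination_links("https://abc.com/productreview", 2, 5):
    print(tag)
```

Pointing every paginated URL's canonical at page 1 (the Airbnb screenshot) is the variant that risks dropping the deeper pages from consideration.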
Intermediate & Advanced SEO | AndyMacLean0
-
Block in robots.txt instead of using canonical?
When I use a canonical tag for pages that are variations of the same page, it basically means that I don't want Google to index this page. But at the same time, spiders will go ahead and crawl the page. Isn't this a waste of my crawl budget? Wouldn't it be better to just disallow the page in robots.txt and let Google focus on crawling the pages that I do want indexed? In other words, why should I ever use rel=canonical as opposed to simply disallowing in robots.txt?
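One trade-off worth noting here: a robots.txt disallow stops crawling but not indexing (a blocked URL can still be indexed from external links, and a canonical tag on a blocked page will never be seen by the crawler), while rel=canonical costs the crawl but consolidates link signals onto the preferred URL. The disallow version would look like this (the path is hypothetical):

```
# robots.txt on the site root
User-agent: *
Disallow: /product-variant-pages/
```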
Intermediate & Advanced SEO | YairSpolter0
-
301 many smaller domains to a new, large domain
Hi all, I have a question regarding permanently redirecting many small websites into one large new one.

During the past 9 years I have created many small websites, each focusing on hotel reservations in one specific city. This has served me beautifully in the past, but I have come to the conclusion that it is no longer a sustainable model, and therefore I am in the process of creating one large, worldwide hotel reservations website. To not lose any benefit of my hard work over the past 9 years, I want to permanently redirect the smaller websites to the correct section of my new website. I know that if it were only a few websites, this strategy would be perfectly acceptable, but since I am talking about 50 to 100 websites, I am not so sure and would like to have your input.

Here is what I would like to do (the domain names are not mine, just an example):

Old website: londonhotels.com 301 to newdomain.com/london/
Old website: berlinhotels.com 301 to newdomain.com/berlin/
Old website: amsterdamhotels.com 301 to newdomain.com/amsterdam/
Etc., etc.

My plan is to do this for 50 to 100 websites, and I would like your thoughts on whether this is an acceptable strategy. Just to be clear, I am talking about redirecting only my websites that are in good standing, i.e. none of the websites I am thinking about 301'ing have been penalized. Thanks for your thoughts on this.
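For what it's worth, the per-domain mapping above can be expressed at the web-server level with one small block per old domain. A sketch assuming Apache (domains hypothetical, as in the question); note that `Redirect 301` is a prefix match, so deep URLs get their path appended to the target:

```apache
<VirtualHost *:80>
    ServerName londonhotels.com
    ServerAlias www.londonhotels.com
    # Prefix redirect: londonhotels.com/some-hotel lands on
    # newdomain.com/london/some-hotel — only useful if the new
    # section mirrors the old paths; otherwise map pages explicitly.
    Redirect 301 / https://newdomain.com/london/
</VirtualHost>
```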
Intermediate & Advanced SEO | tfbpa0
-
Multiple 301 Redirects on the same domain name
Hi, I'd appreciate some advice on the below.

I have a website, say www.site.co.uk, that has just been redesigned using a new CMS. Previously it had URLs in the format /article.php?id=123; the new site has friendlier URLs in the format /articles/article-slug. I have been able to import the old articles into my CMS using the same article IDs, and I have created a unique slug for each post. So now in my database I have the article ID (from the query string) and a slug. However, I have hundreds of old URLs indexed by Google in the format /article.php?id=123 and need to redirect these.

My plan was to do the following: 301 redirect /article.php?id=123 to an intermediate page, in this case /redirect/123. On this intermediate page I would do a database lookup for the article slug based on the ID from the query string, create the new URL, and perform a second 301 redirect to it, e.g. /articles/article-slug-from-database.

Whilst this works and keeps the site usable for visitors, the two 301 redirects do worry me, as I don't want Google indexing lots of /redirect/[article id] URLs. The other solution is to generate hundreds of htaccess redirect rules that map each old URL to its new URL. The first solution is much cleaner, but the two 301s worry me. Will Google work this out on its own, or is there a better way? Any advice is much appreciated. Cheers, Rob
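Since the id-to-slug mapping already lives in the database, one option is to script the "hundreds of htaccess rules" rather than hand-write them, keeping every redirect to a single hop. A rough sketch (function name and sample data hypothetical) that emits mod_rewrite rules matching on the query string:

```python
def build_redirect_rules(id_to_slug: dict[int, str]) -> str:
    """Emit .htaccess mod_rewrite rules that 301 /article.php?id=N
    straight to /articles/<slug> in one hop, so no intermediate
    /redirect/N URLs ever exist for Google to index."""
    lines = ["RewriteEngine On"]
    for article_id, slug in sorted(id_to_slug.items()):
        # RewriteRule patterns never see the query string, so the id
        # has to be matched with a RewriteCond on %{QUERY_STRING}.
        lines.append(f"RewriteCond %{{QUERY_STRING}} ^id={article_id}$")
        # The trailing "?" strips the old query string from the target.
        lines.append(f"RewriteRule ^article\\.php$ /articles/{slug}? [R=301,L]")
    return "\n".join(lines)


print(build_redirect_rules({123: "my-first-article", 456: "another-post"}))
```

Regenerate the file whenever articles are added, or keep it as a one-off migration artifact; either way each old URL passes its signals to the new one directly.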
Intermediate & Advanced SEO | AmyCrompton1
-
Disallow my store in robots.txt?
Should I disallow my store directory in robots.txt? Here is the URL: https://www.stdtime.com/store/

Here are my reasons for suggesting this:

- SEOMOZ finds crawl "errors" in there that I don't care about
- I don't think I care if the search engines index those pages
- I only have one product, and it is not an impulse buy
- My product has a 60 day sales cycle, so price is less important than features
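If you do go that route, the rule itself is a one-liner (path taken from the question); just bear in mind a disallowed URL can still show up in results, title-less, if other sites link to it:

```
# robots.txt at https://www.stdtime.com/robots.txt
User-agent: *
Disallow: /store/
```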
Intermediate & Advanced SEO | raywhite0
-
How long will Google take to read my robots.txt after updating?
I updated www.egrecia.es/robots.txt two weeks ago and I still haven't resolved the Duplicate Title and Content issues on the website. The Google SERP doesn't show those URLs any more, but neither SEOMOZ Crawl Errors nor Google Webmaster Tools recognizes the change. How long will it take?
Intermediate & Advanced SEO | Tintanus0
-
301 Redirect or Canonical Tag or Leave Them Alone? Different Pages - Similar Content
We currently have 3 different versions of our State Business-for-Sale listings pages. The versions are:

**Version 1 (preferred):** http://www.businessbroker.net/State/California-Businesses_For_Sale.aspx
Title = California Business for Sale Ads - California Businesses for Sale & Business Brokers - Sell a Business on Business Broker

**Version 2:** http://www.businessbroker.net/Businesses_For_Sale-State-California.aspx
Title = California Business for Sale | 3124 California Businesses for Sale | BusinessBroker.net

**Version 3:** http://www.businessbroker.net/listings/business_for_sale_california.ihtml
Title = California Businesses for Sale at BusinessBroker.net - California Business for Sale

While the page titles and meta data are a bit different, the bulk of the page content (the listings rendered) is identical. We were wondering if it would make good sense to either (A) 301 redirect Versions 2 and 3 to the preferred Version 1 page, or (B) put canonical tags on Versions 2 and 3 labeling Version 1 as the preferred version. We have this issue for all 50 U.S. states; I've mentioned California here, but the same applies for Alabama through Wyoming.

Given that there are 3 different flavors and all are showing up in the search results (some on the same 1st page of results, which is probably a good thing for now), should we do a 301 redirect or a canonical tag on Versions 2 and 3? It seems like with Google cracking down on duplicate content, it might be wise to be proactive. Any thoughts or suggestions would be greatly appreciated! Thanks. Matt M
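For illustration, option B from the question would be one tag in the `<head>` of each non-preferred version, pointing at Version 1 (URLs as given above):

```html
<!-- On Version 2 (.../Businesses_For_Sale-State-California.aspx) and
     Version 3 (.../listings/business_for_sale_california.ihtml) -->
<link rel="canonical"
      href="http://www.businessbroker.net/State/California-Businesses_For_Sale.aspx">
```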
Intermediate & Advanced SEO | MWM37720