Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and alongside it several secondary websites that share the same content as the primary one. This leads to duplicate content errors. Because there are so many duplicate URLs, I want to use a robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK?
Thanks for any help!
-
You're right, robots.txt is a bad approach for this. I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is probably not the best way to do this. Using a canonical or a noindex meta tag would likely be better. The reasons are summed up well in this article, which explains, probably better than I could, why robots.txt is not the right way to deal with duplicate content. Hope this helps.
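For reference, the noindex option is a single tag in each duplicated page's head. A minimal sketch (you would place this on every duplicated page of the secondary sites):

```html
<!-- Tells search engines not to include this page in their index,
     while still allowing them to crawl it and follow its links -->
<meta name="robots" content="noindex, follow">
```

Note that for this to work the page must remain crawlable: if robots.txt blocks the page, Google never sees the noindex tag, which is one of the main reasons robots.txt alone doesn't solve duplicate content.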
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is using the robots.txt file OK or not?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical versions on your secondary websites, thereby eliminating the duplication?
For example, on each duplicated page of your secondary website, you would add the following to the head to reference the primary page:
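A minimal sketch of what that tag looks like (the domains and path here are placeholders, not the asker's actual URLs):

```html
<!-- On https://secondary-site.example.com/some-page/ -->
<!-- Points search engines at the primary version of the same content -->
<link rel="canonical" href="https://primary-site.example.com/some-page/" />
```

The tag goes inside the `<head>` of the secondary page, with the `href` pointing at the matching page on the primary site. Google treats this as a strong hint (not a directive) that the primary URL is the one to index.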