How to Best Establish Ownership when Content is Duplicated?
-
A client (Website A) has allowed one of their franchisees to use some of the content from their site on the franchisee's site (Website B). The franchisee lifted the content word for word, so my question is: what is the best way to establish that Website A is the original author?
Since there is a business relationship between the two sites, I'm thinking of requiring Website B to add a rel=canonical tag to each page that uses the duplicated content, referencing the original URL on Site A.
Will that work, or is there a better solution?
This content is primarily informational product content (not blog posts or articles), so I'm thinking rel=author may not be appropriate.
-
Thanks, grasshopper!
-
No; Site B is using Spry, a now-defunct Adobe framework: http://blogs.adobe.com/dreamweaver/2012/08/update-on-adobe-spry-framework-availability.html
Site A uses ExpressionEngine.
-
Hi Allie,
You're absolutely on the right track. Cross-domain canonicalization was created to deal with situations exactly like the one you're describing. I've used it many times, and it works great.
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
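For reference, here is a minimal sketch of what each duplicated page on Website B would carry in its <head>. The URLs below are made-up placeholders, not your client's actual domains:

```html
<!-- On a Website B (franchisee) page that reuses Website A's product copy -->
<!-- Placeholder URL: point each page at its exact counterpart on Website A -->
<link rel="canonical" href="http://www.website-a-example.com/products/widget-123/" />
```

Each page on Site B should reference its specific counterpart on Site A (not just Site A's homepage), and the two pages need to stay substantially identical for Google to honor the hint.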
-
I'm pretty sure you can trackback to the original. Are you using WordPress?
Related Questions
-
SEO for video content that is duplicated across a larger network
I have a website with lots of content (high-quality video clips for a particular niche). All the content gets fed out to 100+ other sites on various domains/subdomains which are reskinned for a given city, so the content on these other sites is 100% duplicate. I still want to generate SEO traffic, though. So my thought is that we: a) need to have canonical tags from all the other domains/subdomains that point back to the original post on the main site, and b) probably need to disallow search engine crawlers on all the other domains/subdomains. Is this on the right track? Am I missing anything important related to duplicate content? The idea is that after we get search engines crawling the content correctly, from there we'd use the IP address to redirect the visitor to the best-suited domain/subdomain. Any thoughts on that approach? Thanks for your help!
Intermediate & Advanced SEO | PlusROI
-
Duplicate Content for Deep Pages
Hey guys, for deep, deep pages on a website, does duplicate content matter? The pages I'm talking about are image pages associated with products, and they will never rank in Google, which doesn't concern me. What I'm interested to know, though, is whether the duplicate content would have an overall effect on the site as a whole? Thanks in advance, Paul
Intermediate & Advanced SEO | kevinliao
-
Tabs and duplicate content?
We own this site, http://www.discountstickerprinting.co.uk/, and I'm just a little concerned, as I right-clicked "open in new tab" on the tab content section and it went to a new page. For example, if you right-click on the price tab and click "open in new tab", you will end up with the URL http://www.discountstickerprinting.co.uk/#tabThree. Does this mean that our content is being duplicated onto another page? If so, what should I do?
Intermediate & Advanced SEO | BobAnderson
-
Are all duplicate content issues bad? (Blog article Tags)
If so, how bad? We use tags on our blog and this causes duplicate content issues. We don't use WordPress, but with such a highly used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content issues caused by blog article tags and not penalise at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how it affects our rankings. Before I do, can you give me some advice around this? Thanks, Daniel.
Intermediate & Advanced SEO | Daniel_B
-
WMT Index Status - Possible Duplicate Content
Hi everyone. A little background: I have a website that is 3 years old. For a period of 8 months I was in the top 5 for my main targeted keyword. I seem to have survived the man-eating Panda but am not so sure about the blood-thirsty Penguin. Anyway, my homepage, along with other important pages, has been wiped off the face of Google's planet. First I got rid of some links that may not have been helping and disavowed them. When this didn't work I decided to do a complete redesign of my site with better content, a cleaner design, removed ads (I only had 1) and incorporated social integration. This has had no effect at all. I filed a reconsideration request and was told that I have NOT had any manual spam penalties made against me; by the way, I never received any warning messages in WMT. So, what could be the problem? Maybe it's duplicate content? In WMT the Index Status indicates that there are 260 pages indexed. However, I have only 47 pages in my sitemap, and when I do a site: search on Google it only retrieves 44 pages. So what are all these other pages? Before I uploaded the redesign I removed all the current pages from the index and cache using the remove URL tool in WMT. I should mention that I have a blog on Blogger that is linked to a subdomain on my hosting account, i.e. http://blog.mydomain.co.uk. Are the blog posts counted as pages on my site or on Blogger's servers? Ahhhh, this is too complicated lol. Any help will be much appreciated! Many thanks, Mark.
Intermediate & Advanced SEO | Nortski
-
Proper Hosting Setup to Avoid Subfolders & Duplicate Content
I've noticed that when hosting multiple websites on a single account you end up having your main site in the root public_html folder, but when you create a subfolder for a new website it actually creates a duplicate website. E.g., http://kohnmeat.com/ is being hosted on laubeau.com's server, so you end up with a duplicate website: http://laubeau.com/kohn/. Does anyone know the best way to prevent this from happening? (i.e. canonical? 301? robots.txt?) Also, maybe a specific 'how-to' if you're feeling generous 🙂
Intermediate & Advanced SEO | ATMOSMarketing56
-
Duplicate content on subdomains.
Hi Mozzers, I have a site, www.xyz.com, and also geo-targeted subdomains, www.uk.xyz.com, www.india.xyz.com and so on. All the subdomains have the same content as the main domain, www.xyz.com. So, I want to know how I can avoid content duplication. Many thanks!
Intermediate & Advanced SEO | HiteshBharucha
-
Duplicate Content, Campaign Explorer & Rel Canonical
Google advises using rel=canonical URLs to tell them which of several pages with similar information is the preferred one. You are supposed to put a rel=canonical on the non-preferred pages to point back to the desired page. How do you handle this with a product catalog using AJAX, where the additional pages do not exist? An example would be: .com/productcategory.aspx?page=1, /productcategory.aspx?page=2, /productcategory.aspx?page=3, /productcategory.aspx?page=4. The page=1, 2, 3 and 4 URLs do not physically exist; they are simply referencing additional products. I have rel=canonical URLs on the main page, www.examplesite.com/productcategory.aspx, but I am not 100% sure this is correct or how else it could be handled. Any ideas, pro mozzers?
Intermediate & Advanced SEO | eric_since1910.com