How to Best Establish Ownership when Content is Duplicated?
-
A client (Website A) has allowed one of their franchisees to use some of the content from their site on the franchisee's site (Website B). The franchisee lifted the content word for word, so my question is: how do we best establish that Website A is the original author?
Since there is a business relationship between the two sites, I'm thinking of requiring Website B to add a rel=canonical tag to each page using the duplicated content and referencing the original URL on site A.
Will that work, or is there a better solution?
This content is primarily informational product content (not blog posts or articles), so I'm thinking rel=author may not be appropriate.
-
Thanks grasshopper!
-
No; site B is using Spry, a now-defunct Adobe framework: http://blogs.adobe.com/dreamweaver/2012/08/update-on-adobe-spry-framework-availability.html
Site A uses ExpressionEngine.
-
Hi Allie,
You're absolutely on the right track. Cross-domain canonicalization was created to deal with situations exactly like the one you're describing. I've used it many times, and it works great.
http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html
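For reference, the cross-domain tag on one of Website B's duplicated pages would look something like this (the URLs are hypothetical placeholders, not from the question):

```html
<!-- In the <head> of the duplicated page on Website B -->
<!-- Tells Google the original lives on Website A -->
<link rel="canonical" href="http://www.website-a.com/products/widget-name/" />
```

Each duplicated page on Website B should get its own tag pointing at its specific counterpart on Website A, not at Website A's homepage.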
-
I'm pretty sure you can trackback to the original. Are you using WordPress?
Related Questions
-
What is best practice for markup in a nav bar, to avoid duplicate content?
I'm wondering what's best practice for markup in the nav bar. If we use <nav>, can we have headers in there, and does it help? If we do, would an H1 confuse Google as to whether that is the H1 of importance, or the one in the rest of the page outside the </nav>?
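One common pattern, sketched below with hypothetical markup, is to keep the page's single H1 in the main content and give the nav a lower-level heading (or none at all), so there is no ambiguity about which heading describes the page:

```html
<nav>
  <h2>Site navigation</h2>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
  </ul>
</nav>
<main>
  <h1>The page's primary topic</h1>
</main>
```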
Intermediate & Advanced SEO | TrueluxGroup
-
Is writing good content the best SEO?
Hi, After reading Mr. Shepard's amazing article on the 7 concepts of advanced on-page SEO (https://mza.bundledseo.com/blog/7-advanced-seo-concepts), I decided to share my own experience in hopes of helping others.
I started doing legal SEO back in 2013. At the time I really didn't know much about SEO. My first client (my brother) had recently left the D.A.'s office to become a criminal defense attorney. I told him to write content for the following areas: domestic violence, sex crimes, and homicide. He finished his first content piece on domestic violence and I was not impressed. It seemed too unique, too individualized, and lacked the "generic" feel that many of the currently ranking pages had. Please note that I don't mean "generic" in a negative way; I just mean that his content regarding domestic violence felt too personalized. Granted, his "personalized" approach came from a Deputy D.A. with over 13 years handling domestic violence, sex crimes, and murder cases. I was inclined to re-write his content, but lacking any experience in criminal law I really had no choice but to use it.
IMPORTANT: Please note that I barely knew any SEO at the time (I hadn't even yet discovered Moz), and my brother knew, and continues to know, absolutely nothing about SEO. He simply wrote the content from the perspective of an attorney who had spent the better part of 13 years handling these types of cases.
The result? Google: "Los Angeles domestic violence lawyer/attorney," "Los Angeles sex crimes lawyer/attorney," and "Los Angeles homicide attorney." They have held those spots consistently since being published. I know that MANY other factors contribute to the success of content, but at the time I published them we had few links and very little "technical SEO."
Unfortunately, I then started learning "SEO" and applied standard SEO techniques to future content. The result? Never as good as the articles that were written with no SEO in mind.
My purpose in writing this is to help anyone about to tackle a new project or revamp an existing site. Before getting too caught up in the keywords, H tags, and all the other stuff I seem to worry too much about, simply ask yourself: "Is this great content?" Thanks again to the Moz team for the great advice they have shared over the years. Honestly, I think I sometimes become overly reliant on SEO b/c it seems easier than taking the time to write a great piece of content.
P.S. Any "SEO" stuff you see on the above-mentioned pages was done by me after the pages ranked well.
P.P.S. I don't mean to imply that the above-mentioned pages are perfect, because they are not. My point is that content can rank well even without any emphasis on SEO, as long as the person writing it knows the subject and takes the time to write something that readers find useful.
Intermediate & Advanced SEO | mrodriguez14403
-
Duplicate content question
Hi there, I work for a theater news site. We have an issue where our system creates a chunk of duplicate content in Google's eyes and we're not sure how best to solve it. When an editor produces a video, it simultaneously 1) creates a page with its own static URL (e.g. http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html); and 2) displays said video on a public index page (http://www.theatermania.com/videos/). Since the content is very similar, Google sees the pages as duplicates. What should we do about this? We were thinking that one solution would be to dynamically canonicalize the index page to the static page whenever a new video is posted, but would Google frown on this? Alternatively, should we simply nofollow the index page? Lastly, are there any solutions we may have missed entirely?
Intermediate & Advanced SEO | TheaterMania
-
Big problem with duplicate page content
Intermediate & Advanced SEO | ana_g
Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop built with Prestashop. The Moz crawl report shows over 4,000 pages with duplicate content; two weeks ago it showed 1,400. The majority of links that show duplicate content look like the below:
http://www.sitename.com/category-name/filter1
http://www.sitename.com/category-name/filter1/filter2
At first, I thought the filters didn't work. But when I browse the site and test it, I see that the filters are working and generate links like the below:
http://www.sitename.com/category-name#/filter1
http://www.sitename.com/category-name#/filter1/filter2
The links without the # do not work; they mess up the filters.
Why are the pages indexed without the #, thus generating duplicate content?
How can I fix the issues?
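One possible fix, assuming the filtered views should consolidate to their base category page (the site name and paths below are placeholders from the question, not real URLs), is a canonical tag on each crawlable filter URL:

```html
<!-- In the <head> of http://www.sitename.com/category-name/filter1 -->
<link rel="canonical" href="http://www.sitename.com/category-name/" />
```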
Thank you very much!
-
Site been plagiarised - duplicate content
Hi, I look after two websites: one sells commercial mortgages, the other residential mortgages. We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word. I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory?
I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. Also, the fact that this other broker has copied from several different sites corroborates this view.
Our legal team has written two letters (not sent yet): one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!
In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated. Thanks, Amelia
Intermediate & Advanced SEO | CommT
-
Duplicate content on subdomains
Intermediate & Advanced SEO | NewspaperArchive
Hi All, The structure of the main website is http://abc.com/state/city/publication. We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (each for a specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed, because the robots.txt file excludes bots from crawling them. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version, or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (Option 1), but I'm wondering if setting the canonical tag to the root domain would pass "some link juice" to the root domain and be more beneficial. Thanks!
-
Duplicate blog content and NOINDEX
Suppose the "Home" page of your blog at www.example.com/domain/ displays your 10 most recent posts. Each post has its own permalink page (where you have comments/discussion, etc.). This obviously means that the last 10 posts show up as duplicates on your site. Is it good practice to use NOINDEX, FOLLOW on the blog root page (blog/) so that only one copy gets indexed? Thanks, Akira
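If you do go the NOINDEX, FOLLOW route described above, a minimal sketch of the tag on the blog root page would be:

```html
<!-- In the <head> of the blog root page (blog/) -->
<meta name="robots" content="noindex, follow">
```

The follow directive lets crawlers still pass through to the individual permalink pages, which remain the indexed copies.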
Intermediate & Advanced SEO | ahirai
-
Guest blogging and duplicate content
I have a guest blog post prepared and several sites I can submit it to. Would it be considered duplicate content if I submitted the one guest post to multiple blogs? Note that this content is not on my site, but it links to my site. What will Google do? Let's say 5 blogs accept the same content and post it: I understand that the first blog to publish it will not be punished, but what about the rest of the blogs? Can they get punished for this duplicate content? Can I get punished for having duplicate content linking to me?
Intermediate & Advanced SEO | SEODinosaur