Copying Content With Permission
-
Hi, we received an email from a guy who wants to copy and paste our content onto his website. He says he will keep all the links we put there and give us full credit for it. So, besides keeping all the links on the page, what is the best way for him to give us credit? A link to the original article? A special meta tag? What?
Thank you
P.S. Our site is much more authoritative than his, and we get indexed within 10 minutes of publishing a page, so I'm not worried about him outranking us with our own content.
-
Very controversial...duplicate content...
-
Syndication Source and Original Source are both generally used for the Google News algorithm at this point. For the main SERPs you would use a cross-domain rel="canonical". The problem with all of these is that they require the re-publisher to edit the HTML head of each piece of content individually. That is not technologically scalable for many sites, so it could kill the deal. If they are willing to give you the rel=canonical tag pointing to your domain, that is best (especially if the story includes links to your site). Otherwise, getting your site indexed first and making sure their links to your site in the copy are followable should do the trick.
Don't let them publish every single story you write though. You want readers to have a reason to come subscribe to your site if they read something on the other site.
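For clarity, here is what the cross-domain rel=canonical described above would look like in the head of the re-publisher's copy of the page (the URLs are hypothetical placeholders, not real sites):

```html
<!-- On the RE-PUBLISHER'S copy of the syndicated article.
     The href points back to the original publisher's version,
     telling Google which URL is the canonical one to index. -->
<head>
  <link rel="canonical" href="http://www.original-publisher.example.com/original-article.html" />
</head>
```

Note that this tag has to be added per article on the re-publisher's side, which is the scalability problem mentioned above.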
-
Thanks Matt, that's great stuff! I always keep track of what gets indexed. And yes, choosing who to share the content with is certainly important; I would not want a content farm related to our site in any way, especially now.
-
Hi Andres,
As long as you're getting direct followed links back to your original article, then that should be enough. A couple of other things though:
- Even though you're confident you'll be indexed before the other site, I'd still implement some embargo time on when they can publish on their site as a fallback.
- Take a look at the site itself that will be linking to you... is it something you a) want your content associated with, and b) want your link profile associated with?
Some resources you may be interested in:
[1] http://www.seomoz.org/blog/whiteboard-friday-content-technology-licensing
[2] http://googlewebmastercentral.blogspot.com/2006/12/deftly-dealing-with-duplicate-content.html (deals with syndication)
[3] http://www.mattcutts.com/blog/duplicate-content-question/
-
If this happens often, you should consider using http://www.tynt.com/ and modifying your attribution settings to suit your needs.
-
I have not tested the "syndication-source" or "original-source" tags personally, but I have seen a good example of syndication with credit at http://www.privatecloud.com
Almost 95% of the content on that website is a word-for-word duplicate of original articles located on third-party websites. I have been tracking the site for almost six months now and have seen several instances of duplicate pages (with credit to the original article) indexed and ranking in Google SERPs.
Using this example I would agree that your technique should work fine.
-
Hi Sameer, I am not sure about using a canonical tag, since it's not our site and there may be more content on it than just ours. He asked permission just to copy and paste, so yes, it's duplicate content, and we want it indexed for the backlinks. This is my idea:
http://googlenewsblog.blogspot.com/2010/11/credit-where-credit-is-due.html
syndication-source indicates the preferred URL for a syndicated article. If two versions of an article are exactly the same, or only very slightly modified, we're asking publishers to use syndication-source to point us to the one they would like Google News to use. For example, if Publisher X syndicates stories to Publisher Y, both should put the following metatag on those articles:
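The metatag itself appears to have been dropped from the quote above. Based on Google's announcement, it is a meta tag in the page head along these lines (the URL is a placeholder for the original article's address):

```html
<!-- Placed on BOTH the original and the syndicated copy of the article,
     pointing at the URL the publishers want Google News to treat as the source. -->
<meta name="syndication-source" content="http://www.example.com/original-article.html">
```

As noted earlier in the thread, this tag only affects Google News, not the main SERPs.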
let me know what you think.
-
Hey Andrés,
As a general rule, content tends to be flagged as duplicate only when it is more than roughly 35-40% a copy of the original. If the person wants to copy your website word for word, here are a few ways you can avoid a duplicate content penalty:
1. Rel canonical - Add a rel="canonical" tag to the head section of the non-canonical page. This tells Google which page is the authoritative one to index (your pages, in this case).
2. Reduce duplication - Ask the person to modify the content and rewrite it in their own words. DupeCop is a good tool for comparing two pieces of content and measuring the duplication percentage. (Don't use respun content; always rewrite in your own words.)
3. NoIndex meta robots tag - If they are not willing to change the page content, you can ask them to keep those pages out of Google's index by adding a noindex meta tag.
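A quick sketch of what options 1 and 3 would look like in the head of the re-publisher's copy (URLs are hypothetical examples):

```html
<head>
  <!-- Option 1: cross-domain canonical pointing at the original article -->
  <link rel="canonical" href="http://www.your-site.example.com/your-article.html" />

  <!-- Option 3 (alternative): keep this copy out of the index entirely,
       while still letting crawlers follow the links back to your site -->
  <meta name="robots" content="noindex, follow">
</head>
```

You would use one or the other, not both: the canonical consolidates ranking signals to your URL, while noindex simply removes the copy from the index.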
Best
Sameer
-
So the best way to get credit for the article is just the links? Is there any special tag, something like meta name="syndication-source", or is that not needed?
And yes, you are right, it's manual syndication and he will keep all the links.
Thank you, Gianluca
-
Hi...
What you describe is essentially a form of syndication of your content. A manual one, but still syndication.
I believe that when the guy says he will give you full credit for the content, he means an optimized full link to the original article.
If that is the case, I would say yes to him. If not, ask him to do it.