Duplicate Articles
-
We submit articles to a magazine which either get posted as text or in a Flash container. Management would like to post them to our site as well. I'm sure this has been asked a million times, but is this a bad thing to do? Do I need to add a rel=canonical tag to the articles? Most of the articles posted to that other site do not contain a link back to our site.
-
The magazine has already given us the OK; like I said, they're much more offline-focused, so it's more about what Google thinks. I think I agree about playing it safe with the canonical tag, though. Thanks!
-
If it's really just for your own reference or limited use, I'd probably set up the cross-domain canonical and keep it off of Google's radar. Later, if you wanted to self-publish, you could remove that.
If it's just your site and theirs, it's probably not a high-risk situation. In some ways, it's more about the relationship. If your pages started ranking instead of theirs, I don't know if that goes against your general agreement with them. I'd probably play it safe for now.
-
Our site doesn't have the largest audience yet but management simply wants a place they can go or send clients to easily find everything in one place. The magazine is more for offline advertising but they post it online as well.
-
I'd just add to what Jason said, which I think is generally on-target. If the magazine really is the "source", then posting all those articles again on your site could look "thin" to both users and search engines. In general, you're not ranking for them now, so you probably won't lose out, from an SEO standpoint. There is some risk if you copy a lot of articles, though. You don't want to look like you're scraping your own content, in essence.
The cross-domain rel-canonical should remove the risk of any sort of search penalty or problems. So, again, it's a question of whether it provides value to your site.
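For anyone unfamiliar with the mechanics, the cross-domain canonical is just a link element in the `<head>` of the republished copy pointing at the version you want Google to treat as the original. A minimal sketch, with placeholder URLs standing in for the real article addresses, would look like:

```html
<!-- In the <head> of the republished article on your own site.
     The href points at the magazine's copy (placeholder URL) -->
<link rel="canonical" href="https://magazine-example.com/articles/original-article" />
```

Google treats this as a hint rather than a directive, but in practice it consolidates ranking signals to the magazine's copy and keeps your duplicate out of the running.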
At some point, you have to ask - would it make sense to only post them on your site? In other words, if you're building an audience, does it make sense to build it for someone else? Granted, that's a much larger business and marketing decision (far beyond SEO).
-
It's not a "bad" thing to post the articles in two places, as this type of syndication is somewhat commonplace in the corporate world. Provided your site already has a lot of content and is generally good quality, there's no risk of a penalty for syndicating content.
However, I would encourage management to look at it from the user's perspective: If the user reads the article in the magazine, they're not going to find it very useful to see the same article again on your site. Conversely, if your website visitors aren't going to see the article in the magazine first, why send it to the magazine at all?
One solution is to quote a snippet of the original magazine article on your site, and then write a 200+ word summary or intro for the magazine article that perhaps summarizes the key points, introduces the article in a different way, etc., and then links to the magazine.
From a user's perspective, all the content you've published on your site and in the magazine is unique and potentially useful. From the SEO perspective, there's no possibility of an issue and - unlike syndication - you're adding a unique page of content to your site that is highly likely to be indexed and help you in the long run.
Syndication isn't bad, but you have to ask why you're doing it in the first place. It's often just as easy to create a short "What You'll Learn In This Article" intro on your site as it is to cut-and-paste.
Related Questions
-
What would be the best course of action to nullify negative effects of our website's content being duplicated (Negative SEO)
Hello, everyone. About 3 months ago I joined a company that deals in manufacturing of transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes as it really was not, but they seem to have been hit with a negative SEO campaign last year where their content was taken and posted across at least 15 different websites. Literally every page on their website had the same problem - and some content was even company specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked/compromised (some examples include charity websites). I am wondering if there is anything that I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does this duplicate content not hold as much weight as it used to? Especially since our content was posted years before the duplicate content started popping up. Thanks,
White Hat / Black Hat SEO | | Hasanovic0 -
Does Google checks the author name of the articles with backlinks to a website?
Hi, This may sound a little too suspicious, but I just want your suggestions and experience on this. We are trying to create articles on third-party websites to increase backlinks, our brand popularity, and awareness of our features. If the same author is credited on dozens of articles with backlinks to the same website, will Google monitor the author name? Is there anything wrong with creating too many external articles under the same author name? Thanks
White Hat / Black Hat SEO | | vtmoz0 -
Duplicate product content - from a manufacturer website, to retailers
Hi Mozzers, We're working on a website for a manufacturer who allows retailers to reuse their product information. Now, this of course raises the issue of duplicate content. The manufacturer is the content owner and originator, but retailers will copy the information for their own site and not link back (permitted by the manufacturer) - the only reference to the manufacturer will be the brand name citation on the retailer website. How would you deal with the duplicate content issues that this may cause. Especially considering the domain authority for a lot of the retailer websites is better than the manufacturer site? Thanks!!
White Hat / Black Hat SEO | | A_Q0 -
Can I post an article on my blog if it has already been published and taken down?
Hi Guys, A writer for my site has offered to let me post her article on my blog, however the article has already been published on another blog, but the blog has now been taken down. If I publish this on my blog will there be any harm to my blog? I want to stay clean and not be in trouble with penguin in any way shape or form! Cheers everyone appreciate some advice here!
White Hat / Black Hat SEO | | edward-may0 -
Read article and share your views
Hi All, Yesterday I read an article. The author says link building is very important, so please read it and share your views. http://searchengineland.com/7-things-wish-cmos-knew-link-building-192705
White Hat / Black Hat SEO | | dotlineseo0 -
Is article syndication still a safe & effective method of link building?
Hello, We have an SEO agency pushing to implement article syndication as a method of link building. They claim to only target industry-relevant, high authority sources. I am very skeptical of this tactic but they are a fairly reputable agency and claim this is safe and works for their other clients. They sent a broadly written (but not trash) article, as well as a short list of places they would syndicate the article on, such as issuu.com and scribd.com. These are high authority sites and I don't believe I've heard of any algo updates targeting them. Regarding linking, they said they usually put them in article descriptions and company bylines, using branded exact and partial matches; so the anchor text contains exact or partial keywords but also contains our brand name. Lately, I have been under the impression that the only "safe" links that have been manually built, such as these, should be either branded or simply your site's URL. Does anyone still use article syndication as a form of link building with success? Do you see any red flags here? Thanks!
White Hat / Black Hat SEO | | David_Veldt0 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small scale test on this in our Washington Dc and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc, but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: “Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. 
So they aren’t technically individual pages, although they seem like that on the web. If we don’t standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: “Our [Topic Area] training is easy to find in the [City, State] area. Followed by other content specific to the location
White Hat / Black Hat SEO | | CSawatzky
“Find your [Topic Area] training course in [City, State] with ease.” Followed by other content specific to the location Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn’t have to be done to maintain custom formats/text for individual pages. So, mozzers, my question to you all is, can we standardize with slight variations specific to that location and topic area w/o getting getting dinged for spam or duplicate content. Often times I ask myself "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram1 -
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we are being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys. We've been looking for things for two days, and have not been able to find what we did wrong, at least not until tonight. I just logged back in to webmaster tools to do some more digging, and I saw that I had a new message. "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/" It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could even see one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way to try not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty. Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down? We are slowly canonical-izing urls and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. 
We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects. Since the webmaster tools notifications are different (ie: too many urls is a notice level message and doorway pages is a separate alert level message), and the too many urls has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with url structure. According to the help files, doorway pages is a content problem with a specific page. The architecture suggestions are helpful and they reassure us they we should be working on them, but they don't help me solve my immediate problem. I would really be thankful for any help we could get identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong, I just don't know what it is! Thanks for any help identifying the problem! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem and we got nailed almost immediately when we instituted the 301 redirects.
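As an aside for anyone setting up the kind of page-to-page 301 redirects described above: with Apache, a minimal sketch (the paths below are placeholders, not the poster's actual URLs) looks like this in the old domain's configuration or .htaccess:

```apache
# Redirect individual pages on the retired domain to their
# counterparts on the main domain (placeholder paths)
Redirect 301 /grills/example-grill.html https://www.bbqguys.com/grills/example-grill.html

# Or, if the URL structure is identical on both domains,
# a single pattern can cover the whole site:
RedirectMatch 301 ^/(.*)$ https://www.bbqguys.com/$1
```

`Redirect` and `RedirectMatch` are part of Apache's mod_alias; the 301 status tells search engines the move is permanent so that signals consolidate on the destination pages.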
White Hat / Black Hat SEO | | CoreyTisdale0