Duplicate Content - That Old Chestnut!!!
-
Hi guys,
Hope all is well. I have a question, if I may. We have written several articles and I want to find the best way to publish them, but I have a number of concerns. I am hoping to use the content to increase the kudos of our site by providing quality material and, hopefully, earning decent backlinks.
1. In terms of duplicate content, should I post each article in only one place, or in several places? Also, where would you say the top 5 or 10 places to post are? The articles cover XML, social media and backlinks.
2. Can I post an article on another blog or article directory and also post it on my own website's blog, or is that a bad idea?
A million thanks for any guidance.
Kind Regards,
C
-
Hi Craig,
Search for this in Google: "keyword" + "guest bloggers wanted" OR "guest blogger wanted"
Then analyse the authority of each site and choose the best 10 to submit your articles to.
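The search-footprint tip above can be mechanised if you have a list of keywords to work through. A minimal sketch, using only the two footprint phrases from the reply (the function name and structure are my own, not from the thread):

```python
def guest_post_queries(keywords):
    """Combine each keyword with guest-post search footprints.

    Returns one Google query string per keyword/footprint pair;
    paste each into a search engine and vet the resulting sites.
    """
    footprints = ['"guest bloggers wanted"', '"guest blogger wanted"']
    return [f'"{kw}" {fp}' for kw in keywords for fp in footprints]

print(guest_post_queries(["XML"]))
# → ['"XML" "guest bloggers wanted"', '"XML" "guest blogger wanted"']
```

Extend the `footprints` list with whatever other phrases work in your niche.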
You could also join the guest posting community at myblogguest.com
Good luck!
Greg
-
Thanks Greg,
If you could, where should I look for the best guest-post sites? I have heard that some can be good and some can be bad. Now, whilst our views and writing are fairly good, we are no Technoratiums... Your thoughts on this would be greatly appreciated.
Thanks,
C
-
Do both.
Google wants to see fresh content on your site, but you also need contextual backlinks from other sites.
Do 10 guest posts, and publish 10 on your own website.
Greg
-
Hi Alan,
Thanks for that; it makes a lot of sense. So, all that said and done: would I be better off putting 15-20 articles per month only on our own website's blog, or should I also post some different articles as guest posts?
Nae Easy.
Thanks,
Craig
-
Google seems to have a number of ways of treating duplicate content of the type you are suggesting.
You are really describing syndicated content. It is closely related to a press release, which is issued through one press-release distributor and then published on multiple other sites.
A little investigation shows that Google handles such duplicate pages differently, depending on which way the wind is blowing today.
If the site that posts the secondary copies is a powerful site, or one that Google likes, then there seems to be no problem with duplicate content. To check this out, find an Associated Press story, then search for it on Google: you will find it on dozens, or even hundreds, of newspaper and radio-station websites. Most of them will have exactly the same headline and exactly the same content.
For any site posting secondary copies that Google doesn't like (for example, one hit by Panda, Penguin or a manual penalty), the content will not be found in search results unless you go to the end of the results and redo the search with duplicates shown.
If the original site is weak, or itself covered by Panda or Penguin, then the secondary copies may display in search results while your original does not. It is also possible that your index pages containing a snippet of the original will display in the results, but not the complete original, if other, more powerful sites are displaying it.
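If you want to test for yourself whether two pages carry essentially the same copy, a rough shingle-overlap check is one standard way to do it. This is only an illustration of near-duplicate detection, not how Google actually does it; the 5-word shingle size and 0.9 threshold are arbitrary choices of mine:

```python
def shingles(text, k=5):
    """Break text into the set of all k-word sequences it contains."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def near_duplicate(a, b, k=5, threshold=0.9):
    """True if the Jaccard similarity of the two pages' shingle sets
    meets the threshold, i.e. the texts are near-identical."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return False
    jaccard = len(sa & sb) / len(sa | sb)
    return jaccard >= threshold

story = "the quick brown fox jumps over the lazy dog near the quiet river bank"
print(near_duplicate(story, story))  # → True
```

Run this over a syndicated story and the copies you find in search results, and you will see most of them score as exact or near-exact duplicates.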
So, if my observations are valid, you should keep the original to yourself and not syndicate it. If you want to get value from other sites, write a different story for them (a guest post, as DiscoverAfrica suggests) and get a link back to your site within the body of the story.
-
Hi Craig,
I would suggest looking for guest-posting opportunities with webmasters in your niche (or a similar niche) rather than publishing on article or blog directories.
1.) Only ever publish an article in one place; don't try to mass-submit the same article to many websites.
2.) You can publish the article on your website and then syndicate it on other websites, but this isn't the best idea either. If you decide to guest post, webmasters usually check whether the content is original; if it isn't, they won't accept it. Even if they do accept it, the link pointing to the original article on your website is merely a "reference" rather than an endorsement from one site to the next.
Hope this makes sense?
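The thread doesn't mention it, but if you do syndicate, the usual way a secondary copy credits the original is a rel="canonical" link in its head. A quick stdlib-only sketch for checking whether a page declares one (the page markup and URL here are hypothetical examples):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the rel="canonical" URL out of a page, if one is declared."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "link" and d.get("rel") == "canonical":
            self.canonical = d.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

page = '<html><head><link rel="canonical" href="https://example.com/original"></head></html>'
print(find_canonical(page))  # → https://example.com/original
```

If a syndicating site won't add the canonical tag (or at least a link back), that copy competes with your original, which is exactly the risk the replies above describe.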