Duplicate Content: is it a myth?
-
Hi everyone, just wanted to clarify some things. I've heard about a million different things on this; I just wanted to ask whether duplicate content is a myth or not. Thanks.
Peter
-
It took me about 2-3 weeks each time. I have a large list of WordPress ping services and I manually pinged everything I could, so it's possible it could take longer. Make sure you have a sitemap in WMT (Webmaster Tools) and that should speed you along as well. If you just made the change, I'd suggest resubmitting a sitemap so they recrawl fresh.
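If you want to rule out the last possibility (that the noindex isn't actually being served), you can spot-check a few tag URLs by hand. A rough Python sketch, assuming you've already fetched the page HTML and response headers by whatever means you like:

```python
import re

def has_noindex(html, headers=None):
    """Return True if a page signals noindex via a robots meta tag
    or an X-Robots-Tag response header."""
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Look for <meta name="robots" content="...noindex...">;
    # attribute order can vary, so scan each meta tag as a whole.
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.I):
        if re.search(r'name=["\']robots["\']', tag, flags=re.I) and "noindex" in tag.lower():
            return True
    return False

# Spot-check a tag archive's HTML:
sample = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(has_noindex(sample))  # True
```

If this comes back False for your tag pages, the problem is the markup, not Google being slow.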
-
How long did it take for the tags to drop from Google's index?
I have a site with about 800 tags indexed. Even after I set noindex,follow on them, they're still showing up after a month. Do I have to wait it out, or could it not be picking up the noindex?
-
Thanks for the answer, Matt. I appreciate it.
-
Our tested answer is that it's both. And therein lies the problem and the different answers you'll get.
Matt Cutts has repeatedly said "don't put your entire blog post on your homepage but if you have a teaser and link to the actual location that's a good way to do it." (http://www.webpronews.com/google-duplicate-content-2011-12)
He says Google understands blog structure, but that everywhere other than the post's own page you should show only excerpts.
The issue is that it conflicts with a lot of things we've seen & heard including older stuff he's said.
"typically a whitehat site doesn't need to worry about 1-3 versions of an article on their own site"
The problem is, say on WordPress you tag a post with 15 tags. If you don't noindex a lot of the "ways" of seeing it, you don't have 1-3 versions. You can have 15 tag pages + the permalink + archive pages + category pages + ... and it just adds up. You shouldn't worry about 1-2 versions on your site. You should worry about 15-30 versions.
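To make that arithmetic concrete, here's a rough sketch of how the versions multiply. The URL patterns are just illustrative WordPress defaults (tag, category, home, date, and author archives), not anything from a real install:

```python
# Rough count of the indexable URL "versions" a single WordPress post
# can accumulate under default archive behaviour. Patterns below are
# illustrative; actual URLs depend on your permalink settings.
def duplicate_versions(slug, tags, categories):
    urls = [f"/{slug}/"]                                 # the permalink itself
    urls += [f"/tag/{t}/" for t in tags]                 # one archive per tag
    urls += [f"/category/{c}/" for c in categories]      # one per category
    urls += ["/", "/2012/", "/2012/05/", "/author/admin/"]  # home, date, author archives
    return urls

heavy = duplicate_versions("my-post", [f"tag-{i}" for i in range(15)], ["news"])
light = duplicate_versions("my-post", ["tag-1", "tag-2"], ["news"])
print(len(heavy), len(light))  # 21 8
```

Same post, same content: tag it heavily and it shows up in roughly 21 places instead of 8.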
Out of curiosity, I ran a test on 2 generic wordpress.com blogs. I tagged one with 15 tags, tagged the other with 2 tags. Same subdomain length and topic. I indexed 3 posts with long random strings. In all 3 cases the one with 2 tags beat the one with 15 tags.
Then I no-indexed the 15-tag blog's tag pages and its posts passed the 2-tag blog's posts within 2 weeks. So in my experience, fewer duplicated versions meant more juice, and no-indexing just the tags was enough to improve my SEO.
Small-scale test, but one I'd love to see repeated on a much larger scale sometime. In my opinion, duplicate content hurts unless it's excerpts (news sites, blog front pages).
Hope that helps (and explains why people give you both answers.)
Related Questions
-
When pitching a whitepaper as Push Content for Link Building, is it ok to give the person I'm pitching a link to a landing page with a form on it?
When pitching a whitepaper as Push Content for Link Building (i.e. pushing out content that my client has created), is it ok to give the person I'm pitching a link to a landing page with a form on it? Or should I create a landing page with the whitepaper included on it? I'm not sure if the client will be ok with this because I know they use the whitepaper for sales purposes to gain leads. For example, my pitch email would include a line such as this: "the whitepaper can be found at LINK and I'd love it if you could share it with your readers." I think it may be weird/a little wrong to ask a webmaster to include a link on his site to a landing page with a form to get the whitepaper. Does this make sense? What have others done with whitepapers as Push Content for link building?
Link Building | | ArketiGroup1 -
How can a site with no content, no SEO optimization, and no apparent value rank better?
I understand there are many more variables to ranking #1, but this site is currently #1 for one of our top keywords and I really don't understand why. The competitor site barely has 200 words on any given page, very little content other than images with no ALT tags or titles, and no H1 tags; in other words, the website is not SEO optimized. My site has content, product descriptions, images, relevant titles, and is definitely more optimized than the competitor's. My site vs. competitor: Total pages: 246 (me) vs 49 (them); Page Authority: 34 vs 26; MozRank: 5.3 vs 4.5; MozTrust: 5.7 vs 5.4; Domain Authority: 22 vs 17; External Links: 65 vs 7; Internal Links: 152 vs 1; Followed Links: 14 vs 5; C-Blocks: 14 vs 4; Total Social Shares: 127 vs 4.
Link Building | | Droidman860 -
Giveaway outreach - duplicate content?
I'm thinking of hosting a giveaway and promoting it to bloggers. Thinking of how to go about this, I've run into a sort of road block. I'm thinking the best way would be to attach a flyer detailing the giveaway so that bloggers have easy access to the information. However, I fear that a lot of them will just copy + paste from the flyer straight into a blog post, which will create a lot of duplicate content. Anybody in the community willing to share their experiences and how they were able to get around the duplicate content issue?
Link Building | | ALLee1 -
Does PR Content need to be unique?
Hi, I've noticed that some of our competitors are publishing unique content for each PR website. Is that how it should be? Thanks
Link Building | | Rubix0 -
Duplicate Content Link Juice
Hello SEO Community, So sometimes I will pitch one guest post to multiple outlets and if multiple reply positively, I will tell all but the single most authoritative site that the article is being published elsewhere and it won't be unique for their site. But some don't mind and still post anyway. I definitely don't mind because it just means that more people will read my article and more sites will be linking back to me. What are the implications on the link juice from the most authoritative site? What about the other ones (which are still authoritative, just not as much)? Is there any reason I should explicitly tell them not to post if I know it will appear somewhere else? Thanks much,
Link Building | | CleanEdisonInc0 -
Client Content Strategy (or lack of)
Hey everyone, I just finished pitching a client for some SEO work. He said he does not want to write any type of content. He also said his industry is so unique that he does not want anyone else writing content for him either. His site has about 25 pages of total content now. I am going to get all of his onpage seo good to go but what would you do? Just build links? I really prefer to have some sort of content strategy in place but find it can be challenging with clients.
Link Building | | BigStereo0 -
Link & Content Building
This is a tough one for me. Some of you know that the average SEO contract is fairly limited in budget. Clients in very competitive industries fear spending a lot on SEO; I don't think I need to list the reasons for this. So what do you do with limited budgets for SEO? The onsite/technical optimization is fairly easy: it takes a lot of personal labor, but it is manageable. The off-site optimization, links, content, and just getting your client out there is time- and funds-consuming beyond the scope of regular projects, I know. My question is what do you do with off-site optimization, really. I am sure some advice will include "write great content, get it out there, guest blogging", but those are not readily available and do not fit the scope of every project. What do you think?
Link Building | | ciznerguy0 -
Templated content = duplicate content?
I'm curious where the line is drawn for "duplicate content" by the search engines. Obviously the same article, or even the same article with minor edits, can and should be detected as duplicate. I have a use case where there is a database of similar, but not duplicate, content that changes as time passes. I want to serve this content up via an HTML template but don't want the 1000 pages to be considered duplicates of each other. Example: imagine local weather. You could create a template for city name, longitude, latitude, altitude, and current weather conditions. The values for all fields would be different for each of the 1000 database entries (cities), and one of them, "current weather conditions", would change frequently (hourly, let's say). Now, if I have nice hierarchical index pages (the first one maybe points to 50 state sub-pages, and each state page points to 20 city pages) that point to the 1000 city-specific pages, would the city-specific pages be considered 'duplicate' since they are based on the same HTML template but all have different values in key areas? Does the answer change based on the % of the template (or visible text) that changes for each city? My goal is to get these 1000 subpages as part of my site, have them indexable, and have them each flow a little bit of link juice to my home page. Best practices? What should I be careful of? Thanks!
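To illustrate what I mean, here's a toy version of the setup with made-up city data (the real thing would pull from the database):

```python
from string import Template

# One HTML template; every rendered page differs only in the
# substituted field values. City data below is entirely made up.
page = Template(
    "<h1>Weather for $city</h1>"
    "<p>Lat $lat, Lon $lon, Alt $alt m</p>"
    "<p>Currently: $conditions</p>"
)

cities = [
    {"city": "Springfield", "lat": 39.8, "lon": -89.6, "alt": 180, "conditions": "Clear"},
    {"city": "Shelbyville", "lat": 39.4, "lon": -88.8, "alt": 210, "conditions": "Rain"},
]
pages = [page.substitute(c) for c in cities]
# The boilerplate text is identical across pages; only the values vary.
# The question is how much of the rendered text must be unique before
# the engines stop treating the pages as duplicates of each other.
```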
Link Building | | scanlin0