Does posting a link to the original source avoid duplicate content risk?
-
A site I work with allows registered users to post blog posts (longer articles).
Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source sufficient to prevent getting dinged for duplicate content?
Thanks!
-
I don't know what Roger says, but I believe that followed links on noindexed pages still pass PageRank, anchor text, and other link benefits. Your instruction is "noindex," but the page will still be crawled.
-
Hi EGOL.
If you noindex pages and other sites link to them, do you benefit from that or not?
Do you see any PageRank on those that are old enough to show it?
What does Roger say about those?
-
I publish other people's content. That caused a Panda problem about a year ago, which I was able to recover from by noindexing those pages. Now I noindex,follow any content I publish that also appears on another website.
The articles that I write are published on my own site only.
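For anyone unfamiliar with the mechanics, the noindex,follow directive EGOL describes is a single meta tag in the page's head. A minimal sketch (the surrounding markup is illustrative, not from any particular site):

```html
<head>
  <!-- "noindex" keeps this republished page out of search results; -->
  <!-- "follow" lets crawlers still follow its links and pass link value. -->
  <meta name="robots" content="noindex, follow">
</head>
```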
-
I'm concerned about what's best for my site - and would therefore not post other people's content - so I've never had to deal with this.
I guess if I owned both sites, I would prefer to cross-canonical the duped pages to my other site. If I didn't own the other site, I would probably just opt to noindex,follow that page.
-
The last question in the text is:
Can rel="canonical" be used to suggest a canonical URL on a completely different domain?
There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible.
-
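In practice, the cross-domain canonical the help page describes is one link element in the head of the duplicate page, pointing at the author's original. A minimal sketch with a made-up URL:

```html
<head>
  <!-- On the republished copy: tell search engines the author's original -->
  <!-- page (on a different domain) is the preferred URL for indexing. -->
  <link rel="canonical" href="https://example.com/original-post">
</head>
```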
Egol,
The Matt Cutts video seems to say you can't canonicalize between two totally different domains. So, we couldn't use a canonical for that.
-
Canonicalizing them will give the benefit to the author's original page. It does not benefit you.
If you want them to use rel=canonical for you, then it is good to do it for them.
-
If you want to avoid Panda with content on your own site, then you can noindex,follow those pages.
Your visitors will be able to use them but they will not appear in the search engines.
-
Hey Egol, what is the benefit of canonicalizing to them over just using meta noindex,follow on the page?
-
So, you're not saying rel canonical to their page?
What if we just no-follow pages on our site that the author originally published on their site? Right now we link to it as originally published on ....
I'm trying to avoid a Panda penalty for non-unique blog posts reposted on our site.
-
I have used rel=canonical to reduce duplicate content risk. More importantly, though, rel=canonical gives credit to the page where it points.
One problem with guest posting is that to reduce duplicate content risk and transfer credit to your own site, you must have the site owner's cooperation.
Of course, you can get author credit by linking the post to your Google+ profile, if you think that has value.
-
Hi,
Thanks, Egol
So, on a page of ours where someone re-posts their blog post on our site, we'd add a canonical tag on our page to point to their original page? That would be a canonical tag between two different domains. I didn't think that was okay.
And, if we did that, we wouldn't be risking some kind of Panda duplicate content penalty?
Thanks!
-
"Is posting a link to the original source a sufficient preventative solution to possibly getting dinged for duplicate content?"
No. To prevent that you need to use the rel=canonical.
See the Matt Cutts video here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394