BEING PROACTIVE ABOUT CONTENT DUPLICATION...
-
So we all know that duplicate content is bad for SEO.
I was just thinking: whenever I post new content to a blog, website page, etc., there should be something I can do to tell Google (in fact, all search engines) that I just created and posted this content to the web, and that I am the original source, so that if anyone else copies it, they get penalised and not me.
Would appreciate your answers.
Regards,
-
Thanks for this link, and for breaking the MYTH that Google accepts "first indexed is original".
-
Daylan offers some good points. If you decide to use Tynt, you may want to consider AddThis, which offers the same feature but also includes many social networking options.
-
As Ryan said, rel="author" would help.
Another thing that might help is an app called Tynt. It adds a little snippet of code to your site so that whenever your content is copied, Tynt attributes it to your domain by attaching a link when that copy is pasted. It's not foolproof, of course, and the URL can be removed, but I like to think of it as one more safeguard for attributing content to your site (plus it's good for link building).
One bit of advice I found useful: if you link to other content on your site within your posts, use absolute paths to your own domain. That way, if your content is scraped, you at least get a direct link back to your site, plus added kudos for being relevant to the topic, even if the post is picked up by someone else. The sketch below shows the difference.
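For illustration (example.com stands in for your own domain; the paths are made up):

    <!-- Relative link: if a scraper republishes this HTML, the link no longer points to you -->
    <a href="/blog/original-post">my earlier post</a>

    <!-- Absolute link: even inside scraped content, the link still resolves to your domain -->
    <a href="https://www.example.com/blog/original-post">my earlier post</a>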
Of course, none of this is foolproof, but these are a couple of extra measures that might help your site benefit a little if your content is stolen.
-
Thanks for the response, Ryan. Again, you were a great help.
-
http://www.google.com/support/webmasters/bin/answer.py?answer=1229920
Google is working on it. They implemented rel="author", which is documented at the link above. They do not presently accept the "first indexed is original" idea, but it could be taken as an indicator.
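For reference, the basic markup from that article is a link on each post from the content to the author's profile page; a minimal sketch (the profile URL and author name are placeholders):

    <!-- on each article page: identify the author's profile -->
    <a href="https://plus.google.com/112233445566" rel="author">Written by Jane Doe</a>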
Related Questions
-
Duplicated content multi language / regional websites
Hi Guys, I know this question has been asked a lot, but I wanted to double-check, since I just read a comment by Gianluca Fiorelli (https://mza.bundledseo.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) about this topic which made me doubt my research.
The case: a Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries. They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- A language meta tag (for Bing)
Optionally, they are willing to take the following steps:
- Cross-linking every page through a language flag or similar navigation in the header
- Investing in gaining local .be backlinks
- Changing the server location for both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant)
The content on the website will be at least 95% duplicated. They would like to rank with their .be in Belgium and with their .nl in the Netherlands. Are these steps enough to make sure the .be gets shown for queries from Belgium and the .nl for queries from the Netherlands? Or would this cause a duplicate content issue, resulting in one version being filtered out? If that's the case, we should use the canonical tag, and we can't rank the .be version of the website.
Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog. The less effort they have to put into this, the better (I know that's cursing when talking about SEO). Gaining local backlinks would bring a lot of costs with it, for example.
I would love to hear from you guys. Best regards, Bob van Biezen
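For reference, a minimal hreflang sketch for a page pair in this setup (example.nl and example.be are placeholders for the real domains):

    <!-- in the <head> of BOTH the .nl and the .be version of the page -->
    <link rel="alternate" hreflang="nl-nl" href="http://www.example.nl/pagina/" />
    <link rel="alternate" hreflang="nl-be" href="http://www.example.be/pagina/" />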
Intermediate & Advanced SEO | Bob_van_Biezen
-
Search console, duplicate content and Moz
Hi, working on a site that has duplicate content in the following manner:
http://domain.com/content
http://www.domain.com/content
Question: would telling Search Console to treat one of them as the primary site also stop Moz from seeing this as duplicate content? Thanks in advance. Best, Paul.
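Worth noting: the usual fix for www/non-www duplication, independent of Search Console, is a sitewide 301 redirect. A minimal sketch, assuming Apache with mod_rewrite (domain.com is the question's placeholder):

    # .htaccess - permanently redirect non-www requests to the www host
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]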
Intermediate & Advanced SEO | paulneuteboom
-
Best method for blocking a subdomain with duplicated content
Hello Moz Community, hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page is the same, so we can't add a noindex or nofollow. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a user disallow for the subdomain, but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
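Worth noting: robots.txt is strictly per-host, so an exclusion in the main domain's file won't affect the subdomain. One sketch of an alternative, assuming the admin subdomain is served by its own Apache vhost: send a noindex header from that host only, which works even though the page templates are shared:

    # in the admin.naturalworldsafaris.com vhost config only
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "noindex, nofollow"
    </IfModule>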
Intermediate & Advanced SEO | KateWaite
-
How to handle duplicate content with Bible verses
Have a friend who does a site with Bible verses and different people's thoughts or feelings on them. Since I'm an SEO, he came to me with questions, and a duplicate content red flag popped up in my head. My clients all generate their own content, so I'm not familiar with this world. Since Bible verses appear all over the place, is there a way to address this from an SEO standpoint to avoid duplicate content issues? Thanks in advance.
Intermediate & Advanced SEO | jeremyskillings
-
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content across their UK and US websites. Both websites are geographically targeted (via Google Webmaster Tools) to their specific locations and have the appropriate local domain extensions. Is duplicate content a major issue when the sites are in two different countries and geographic regions of the world? Any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | MBASydney
-
How to Remove Joomla Canonical and Duplicate Page Content
I've attempted to follow advice from the Q&A section. Currently, on the site www.cherrycreekspine.com, I've edited the .htaccess file to help with 301s; all pages redirect to www.cherrycreekspine.com. Secondly, I've added the canonical statement in the header of the web pages. I have cut the duplicate page content in half; now I have 40 remaining pages to fix. This is my practice site to try to understand what SEOmoz can do for me. I've looked at some of your videos on YouTube, but I feel like I'm scrambling around the Q&A and the internet to understand this product. I'm reading the beginner's guide; any other resources would be helpful.
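For reference, the canonical statement mentioned here is normally a link element in each page's head; a minimal sketch (the path is a placeholder):

    <!-- in the <head> of every duplicate/variant URL for this page -->
    <link rel="canonical" href="http://www.cherrycreekspine.com/example-page" />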
Intermediate & Advanced SEO | deskstudio
-
Is this duplicate content?
My client has several articles and pages that have two different URLs. For example, /bc-blazes-construction-trail is the same article as /article.cfm?intDocID=22572. I was not sure whether this counts as duplicate content, or whether I should be putting "/article.cfm" into the robots.txt file. If anyone could help me out, that would be awesome! Thanks 🙂
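Worth noting: if the robots.txt route were taken, the rule would look like the sketch below, though a rel="canonical" on the /article.cfm versions pointing at the friendly URLs is usually the safer fix, since it consolidates the duplicates rather than just hiding them from crawlers:

    # robots.txt - keep crawlers out of the parameterized duplicates
    User-agent: *
    Disallow: /article.cfm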
Intermediate & Advanced SEO | ATMOSMarketing56
-
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading, the visitor is immediately directed to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). To avoid being penalized for duplicate content, any page viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address, where applicable. Two questions:
- Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain?
- If yes, would it be better for SEO to do that, or leave it how it is?
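Worth noting: the setup described already has the standard protection (the cross-host canonical). If the robots-meta route were chosen instead, it would mean emitting a noindex tag only in responses served on the agent hosts; a sketch (when and how the tag is emitted per host is up to the application):

    <!-- output only when the request's host name is an agent subdomain -->
    <meta name="robots" content="noindex, follow" />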
Intermediate & Advanced SEO | EasyStreet