Same content but translated. Penalization?
-
Hi There,
I've got a question. There are two websites under the same proprietor that must stay unrelated (different brand, different IP, different country, different language). The question is: does Google penalize one of the sites if I entirely translate the content from site 1 to site 2?
Thank you very much for your input
-
Thanks a lot for your answers... it's all much clearer now.
Have a lovely day
-
If the websites are "separated" in terms of business, or need to be because of a company decision, then you don't need to use hreflang annotations; after all, they are in two different languages, target different countries, and don't share a brand name.
Geotargeting the sites to their corresponding countries in Search Console (which is automatic if your sites are on country-code top-level domains), earning local visibility with links, citations, and social mentions, and having well-done content localization is more than enough for a situation like yours.
-
No.
Translated sites are not duplicate sites, so don't worry and proceed (if that were the case, almost every site going international would be banned :-)).
-
Thanks for your replies
The sites will sell the same product but will have different brand names, and each will belong to a different country with a different language. So it is possible for me to transfer the same site structure and translate the same content (or at least 95% of it), but I don't want the sites (brands) to be related.
Therefore adding hreflang tags to point from site 1 to site 2 might not be the best option, since I don't want them to be associated, right? Yet I don't want to risk a penalization... What do you think?
Cheers
-
Hi there
If I am understanding correctly, you're saying you want to take content from Site A and translate it to be placed on Site B?
I would look into the following:
Hreflang attributes (Google)
Language tags (Bing)
Country Targeting (Google)
Geo-Targeting (Bing)
International SEO & Checklist

Out of curiosity, how do you plan on keeping these sites unrelated when the content will be a translation? If you take the steps above you should be able to avoid a duplicate content issue, but you'll want to consider the user experience and create content that speaks to the users of that region or country.

Hope this helps! Good luck!
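To visualize the first two items in the list above, here is a minimal sketch of what hreflang annotations (read by Google) and a language meta tag (read by Bing) can look like in a page's head. The domains example.com and voorbeeld.nl are placeholders, not the asker's actual sites:

```html
<head>
  <!-- hreflang annotations (Google): each page lists itself and its alternates -->
  <link rel="alternate" hreflang="en-gb" href="https://www.example.com/page/" />
  <link rel="alternate" hreflang="nl-nl" href="https://www.voorbeeld.nl/pagina/" />

  <!-- language/region hint read by Bing -->
  <meta http-equiv="content-language" content="en-gb" />
</head>
```

Note that hreflang annotations must be reciprocal: the Dutch page has to link back to the English one, or search engines may ignore the pair.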
-
If you had one site in Dutch and the other in English, you shouldn't get in trouble. You're just adding value for your users by having the site translated into their own language. What I would do, though, is look into adding hreflang tags to point from one site to the other, to tell the search engines that your content is also available in another language.
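For reference, hreflang annotations don't have to live in the page markup; they can also be declared in an XML sitemap, which keeps the cross-site link out of the visible HTML. A minimal sketch with placeholder URLs, not the asker's real sites:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page/</loc>
    <!-- each alternate is listed, including the page itself -->
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/page/" />
    <xhtml:link rel="alternate" hreflang="nl" href="https://www.voorbeeld.nl/pagina/" />
  </url>
</urlset>
```

Keep in mind, though, that any form of hreflang (markup, HTTP headers, or sitemaps) explicitly tells search engines the two sites are alternates of each other, which may conflict with the asker's goal of keeping the brands unrelated.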
Related Questions
-
Updating Content - Make changes to current URL or create a new one?
I'm working with a content team on a job search guide for 2019. We already have a job search guide for 2018. Should we just edit the content of the 2018 guide to make it current for 2019 (which means the 2018 guide would no longer exist), or should we keep the 2018 guide and create a new web page for the 2019 guide so that both exist? We currently rank very well for the 2018 job search guide.
Content Development | Olivia9541
-
Duplicate Content & Tags
I've recently added tags to my blog posts so that related blog posts are suggested to visitors. My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 issues of duplicate content in my blog. I'm using Shopify, so I can edit the robots.txt file, but is my understanding correct that if there are 2 or more tags then they will be ignored? I've searched the Shopify documents and forum and can't find a straight answer. My understanding of SEO is fairly limited.

Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b

Content Development | Tangled
-
A good content calendar/organizer suggestion?
Does anyone have a good content calendar/organizer/software suggestion to help plan, deliver, and push out content? I haven't ever used anything other than an actual calendar, and that doesn't seem to help all that much. Is there anything better out there? Any suggestions would be fantastic! Much appreciated, Ruben
Content Development | KempRugeLawGroup
-
Duplicate listings content
I've been listing our business on various business and wedding directories, most of which require a short description of the business. I've written a "boilerplate" description which is fairly similar to the one we have on the site. If I use this content on multiple listing sites, will it cause issues with Google seeing it as duplicate content and finding it difficult to decide which pages to rank? Most of these sites are supplier directories, so I don't expect people to search them for our business; they are much more likely to browse categories. But will it affect organic results for our actual website? Should I write a different piece of copy for each listing that requires a company description, to avoid any issues?
Any help is greatly appreciated!
Content Development | EdoubleD
-
Questions about reorganizing a website's content structure
I'm working with a publisher who is considering reorganizing the content on their site, which up to now has been presented more or less as a portal site for a variety of segments in a particular business industry. One scenario on the table is to remove some content sections altogether and republish them under their own unique domains. It's important to note that this reorganization would be part of a new brand strategy. So my first question is whether this is one of those avoid-at-all-costs scenarios. My second question is, if we decide to proceed, what kind of time could it take for these new domains to generate the same level of search traffic they currently pull in on the portal site? Thanks!
Content Development | accessintel
-
Marking our content as original, where the rel=author tag might not be applied
Hello, can anyone tell me if it is possible to protect text-type content without the rel=author tag? We host a business listing site where, apart from the general contact information, we have also started to write original, unique, 800+ character-long contents for the suppliers where we expect visits, so rankings should increase. My issue is that this is a very competitive business, and content crawling is really an everyday practice. Of course, I would like to keep my original content, or at least mark it as mine for Google. The easiest way would be the author tag, but the problem is that I do not want our names and photos to be assigned to these contents, because on the one hand we are not acknowledged content providers in our own right (no bio or anything), and on the other hand we provide contents for every sort of business, so just having additional links to our other contents might not help readers get what they want. I also really do not think that a photo of me could help increase the CTR from the SERP :) What we currently do is submit every major fresh content through URL submission in WMT, hoping that being indexed first might help. We have only a handful of them within a day, so no more than 10. Yes, I could perhaps use absolute links, but that is not a feasible scenario in all cases; and as for DMCA, as our programmer says, whatever you can see on the internet you can basically copy. So finally, I do not mind our contents being stolen, as I can't possibly prevent it. I do, however, want our original content to be recognized as ours by Google, even after the stealing is done. (Best would be an 'author tag for business', connected to our Google+ business page, but I am not aware that this function can be used that way.) Thank you in advance to all of you sharing your thoughts with me on the topic.
Content Development | Dilbak
-
Matt Cutts and Curated Content -- something is confusing here...
Okay, I read an interview somewhere this week in which Matt Cutts said he didn't care much for curated content. Today I searched on that subject and came up with the following video of his: http://youtu.be/zZU7O1BHfyo So, in the video he says not to just grab content and repost it. And then at around minute 3:15 he says that, on the other hand, you can have a blog like DaringFireball.net and that's a good thing, because the blogger takes the time to pick and choose what he is posting. I went to Daring Fireball to take a look, and I saw that he writes maybe one line of commentary, then pastes in a big chunk of the curated content along with a link to the source. This shocked me. How could Matt like that blog? He keeps saying he likes original content, not duplicated, curated content. So, the difference is that a blog can get away with this if it exercises discretion in what it chooses to copy and paste? How would the Google algorithm know what the blogger's intention is? And here I've been wasting my time writing up paragraphs and paragraphs to precede any excerpts I paste in, for fear of getting hit by Google. I'd like to hear your comments on this.
Content Development | bizzer
-
Is it considered as duplicate content ?
Hello, I see a lot of errors in my Webmaster Tools because of this AJAX code on the question pages of my site (screen): www.dismoicomment.fr

The code:

```javascript
// ADD ANSWER FORM
$("#answer-add-button").click(function () {
  $.ajax({
    type: 'POST',
    url: '/answers/quelle-assurance-choisir-pour-un-scooter/',
    data: $("form#answer-add").serialize(),
    dataType: 'html',
    success: function (data) {
      if (data == "answer") {
        $('.answer-add-message').show().empty();
        $(document).ready(function () {
          // "You have already answered this question."
          $('Vous avez déjà répondu à cette question.').appendTo('.answer-add-message');
        });
      }
    }
  });
});
```

I have added a line to my robots.txt (http://www.dismoicomment.fr/robots.txt) to block all URLs containing /answers/. These URLs with /answers/ aren't indexed in Google. Do you think this is dangerous and could be considered duplicate content?

Content Development | elitepronostic
-
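For reference, the robots.txt rule the asker describes for blocking the /answers/ URLs would look something like this (a sketch based on the description, not the site's actual file):

```robots.txt
User-agent: *
Disallow: /answers/
```

Note that robots.txt only blocks crawling, not indexing: a blocked URL can still appear in the index if other pages link to it, but since its content is never fetched, it cannot be compared against other pages for duplicate content.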