Google Translate for Unique Content
-
We are considering using the Google Translation tool to translate customer reviews into various languages and publish them as indexable content, both for users and for long-tail search engine visibility and rankings.
Does anyone have any experience, insights or caveats to share?
-
Yep, I think that is pretty much on the money.
Also, if you want to publish the translated reviews, you could always put them in subfolders and geotarget those subfolders.
Then, if you add these folders in Webmaster Tools and geotarget them, you should keep everything nice and clean and avoid the wrath of whatever black-and-white animal Google unleashes upon the web next!
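As a sketch of the subfolder setup, each translated copy of a review page could also declare its alternates with hreflang annotations, so Google treats the geotargeted folders as language versions of the same content (the domain, paths, and language codes below are entirely hypothetical):

```html
<!-- Hypothetical example: placed on https://example.com/reviews/widget/ (English original) -->
<link rel="alternate" hreflang="en" href="https://example.com/reviews/widget/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/reviews/widget/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/reviews/widget/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/reviews/widget/" />
```

Each subfolder (/de/, /fr/) can then be added as its own property in Webmaster Tools and geotargeted individually.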
Hope this helps.
Marcus -
Thanks Marcus, hadn't seen that.
Sounds like what MC is saying is that raw machine-translated text is not suitable as unique content, but if the translated content is quality-checked and edited, then it could in principle be used.
-
This is worth a read:
http://www.webpronews.com/google-duplicate-content-2011-12
It is not review-specific, but I think the point stands: don't use raw Google Translate output for website content, and if you want to do this, get the translations rewritten by a translator.
Point being, if your reviews are auto-translated, the language will not be perfect, so it is not great content. In some respects, having barely coherent reviews could be a net negative, and it is certainly not going to add the credibility you were looking for.
So, no real problem with translating them, but don't use Google Translate for it.
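One way to act on that advice is to gate machine-translated drafts behind human review before they become indexable. A minimal sketch, with entirely hypothetical function and field names (this is not a real Google API client; the translator is a placeholder you would swap for your own service wrapper):

```python
# Hypothetical sketch: machine-translate a review, then hold it back
# from indexable publication until a human editor has approved it.

def prepare_translated_review(review_text, target_lang, translate, human_approved):
    """Translate a review and decide whether it is fit to publish.

    translate: callable(text, lang) -> str, a placeholder for any
        machine-translation service wrapper.
    human_approved: bool flag set by an editor after reviewing the draft.
    """
    draft = translate(review_text, target_lang)
    return {
        "text": draft,
        "lang": target_lang,
        # Raw machine output stays noindex; only edited copy is indexable.
        "indexable": human_approved,
        "robots_meta": "index,follow" if human_approved else "noindex,follow",
    }

# Example with a stub translator:
fake_translate = lambda text, lang: f"[{lang}] {text}"

draft = prepare_translated_review("Great product!", "de", fake_translate, False)
print(draft["robots_meta"])   # stays noindex,follow until an editor signs off

approved = prepare_translated_review("Great product!", "de", fake_translate, True)
print(approved["robots_meta"])
```

The point of the flag is that unedited output never carries an `index` directive, so the "barely coherent" drafts Marcus warns about can't hurt you while they wait for an editor.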
Hope this helps
Marcus