How will Google deal with the crosslinks for my multiple-domain site?
-
Hi,
I can't find any good answer to this question so I thought, why not ask Moz.com ;-)!
-
I have a site, let's call it webshop.xx
-
It's for a few languages/markets: German, Dutch, Belgian, English, and French.
-
I use a different TLD with a different IP for each of these languages, so I'll end up with:
-
webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr
-
They all link to each other, and every subpage that has been translated also gets a link from the other languages, so webshop.com/stuff links to webshop.de/stuff.
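For what it's worth, the cross-linking scheme described above can be sketched in a few lines (a rough sketch only; the domain list comes from this question, but the helper function is hypothetical):

```python
# A minimal sketch of the reciprocal cross-linking described above.
# The domain list is taken from the question; the helper name is illustrative.
DOMAINS = [
    "webshop.de", "webshop.nl", "webshop.be",
    "webshop.co.uk", "webshop.com", "webshop.fr",
]

def cross_links(current_domain, path):
    """URLs a translated page links to on every sibling TLD."""
    return [f"https://{d}{path}" for d in DOMAINS if d != current_domain]

links = cross_links("webshop.com", "/stuff")
# webshop.com/stuff links out to webshop.de/stuff, webshop.nl/stuff, and so on
```

So every translated page carries exactly one outbound link per sibling domain, which is the pattern the link tools below are picking up as external links.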
-
My main website, webshop.com, gets links from each of these other domains, which both Open Site Explorer and Majestic SEO count as external links (this is already happening).
My question: how will Google treat the crosslinks coming from these domains in the long run? Some guesses I made:
1. I get full external-link juice (the content is translated, so it's unique?)
2. I get a bit of the juice of an external link
3. They are actually seen as internal links
4. I'll get a penalty
Thanks in advance, guys!
-
-
Thanks Alex, that is definitely something I'll have to look into. All pages are translated by hand.
-
It depends on how your content is translated: is it auto-translated, or a high-quality translation by a native speaker of the language? Google have said that auto-translations can make for a really bad experience, so in that case your translated content could be ignored, and your sites could even be penalised if it looks like you're generating the content to spam links. If your translations are good quality, you should not have a problem, but you do need to send the correct signals to the search engine crawlers.
- Mark up your content or sitemaps with the hreflang attribute. Note the two-part values when targeting particular countries, e.g. hreflang="fr-fr" if your content is for French speakers in France, and hreflang="fr-be" for French speakers in Belgium. The language and country codes are linked from Google's help page below.
- You should also have a Webmaster Tools account for each TLD, and geotarget the domains to the relevant country.
- Country-specific addresses, phone numbers and currency on each translated website can all help send the right signals about your content too.
Google's advice is here: https://support.google.com/webmasters/answer/182192
In answer to your question, I'm not sure; just make sure you do everything properly to avoid potential problems. I'd say it won't be 1 or 4.
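To make the hreflang advice concrete, here is a sketch of the alternate annotations one page could carry. The language-country-to-TLD mapping below is my assumption for illustration, not something stated in the thread; how you map .be, and what you pick as x-default, depends on your own setup:

```python
# Sketch of hreflang alternate tags for one translated page.
# The code -> TLD mapping is an illustrative assumption; note the
# two-part codes used for country targeting (fr-fr vs fr-be).
HREFLANG = {
    "de-de": "webshop.de",
    "nl-nl": "webshop.nl",
    "nl-be": "webshop.be",       # Dutch speakers in Belgium (assumed)
    "en-gb": "webshop.co.uk",
    "fr-fr": "webshop.fr",
    "x-default": "webshop.com",  # fallback for everyone else (assumed)
}

def hreflang_tags(path):
    """Render the <link rel="alternate"> tags each version should carry."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="https://{domain}{path}" />'
        for code, domain in HREFLANG.items()
    ]

tags = hreflang_tags("/stuff")
```

Each localized version of the page would carry the full set of tags, including one pointing at itself, so the annotations stay reciprocal.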
Related Questions
-
Duplicate content - multiple sites hosted on same server with same IP address
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel="canonical" on the websites, those tags will be duplicated too, as the websites are mirrored versions of the sites served from the IP address, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
White Hat / Black Hat SEO | Jade0 -
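A common fix for the situation described above is to 301-redirect any request that arrives on the bare IP to the canonical hostname, so the IP versions stop being crawlable duplicates. A minimal sketch (the hostname and IP are the examples from the question; the mapping and helper name are assumed):

```python
# One standard remedy for the IP-vs-domain duplication described above:
# 301-redirect any request arriving on a bare IP to the canonical host.
# Hostnames come from the question; the helper is hypothetical.
CANONICAL_HOSTS = {
    "23.34.45.99": "www.domainname.com",
    # one entry per site sharing the server
}

def canonical_redirect(host, path):
    """Return (status, location) when a redirect is needed, else None."""
    target = CANONICAL_HOSTS.get(host)
    if target is None:
        return None  # request already used the canonical hostname
    return (301, f"https://{target}{path}")
```

In practice this logic would usually live in the web server config (a host-based rewrite rule) rather than application code, but the decision it makes is the same.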
Tags on WordPress Sites, Good or bad?
My main concern is the overall tags strategy. I first saw the whole concept on WordPress, where it seems to bring positive results to these sites, and now there are even plugins that auto-generate tags. Can someone detail the pros and cons of tags? I was under the impression that Google does not want thousands of pages auto-generated just because of a simple tag keyword, showing content relevant to that specific tag. Usually these are just like search-results pages... how are tag pages beneficial? Is there something going on behind the scenes with WordPress tags that actually brings benefits to these WP blogs? Setting up a custom-coded tag feature on a custom site just seems to create numerous spammy pages. I understand these pages may be good from a user perspective, but what about from an SEO perspective, getting indexed and driving traffic? Indexed and driving traffic is my main concern here, so as a recap I'd like to understand the pros and cons of tags on WP vs. custom-coded sites, and the correct way to set these up for SEO purposes.
White Hat / Black Hat SEO | WebServiceConsulting.com1 -
Google Disavow and Penalty lifted please help?
We disavowed 80% of our backlink profile due to our last SEO building cheap, nasty links, and filed a reconsideration request (we had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year from the 24th March 2012, but thought it best to clean up before round 2, even though we had no real penalty and we did some decent link building that moved us up). We then received a successful penalty-lifted note (on the 22nd of May 2013), but our rankings dropped (due to the crap links propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much if at all (7 weeks clear now). Has anyone had any experience with the above (are we in a sandbox-type situation)? Thank you for your time. Thanks, Bob
White Hat / Black Hat SEO | BobAnderson0 -
Exact Match Domains - Why are they still dominating?
Fantastic day! I am seeing exact match domains still dominating. SEOmoz has some insight: http://www.seomoz.org/blog/exact-match-domains-are-far-too-powerful-is-their-time-limited But that's from two years ago. Is Google ever going to target the manipulators that buy up all the exact match domains? One of our partners is getting the itch, and I am running out of explanations for why we don't manipulate. But if these practices are dominating their industry, what to do? I have to get paid to feed the family, so just telling the client buh-bye isn't going to work, at least not at this stage of agency building. Their root domain doesn't do much for them; however, we optimize those subdomains well and rank those fine. But if my client can just buy an exact match domain, and it will take less SEO work to get it ranked, then why not? He has the SEO expert in his back pocket to clean up the mess IF they would even get a penalty or drop in rank. Is all SEO really just: find algo hole, manipulate, penalty, fix; find algo hole, manipulate, penalty, fix? Wash. Rinse. Repeat. Please share your experiences and insight! Thanks, Ben
White Hat / Black Hat SEO | cyberlicious0 -
Ranking in Multiple Geographic Locations
Hey Mozers, We are a Joomla web design firm located in Milwaukee, Wisconsin; however, we serve clients all over the Midwest (and US): Chicago, Madison, Minneapolis, etc. I'm curious what the best strategy is for ranking in these new geographic areas. Originally I was thinking of creating geographic-specific landing pages for each area; however, I'm scared it will hurt us with Google's recent Penguin and Panda 3.5 updates. Also, won't I need to link to these landing pages from our main website to get them indexed? What about creating mini websites on subdomains, for example Chicago.SavvyPanda.com? What are your ideas? Do you have clients who have successfully started ranking in multiple geographic cities/areas?
White Hat / Black Hat SEO | SavvyPanda0 -
Are there tools out there to determine when a link was made to your site? I want to know when a link farm was done on a site.
In Webmaster Tools I discovered that a client of mine signed up for or hired another company to get links. The links are poor quality and from other countries, so it looks like a link farm was used. I want to know when the links were made to the site, and I'm not sure how to find that information. Does anyone know how to find this out?
White Hat / Black Hat SEO | StrategicEdgePartners0 -
EXPERT CHALLENGE: What link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change?
FOR ALL SEO THOUGHT LEADERS... What link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change? NOTE: My hope is that the responses left on this thread will ultimately benefit all members of the community and give recognition to the true thought leaders within the SEO space. That being said, my challenge is a two-part question:
1. With the 80/20 rule in mind, and in light of recent algorithm changes, what would YOU focus most of your SEO budget on if you had to choose? Let's assume you're in a competitive market (i.e. #1-5 on page 1 has competitors with 20,000+ backlinks, ranging from AC Rank 7 to 1). How would you split your total monthly SEO budget as a general rule? Ex) 60% link building / 10% onsite SEO / 10% social media / 20% content creation? I realize there are many "it depends" factors, but please humor us anyway.
2. Link building appears to have become harder and harder as Google releases more and more algorithm changes. For link building, the only true white hat ways of proactively generating links (that I know of) are creating high-quality content that adds value to customers (i.e. infographics, videos, etc.), guest blogging, and press releases. The con of these tactics is that you are waiting for others to find and pick up your content, which can take a VERY long time, so ROI is difficult to measure and justify to clients or C-level management. That being said, how are YOU allocating your link building budget? Are all of these proactive link building tactics a waste of time now? I've heard it couldn't hurt to still do some of these, but what are your thoughts, and what is / isn't working for you? Here they are:
A. Using spun articles edited by US-based writers for guest blog content
B. 301 redirects
C. Social bookmarking
D. Signature links from blog commenting
E. Directory submissions
F. Video submissions
G. Article directory submissions
H. Press release directory submissions
I. Forum profile submissions
J. Forum signature links
K. RSS feed submissions
L. Link wheels
M. Building links (using ScrapeBox, SEnukeX, etc.) to pages linked to your money site
N. Links from privately owned networks (I spoke to an SEO company that claims to have over 4,000 unique domains which it uses to boost rankings for its clients)
O. Buying contextual text links
All expert opinions are welcomed and appreciated 🙂
White Hat / Black Hat SEO | seoeric2 -
Is Google stupid?
Why does buying links still work? I don't mean approaching an individual webmaster and cutting a deal, that seems to be nearly impossible to detect. But the huge link brokers, like Text Link Ads, Build my Rank or Linkvine, Google has to be aware of them, right? Can't they just create accounts to see the whole network, and ban the sites? Why wouldn't they just do that?
White Hat / Black Hat SEO | menachemp0