How can I penalise my own site in an international search?
-
Perhaps penalise isn't the right word, but we have two ecommerce sites.
One at .com and one at .com.au.
For the com.au site we would like only that site to appear for our brand name search in google.com.au.
For the .com site we would like only that site to appear for our brand name search in google.com.
I've targeted each site in the respective country in Google Webmaster Tools and published the Australian and English address on the respective site.
What I'm concerned about is people on Google.com.au searching our brand and clicking through to the .com site.
Is there anything I can do to lower the ranking of my .com site in Google.com.au?
-
One of the example scenarios Google gives is:
Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
Tough call - you might have to do some research to see if this solution will help in your particular scenario.
-
They aren't identical; they have different designs, text, almost everything.
They are similar, though, in that they are both book stores.
The .com.au uses Australian wording and spelling; the .com uses English spelling and wording.
Do we need to specify hreflang="en-au" if they are different sites?
-
Are the sites identical but just hosted on different domains to target different regions?
Is there any variation in the English used on each site, for example, do you have Australian English spelling on the .com.au and US (or other) English on the .com?
If yes, you might want to have a look into the rel="alternate" hreflang="x" annotations.
Check out: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
Especially the "Example configuration: rel="alternate" hreflang="x" in action" section.
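For reference, a minimal sketch of what those annotations look like (the domains are placeholders for your own, and "en-us" assumes the .com targets US English - adjust to whatever region it actually serves; both homepages would carry the same pair of tags):

```html
<!-- In the <head> of both the .com and .com.au homepages -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
```

Each page lists every regional alternate, including itself, so Google can pick the right version per locale.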
-
Thanks Mat, that definitely sounds wise.
Penalise was definitely the wrong word. What I really meant was: what other signals can we send Google to say that this is the .com.au site and we want it to appear above the .com?
-
I'd be ever so careful about doing anything to deliberately try to lower your ranking. It just sounds like an approach that could go horribly wrong.
Your best bet might be to live with the fact that both will appear (or better still - enjoy and encourage it), but use the sites to achieve the end goal of getting users onto the correct site.
The usual way to do this would be to check the IP address of the user against a geoip database. I've used both the paid and free versions of the database available at maxmind.com for this. That will allow you to identify users that are in Australia and direct them towards the .com.au site.
How you direct them is important. You could just automatically redirect those users to the other site. Some people will say that this can look like cloaking and cause issues, but I don't believe a geoip redirect alone will do that. However, it is often better to intercept those users with a message along the lines of "It looks like you are connecting from Australia - would you like to view our dedicated Australian website?" - then list the benefits and offer a choice there.
If you do that, it would be good to set a custom variable in analytics to record when that message has been shown. That would allow you to measure how many people are following the suggestion.
Once you are happy it is working, you will probably end up encouraging both domains to appear, as dominating the SERP for your brand is always useful.
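To make the interception logic concrete, here is a minimal sketch (the URL and function names are hypothetical; in practice the country code would come from resolving the visitor's IP against a geoip database such as MaxMind's, rather than being passed in directly):

```python
AU_SITE = "https://www.example.com.au/"  # placeholder URL for the .com.au store


def regional_suggestion(country_code):
    """Return the regional site to suggest to this visitor, or None if
    no suggestion is needed.

    country_code is an ISO country code (e.g. "AU") obtained from a
    geoip lookup of the visitor's IP address.
    """
    if country_code == "AU":
        return AU_SITE
    return None


# On the .com site: if a suggestion exists, show the opt-in banner
# ("It looks like you are connecting from Australia...") and log a
# custom analytics variable - don't force a redirect.
```

The key design choice is that the function only *suggests* a site; the user keeps the final say, which avoids any appearance of cloaking and lets you measure uptake in analytics.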