Hreflang implementation issue
-
We currently handle search for a global brand, www.example.com, which has a presence in many countries worldwide. To help Google understand that alternate versions of the website are available in other languages, we have used “hreflang” tags. There is also a mother website (www.example.com/global) which is given the “x-default” value in its “hreflang” tag. For Malaysia as a geolocation, the mother website is ranking instead of the local website (www.example.com/my) for the majority of products.
The code used to implement the “hreflang” tags on a product page is as follows:
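For reference, a typical set of product-page hreflang annotations takes the following form; the URLs and locale codes here are illustrative assumptions, not the site's actual markup:

```html
<!-- Placed in the <head> of every language/region version of the product page. -->
<!-- Each version must list all alternates, including itself (reciprocal return tags),
     or Google may ignore the whole hreflang group. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/global/product_name" />
<link rel="alternate" hreflang="en-my" href="http://www.example.com/my/product_name" />
<link rel="alternate" hreflang="en-sg" href="http://www.example.com/sg/product_name" />
```

Missing or non-reciprocal return tags are one of the most common reasons an otherwise correct hreflang setup is ignored for a particular market.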
These “hreflang” tags are also present in the website's XML sitemap; the relevant entries are shown below:
<loc>http://www.example.com/my/product_name</loc>
<lastmod>2017-06-20</lastmod>
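For hreflang annotations to be carried in the sitemap, each <url> entry needs xhtml:link alternates alongside <loc> and <lastmod>. A complete entry of this kind would look like the following (the alternate URLs and locale codes are illustrative assumptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/my/product_name</loc>
    <lastmod>2017-06-20</lastmod>
    <!-- One xhtml:link per alternate, including a self-referencing one. -->
    <xhtml:link rel="alternate" hreflang="x-default"
                href="http://www.example.com/global/product_name" />
    <xhtml:link rel="alternate" hreflang="en-my"
                href="http://www.example.com/my/product_name" />
  </url>
</urlset>
```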
Is this implementation of “hreflang” tags correct? The same implementation is used across all geolocations, yet the mother website outranks the local site only in the Malaysian market.
If the implementation is correct, what else could explain this ranking issue? All other SEO elements have been thoroughly verified and appear to be fine.
Related Questions
-
Google Search Console > Security Issues
Hi all, *Admin, please feel free to remove this or add it to any existing post; I have searched the community for similar questions. While checking Google Search Console, under the "Security Issues" section I found Google flagging specific pages of our website with the message "Content injection - These pages appear to be modified by a hacker with the intent of spamming search results." The Learn More link takes us to https://developers.google.com/webmasters/hacked/docs/hacked_with_spam?ctx=SI&ctx=BHspam&rd=1 We've never injected spam code, nor have we been injected with any spammy code, so what baffles me is why Google would pick this up when we have made it very clear to them that our code is secure and not hacked. Has anyone received a similar message and had any luck removing it? Thanks in advance!
Intermediate & Advanced SEO | | SP10 -
Moz page optimization score issue: I have a score of 95, but can get to 99 if I add my keyword to the URL essentially twice.
Hello, I have a keyword; to avoid giving away too much info, let's say my keyword is laptop-bags. We have a /laptop-bags/ page and, inside that page, **/laptop-bags/leather-shoulder/**. We got a score of 95 for that page. I then got a score of 99 when I changed it to **/laptop-bags/leather-shoulder-laptop-bags/**. The way BigCommerce handles this, the product category title is used in the URL, the page title, and site links, which to me feels spammy. Also, on my /laptop-bags/ page I now have 18 instances of the keyword "laptop bags" where before there were 12, since I added laptop-bags to all 6 categories inside the laptop-bags page. How would you handle this: use /keyword/ and then the /longtail-keyword/ in full, or would /laptop-bags/leather-shoulder/ still rank for "leather shoulder laptop bags"? I've asked this before and was told to use whatever sounded better to the user, but now Moz is telling me different.
Intermediate & Advanced SEO | | Deacyde0 -
Referring domain issues
Our website, (blahblah).org, has 32 other domains pointing to it, all from the same IP address. These domains, including the one in question, were all purchased by the website owner, who has inadvertently created duplicate content on most of them. Some of these referring domains have 301s, some don't, but it appears they have all been de-indexed by Google. I'm somewhat out of my depth here (most of what I've said above came from an agency who said we should address this before being slapped by Google). However, I need to explain the actual issues and their repercussions to my line manager in more detail. Can anyone please offer advice? I'm happy to use the agency, or another, but would like some second opinions if possible.
Intermediate & Advanced SEO | | LJHopkins0 -
Https Homepage Redirect & Issue with Googlebot Access
Hi All, I have a question about Google correctly accessing a site that has a 301 redirect to https on the homepage. Here’s an overview of the situation and I’d really appreciate any insight from the community on what the issue might be: Background Info:
Intermediate & Advanced SEO | | G.Anderson
My homepage is set up as a 301 redirect to an https version of the homepage (some users log in, so we need the SSL). Only 2 pages on the site are under SSL and the rest of the site is http. We switched to SSL in July but have not seen any change in our rankings despite efforts increasing backlinks and output of content. Even though Google has indexed the SSL page of the site, it appears that it is not linking up the SSL page with the rest of the site in its search and tracking. Why do we think this is the case? The Diagnosis: 1) When we do a Google Fetch on our http homepage, it appears that Google is only reading the 301 redirect instructions (as shown below) and is not finding its way over to the SSL page, which has all the correct page title and meta information. <code>HTTP/1.1 301 Moved Permanently
Date: Fri, 08 Nov 2013 17:26:24 GMT
Server: Apache/2.2.16 (Debian)
Location: https://mysite.com/
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 242
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1

<title>301 Moved Permanently</title>
Moved Permanently
The document has moved here: https://mysite.com/
<address>Apache/2.2.16 (Debian) Server at mysite.com</address></code> 2) When we view a list of external backlinks to our homepage, it appears that backlinks built after we switched to the SSL homepage have been separated from backlinks built before the switch. Even in Open Site Explorer, we only see the backlinks achieved before we switched to SSL and cannot track any backlinks added after the switch. This leads us to believe that the new links are not adding any value to our search rankings. 3) In Google Webmaster Tools, we receive no information about our homepage, only about the non-https pages.
I added an https account to Google Webmaster Tools, and in that version we ONLY receive information about our homepage (and the other SSL page on the site). What Is The Problem? My concern is that we need to do something specific with our sitemap or with the 301 redirect itself in order for Google to read the whole site as one entity and report the backlinks as one site. Again, Google is indexing all of our pages, but it seems to be doing so in a disjointed way that is breaking down the link juice and value being built up by our SSL homepage. Can anybody help? Thank you for any advice or input you might be able to offer. -Greg0 -
Issues with Google-Bot crawl vs. Roger-Bot
Greetings from a first-time poster and SEO noob... I hope this question makes sense. I have a small e-commerce site. I have had Roger-bot crawl the site and have fixed all the errors and warnings that Volusion will allow me to fix. Then I checked the Webmaster Tools HTML Improvements section, and Google-bot sees duplicate title tag issues that Roger-bot did not. 1) A few weeks back I changed the title tag for a product, and GWT says that I have duplicate title tags, but there is only one live page for the product. GWT lists the duplicate title tags, but when I click on each they all lead to the same live page. I'm confused: what pages are these other title tags referring to? Does Google have more than one page for that product indexed because I changed the title tag when the page had a different URL? Does this question make sense? 2) Is this issue a problem? 3) What can I do to fix it? Any help would be greatly appreciated. Jeff
Intermediate & Advanced SEO | | IOSC0 -
Use of rel="alternate" hreflang="x"
Google states that use of rel="alternate" hreflang="x" is recommended when: 1) you translate only the template of your page, such as the navigation and footer, and keep the main content in a single language (this is common on pages that feature user-generated content, like a forum post); 2) your pages have broadly similar content within a single language, but with small regional variations (for example, English-language content targeted at readers in the US, GB, and Ireland); or 3) your site content is fully translated (for example, you have both German and English versions of each page). Does this mean that if I write new content in a different language for a website hosted on my sub-domain, I should not use this tag? Regards, Shailendra Sial
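All three cases describe alternates of the same page, and a sub-domain hosting a translated version can be annotated the same way. A minimal sketch of the markup Google describes, with host names and language codes as illustrative assumptions:

```html
<!-- The same pair of tags appears on BOTH versions of the page:
     http://www.example.com/page.html (English) and
     http://de.example.com/page.html (German translation). -->
<link rel="alternate" hreflang="en" href="http://www.example.com/page.html" />
<link rel="alternate" hreflang="de" href="http://de.example.com/page.html" />
```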
Intermediate & Advanced SEO | | IM_Learner0 -
Multilingual sites: Canonical and Alternate tag implementation question
Hello, I would like some clarification about the correct implementation of the rel="alternate" tag and the canonical tag. The example given at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077 recommends implementing the canonical tag on all region-specific sub-domains and having it point to the www version of the website. Here's the example given by Google. My question is the following: would this technique also apply if I have region-specific sites on local TLDs? In other words, if I have www.example.com, www.example.co.uk and www.example.ca, all with the same content in English but with prices and delivery options tailored for US, UK and Canada residents, should I go ahead and implement the canonical tag and alternate tag as follows? I am a bit concerned about canonicalizing an entire local TLD to the .com site.
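For same-language country sites with regional variations, a commonly recommended pattern (sketched here with illustrative domains, not a definitive prescription) is an hreflang group across the three TLDs with a self-referencing canonical on each version, since canonicalizing a ccTLD to the .com site would signal that the local version should be dropped from the index:

```html
<!-- On the US version, www.example.com: -->
<link rel="canonical" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.ca/" />
<!-- The .co.uk and .ca versions carry the same three hreflang links,
     each with a canonical pointing at its OWN URL. -->
```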
Intermediate & Advanced SEO | | Amiee0