Hreflang="x-default"
-
Hello all
This is my first question in the Moz Forum, and I hope I will get some concrete answers. I am looking for suggestions on implementing hreflang="x-default" properly on our site. Any previous experience or a link to a specific resource/example would be very helpful. I have found many examples of implementing the homepage hreflang, but nothing on non-homepage URLs within a site.
Here, /en-INT/ is a global English site not targeted at any particular country, unlike en-MY, en-SG, en-AU, etc. Below is the code for the "Homepage" for /uk/. Is this the correct approach?
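Something along these lines (a sketch of the markup, with example.com standing in for our real domain):
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-my" href="https://www.example.com/en-MY/" />
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/en-SG/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/en-AU/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en-INT/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en-INT/" />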
Now, in the case of non-homepage URLs, should the respective en-INT URL be the "x-default", or should the "x-default" not exist at all? For example, would the below be the correct coding?
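For a product page, something like this (again with placeholder URLs; /product-x/ is a made-up path):
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product-x/" />
<link rel="alternate" hreflang="en-my" href="https://www.example.com/en-MY/product-x/" />
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/en-SG/product-x/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/en-AU/product-x/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en-INT/product-x/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en-INT/product-x/" />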
Many thanks
Avi
-
Hi Avi, thanks for your question! Did any of these responses answer it? If so, please mark one or more as a "good answer". If not, please let us know how we can help.
Christy
-
Martijn - "x-default" is correct. As Google's documentation puts it:
For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value "x-default" as well:
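Their documented example annotation is along these lines:
<link rel="alternate" href="https://example.com/" hreflang="x-default" />
That way, searchers whose language or location doesn't match any of your listed annotations get sent to the x-default URL.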
-
Your first set of examples is right for the homepages. For the product pages (the second set), the only line you have to delete is the one with hreflang="x-default".
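In other words, sketched with the same placeholder URLs, a product page would carry only the language/country annotations:
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product-x/" />
<link rel="alternate" hreflang="en-my" href="https://www.example.com/en-MY/product-x/" />
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/en-SG/product-x/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/en-AU/product-x/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en-INT/product-x/" />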
-
Hi Martijn
Thanks for your response. I am a bit confused: you said I am doing it right, but also that the x-default shouldn't exist. Could you clarify a bit?
Thanks.
-
Hi Avi,
You're doing it the right way for the homepage. On the non-homepage URLs, however, the x-default shouldn't exist at all; that line is just dummy text you can delete.
Hope this helps!
Related Questions
-
Suggested Screaming Frog configuration to mirror default Googlebot crawl?
Hi All, Does anyone have a suggested Screaming Frog (SF) configuration to mirror the default Googlebot crawl? I want to test my site and see if it will return 429 "Too Many Requests" to Google. I have set the User Agent as Googlebot (Smartphone). Is the default SF Menu > Configuration > Speed > Max Threads 5 and Max URLs 2.0 comparable to Googlebot? Context: I had tried NetPeak SEO Spider, which did a nice job and had a cool feature that would pause a crawl if it got too many 429s. Long story short, the B2B site threw 429 errors when there should have been no load, on a holiday weekend at 1:00 AM.
Intermediate & Advanced SEO | gravymatt-se
-
International Blog Structure & Hreflang Tags
Hi all, I'm running an international website across 5 regions using a correct hreflang setup. A problem I think I have is that my blog structure is not standardized, and it also uses hreflang tags for each blog article. This has naturally caused Google to index each of the pages across each region, meaning a massive number of pages are being crawled. I know hreflang solves issues with duplication penalties, but I have another question. If I have legacy blog articles that are considered low quality by Google, is that counting against my site once, or multiple times for each time the blog is replicated across each region? I'm not sure if hreflang is something that would tell Google this. For example, if I have these low quality blog posts:
blog/en-us/low-quality-article-1
blog/en-gb/low-quality-article-1
blog/en-ca/low-quality-article-1
Do you think Google is counting this as 3 low quality articles or just 1 if hreflang is correctly implemented? Any insights would be great, because I'm considering culling the international setup of the blog articles and using just /blog across each region.
Intermediate & Advanced SEO | MattBassos
-
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing, one that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics, and if a URL hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | atomiconline
-
Why is "Noindex" better than a "Canonical" for Pagination?
"Noindex" is a suggested pagination technique here: http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284, and everyone seems to agree that you shouldn't canonicalize all pages in a series to the first page, but I'd love if someone can explain why "noindex" is better than a canonical?
Intermediate & Advanced SEO | nicole.healthline
-
What does this kind of rel="canonical" mean?
It looks like our CMS may not be configured correctly, as the rel="canonical" tag is being output with an unpopulated template variable: <link rel="canonical" href="{page_uri}" /> Will having the above tag be harmful to our SEO?
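For comparison, I would expect the rendered tag to come out something like this (example URL):
<link rel="canonical" href="https://www.example.com/sample-page/" />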
Intermediate & Advanced SEO | voicesdotcom
-
Do links to PDFs on my site pass "link juice"?
Hi, I have recently started a project on one of my sites, working with a branch of the U.S. government, where I will be hosting and publishing some of their PDF documents for free for people to use. The great SEO side of this is that they link to my site. The thing is, they are linking directly to the PDF files themselves, not to the page with the links to the PDF files. So my question is, does that give me any SEO benefit? While the PDFs are hosted on my site, there are no links in them that would allow a spider to start from a PDF and crawl the rest of my site. So do I get any benefit from these great links? If not, does anybody have any suggestions on how I could get credit for them? Keep in mind that editing the PDFs is not allowed by the government. Thanks.
Intermediate & Advanced SEO | rayvensoft
-
Our quilting site was hit by Panda/Penguin...should we start a second "traffic" site?
I built a website for my wife, who is a quilter, called LearnHowToMakeQuilts.com. However, it has been hit by Panda or Penguin (I'm not quite sure which) and I am scared to tell her to go ahead and keep building the site up. She really wants to post on her blog on Learnhowtomakequilts.com, but I'm afraid it will be in vain for Google's search engine. Yahoo and Bing still rank it well. I don't want her to produce good content that will never rank well if the whole site is penalized in some way. I've over-optimized by linking strongly with the keyword phrase "how to make a quilt", our main keyword, mainly to the home page, and I think that is one of the main reasons we are incurring some kind of penalty. First main question: From looking at the attached Google Analytics image, does anyone know if it was Panda or Penguin that we were "hit" by? And what can be done about it? (We originally wanted to build a nice content website, but were lured in by a get-rich-quick personality to instead make a "squeeze page" for the home page and force all visitors through that page to get to the really good content. Thus, our average time on site per person is terrible and Pages per Visit is low at 1.2. We really want to try to improve it some day.) She has a local business website, Customcarequilts.com, that did not get hit. Second question: Should we start a second site rather than invest the time in trying to repair the damage from my bad link building and article marketing? We do need to keep the site up and running because it has her online quilting course for beginner quilters to learn how to quilt their first quilt. We host the videos through Amazon S3 and were selling at least one course every other day. But now that the Google drop has hit, we are lucky to sell one quilting course per month. So, if we start a second site, we can use it as a big content site to introduce people to Learnhowtomakequilts.com, which has Martha's quilting course. So, should we go ahead and start a new fresh site rather than repair the damage done by my over-optimizing? (We've already picked out a great website name that would work really well with her personal Facebook page.) Or, here's a second option, which is to use her local business website, Customcarequilts.com. She created it in 2003 and has had it ever since. It is only PR 1. Would this be an option? Anyway, I'm looking for guidance on whether we should pursue repairing the damage, and whether we should start a second fresh site or use an existing site to create new content (for getting new quilters to eventually purchase her course). Brad & Martha Novacek
Intermediate & Advanced SEO | BradNovi
-
Avoiding 301 on purpose; Landing homepage linking to another domain with "Click here to go" and 5 sec meta refresh
Hello, Some users, when they search for our site using the "ourbrand" keyword, ignore the first result (we will call it ourbrand.de here - not the real name) and look for ourbrand.com. Even though we also have that domain registered (indeed, it also has high ranking power), we are doing a 301 from the dot com to the dot de. What we want to do is index the homepage of the dot com, that is http://www.ourbrand.com, as a secondary result, while doing a 301 from any other internal URL of the dot com to the dot de. Yes, we will lose link juice for the main domain, but at least we will not lose visits from the brand traffic (which is our main traffic). So the question is: would Google index ourbrand.com if we show just a landing page with our logo, a "Click here to go to ourbrand.de" link pointing to http://www.ourbrand.de, and a meta refresh of 6 seconds to that URL? Additionally, a cookie would be set for first-time visitors, so on subsequent visits they would be automatically redirected. PS: The 6 seconds is to avoid search engines considering it a "301", like they do with short meta refreshes (not sure what the minimum time is to avoid it being treated as a 301). Any other suggestions on how to deal with this problem are welcome.
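For reference, the meta refresh I have in mind would be something along these lines:
<meta http-equiv="refresh" content="6; url=http://www.ourbrand.de/" />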
Intermediate & Advanced SEO | Zillo