Page Authority for localized version of website
-
Hello everyone,
I have a case here where I need to decide which steps to take to improve Page Authority (and thus SEO value) for the German pages on our site. We localized the English version into German at the beginning of 2015.
www.memoq.com - English
de.memoq.com - German
By October 2015 we had implemented hreflang tags so that Google would index the pages according to their language. That implementation has been successful. There is one issue, though: at that time, all our localized pages had only "1" point for Page Authority ("PA" in the MozBar). At the beginning we thought that this could be due to the fact that localization was done using a subdomain (de.memoq.com) rather than a subfolder (www.memoq.com/de). However, we decided not to implement changes and to let Google assess the work we had done with the hreflang tags.
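For context, a minimal sketch of that kind of hreflang setup, using the homepage pair as an example (simplified; each page pair carries its own URLs):

<!-- In the <head> of https://www.memoq.com/ (English) -->
<link rel="alternate" hreflang="en" href="https://www.memoq.com/" />
<link rel="alternate" hreflang="de" href="https://de.memoq.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.memoq.com/" />
<!-- The same three tags also go in the <head> of https://de.memoq.com/, so the annotations are reciprocal -->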
It's been a while now, and all our German pages still have only "1" point for Page Authority. Plus, we have keywords for which we rank in the top 10 in English (US Google search), but this is not the case for the translated versions of those keywords in German (Germany Google search).
So my question basically is:
Is this lack of Page Authority and SEO value rooted in the fact that we used a subdomain instead of a subfolder for the URL structure? If so, is it likely that Page Authority and SEO value for the German pages will increase if I change the structure from subdomains to subfolders?
Or is the problem with PA rooted somewhere else that I am missing?
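For what it's worth, my understanding is that such a switch would mean republishing the German content under www.memoq.com/de/ and 301-redirecting every subdomain URL, along these lines (a hypothetical Apache .htaccess sketch, assuming mod_rewrite is available; not our live config):

# Served for requests to de.memoq.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^de\.memoq\.com$ [NC]
# Map every German subdomain URL onto the equivalent /de/ subfolder URL
RewriteRule ^(.*)$ https://www.memoq.com/de/$1 [R=301,L]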
I appreciate your feedback.
-
Correct. I just confirmed that this is how Mozscape handles the hreflang tag. We do not yet transpose the actual value of the canonical version onto the hreflang variant. You should, in theory, assume the Page Authority of your homepage is identical to the Page Authority of the de.* variant; at least, that appears to be the way hreflang is supposed to be handled.
-
Hi,
Thanks for your quick reply; I look forward to your next answer. So the subdomain/subfolder issue is not the actual reason for the "1" point for every German page?
-
Page Authority is based on the links pointing to the page. There are only a handful of links on the web that point to the de.* version of your site, so it wouldn't have any independent Page Authority. Now, my guess is that Mozscape simply does not currently project PA through to all of the hreflang variants. I am double-checking on this now and should have an answer for you soon.
Related Questions
-
Pages not indexable?
Hello, I've been trying to find out why Google Search Console reports these pages as non-indexable: https://www.visitflorida.com/en-us/eat-drink.html https://www.visitflorida.com/en-us/florida-beaches/beach-finder.html Moz and SEMrush both crawl the pages and show no errors, but GSC comes back with "blocked by robots.txt," even though I've confirmed it is not. Anyone have any thoughts? (A sketch of the kind of rule worth ruling out is below.)
Technical SEO | KenSchaefer
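One thing worth ruling out is a broad Disallow pattern that matches those paths without naming them explicitly; a hypothetical robots.txt rule like the following would trigger exactly this report for both URLs (the robots.txt Tester in Search Console will show which rule, if any, is matching):

# Hypothetical robots.txt: a broad rule like this blocks both pages above
User-agent: *
Disallow: /en-us/
-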
Will my site get devalued if I add the same company schema to all the pages of my website?
If I add the exact same schema markup to every page on my website, is it considered duplicate content? Our CMS is telling me that if I want schema markup on our site, it has to be the same on every page of the website. This limitation is frustrating, but I am trying to figure out the best way to work within those boundaries (a sketch of this kind of sitewide block is below). Your help is appreciated.
Technical SEO | Annette_Wetzel
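For what it's worth, identical sitewide organization markup is normally treated as boilerplate rather than duplicate content, since it describes the publisher and not the page. A sketch of that kind of block, with hypothetical company details:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
-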
Why is the Page Authority for posts in my blog so low
I have noticed that the Page Authority for the posts in my blog is hovering around 1, while the rest of the pages on my website are around 20. The Domain Authority for my website is 16, and I think the low Page Authority of my posts is negatively affecting my Domain Authority as I write more content. Any suggestions or recommendations as to why posts have such low Page Authority compared to similar pages? I have images, links, and great content in my posts, but they are considerably lower in Page Authority.
Technical SEO | JoeyGedgaud
-
Redesigned and Migrated Website - Lost Almost All Organic Traffic - Mobile Pages Indexing over Normal Pages
We recently redesigned and migrated our site from www.jmacsupply.com to https://www.jmac.com. It has been over 2 weeks since we implemented 301 redirects, and we have lost over 90% of our organic traffic. Google seems to be indexing the mobile versions of our pages instead of our desktop pages. We hired a designer to redesign the site, and we are confident the code is doing something that is harmful for ranking our website. For example: if you google "KEEDEX-K-DS-FLX38" you should see our mobile page ranking: http://www.jmac.com/mobile/Product.aspx?ProductCode=KEEDEX-K-DS-FLX38 but the page that we want ranked (and we think should be) is https://www.jmac.com/Keedex_K_DS_FLX38_p/keedex-k-ds-flx38.htm That second page isn't even indexed. (When you search for "site:jmac.com Keedex K-DS-FLX38".) We have implemented rel canonical and rel alternate both ways (the standard pattern is sketched below). What are we doing wrong? Thank you in advance for any help; it is much appreciated.
Technical SEO | jmaccom
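For reference, the standard pattern for separate mobile URLs is asymmetric: rel=alternate on the desktop page pointing to the mobile URL, and rel=canonical on the mobile page pointing back. A sketch using the URLs from the question:

<!-- On the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://www.jmac.com/mobile/Product.aspx?ProductCode=KEEDEX-K-DS-FLX38" />
<!-- On the mobile page -->
<link rel="canonical" href="https://www.jmac.com/Keedex_K_DS_FLX38_p/keedex-k-ds-flx38.htm" />

If "both ways" means canonical tags were also placed on the desktop pages pointing at the mobile URLs, that alone could explain Google preferring the mobile versions.
-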
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about the topic. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or one page that I would pass parameters to, with the page built on the fly in code-behind? The drawback to the one page with parameters is that the URLs would not be static, but the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix ASPX with HTML because my project can't be that large. Anyone here ever have this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
Wrong Page Ranking
A higher-level page with more power is getting pushed out by a weaker page in the SERPs for an important keyword. I don't care about losing the weaker page. Should I: (1) 404 the weaker page and wait for Google to (hopefully) replace it with the stronger page, or (2) 301 the weaker page to the stronger page (a one-line example is below)? NOTE: Due to poor communication between the content team and myself, the weak and strong pages have similar title tags (i.e., "lawsuits" and "litigation").
Technical SEO | LCNetwork
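If option 2 is chosen, the redirect itself is a one-liner in Apache config (mod_alias sketch with hypothetical paths based on the title tags mentioned):

# Permanently forward the weaker page to the stronger one
Redirect 301 /lawsuits /litigation
-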
Changed URLs from upper to lower case and lost Page Authority; should we switch back?
During an overhaul of our site architecture, we switched from having capitals in our URLs to all lowercase. We did the 301s (sketched below), but the Page Authority is not nearly what it was. Should we switch back? (new) http://www.usleaseoption.com/rent-to-own/florida vs. (old) http://www.usleaseoption.com/rent-to-own/Florida/
Technical SEO | mjo136
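For reference, a per-URL version of those redirects would look like this (Apache mod_alias sketch, using the example URL from the question):

# Old capitalized URL permanently forwarded to the new lowercase one
Redirect 301 /rent-to-own/Florida/ /rent-to-own/florida
-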
How can I prevent duplicate content between www.page.com/ and www.page.com?
SEOmoz's recent crawl showed me that I had an error for duplicate content and duplicate page titles. This is a problem because it found the same page twice due to a "/" on the end of one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my .htaccess file to prevent this from happening (a sketch follows below)? Thanks! Karl
Technical SEO | onlineexpression
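A common .htaccess approach is to pick one form and 301 the other, for example (a sketch assuming Apache mod_rewrite; the directory check avoids redirect loops on real folders):

RewriteEngine On
# Skip real directories, which Apache serves with a trailing slash
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any other URL ending in "/" to the version without it
RewriteRule ^(.+)/$ /$1 [R=301,L]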