International SEO and server hosting
-
I'd appreciate feedback on a situation. We're going through a major overhaul in how we globally manage our websites.
Regional servers were part of our original plan (one in Chicago, one in the UK, and one in APAC), but we've identified a number of issues with this approach. Although many consider it a best practice, the challenges we'd face are considerable (added complexity and added steps and delays when updating sites, among others).
So we shifted our plan: we are now looking at hosting here in the US but using Akamai to deliver images and other heavier assets from their local servers (in the UK, etc.). This is how many of the larger companies, such as Amazon, deliver their global websites.
We hope that using Akamai will allow us good performance while simplifying our process. Any warning signs we should be aware of? Is anyone doing it this way, and has your experience been good or bad?
-
Gerd knows a lot more about CDNs than I do.
Yes, you absolutely need the CDN content to appear on your own subdomain. Standard SEO applies to your image and video content optimization to make sure the content that now sits on the subdomain (rather than your root domain) gets indexed properly.
-
Make sure that your CDN service provides you with domain aliasing: for example, if your domain is www.example.com, you want the CDN host-name to be part of your domain, i.e. cdnuk.example.com for the UK region.
You will then at least get some value from image crawlers etc. Don't go for any CDN service which does not allow your content to resolve to a subdomain of your primary domain.
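As a quick sanity check when evaluating providers, you can verify that the hostnames your assets will be served from really are subdomains of your primary domain. A minimal sketch (the hostnames are made up for illustration, not real CDN endpoints):

```python
# Check whether a CDN hostname is the primary domain or a subdomain of it,
# so image/video URLs keep their association with your site.
def is_subdomain(cdn_host: str, primary_domain: str) -> bool:
    cdn = cdn_host.lower().rstrip(".")
    primary = primary_domain.lower().rstrip(".")
    return cdn == primary or cdn.endswith("." + primary)

# Hypothetical hostnames for illustration:
print(is_subdomain("cdnuk.example.com", "example.com"))        # subdomain: keeps SEO value
print(is_subdomain("example.cdnprovider.net", "example.com"))  # provider domain: avoid
```

Note the `"." + primary` check: a plain `endswith` would wrongly accept lookalike domains such as `badexample.com`.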
SEO does play a role, though, as the speed of the CDN will affect your overall page speed and will also affect how much content a bot can crawl within your allocated crawl quota. The faster your load time/CDN, the more content will be crawled.
I would not bother with localisation tags if your main objective is to optimise performance/page-load time based on your users' geo-location.
It looks like you have set your mind on Akamai, but I would perhaps also evaluate Amazon S3/CloudFront or Rackspace, as those services deliver the same level of SLA but might be more cost-effective for your purposes.
Get your CDN providers to give you a 1-2 month free proof of concept (they will only offer this if your traffic is substantial) so that you can try out the service. Never sign contracts longer than 12 months, and only sign an annual contract if you receive a large discount; most CDN companies will charge you for 10 months when you sign up for a year.
Also ensure that your CDN provider gives you near-real-time, or preferably real-time, access to statistics and performance reports (you want to see how many requests/sec they have served and what the speed was).
Test your site/CDN via tools such as webpagetest.org or pingdom.com; they have POPs (points of presence) across the globe to simulate remote tests.
-
Thanks for confirming!
-
You don't need to do this anymore. Google uses other signals now to determine which region you should appear in. They understand that someone may choose to host a site in the US rather than in some small country for reliability reasons. Just geo-target your sites and you will be fine:
a) language tags
b) proper language for that region
c) add your local address and contact information to your footer globally if possible
d) geo-target in Google Webmaster Tools (WMT)
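For items (a) and (b), the language/region annotations for each regional variant can be generated programmatically rather than maintained by hand. A minimal sketch; the URL pattern and locale codes are assumptions for illustration:

```python
# Generate hreflang <link> tags for a set of regional site variants.
# The base URL and locale-to-path mapping are hypothetical examples.
def hreflang_tags(base_url: str, locales: list) -> list:
    tags = []
    for locale in locales:
        # e.g. "en-gb" -> https://www.example.com/en-gb/
        tags.append(
            f'<link rel="alternate" hreflang="{locale}" '
            f'href="{base_url}/{locale}/" />'
        )
    return tags

for tag in hreflang_tags("https://www.example.com", ["en-us", "en-gb", "fr-fr"]):
    print(tag)
```

Each regional page would carry the full set of tags, including one pointing at itself, so the variants cross-reference one another.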
Sites like Amazon serve their heavier assets locally for performance reasons, not for SEO.
The same rules still apply, though, for interlinking commonly owned sites sitting on the same server.