Using GeoDNS across 3 server locations
-
Hi,
I have multiple servers across the UK and USA. I have a website that serves both areas, and I was looking at cloning my sites and using GeoDNS to route visitors to the closest server to improve speed and user experience.
So UK visitors would connect to the UK dedicated server, North American visitors to the New York server, and so on.
Is this a good approach, or would it affect SEO negatively?
Cheers
Keith
-
Hi Keith,
I meant your personal bandwidth - i.e., your time. I probably should've been clearer in a technical forum!
For the architecture, there are a few common setups. What I'm in the middle of doing at my company runs on Google Cloud: duplicating the website app or script (i.e., WordPress, Ghost, Drupal, another CMS, a Python app, a Rails app, etc.) across several servers and using a load balancer to pick the fastest one. In the app's configuration I point every server at a single database server, also set up on Google Cloud, so when one server executes a write, it is reflected for all users on all servers. If you're cron-jobbing syncs across all the servers but have no common database, you're going to have integrity issues, with some servers having certain comments or edits and others not.
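To make that concrete, here's a minimal sketch of the shared-database idea, assuming a Python app using SQLAlchemy - the host name, credentials, and table are placeholders, not anything from a real setup:

```python
# Each regional web server runs the same app code but points at one
# shared database host, so a write on any server is visible to all.
from sqlalchemy import create_engine, text

# Hypothetical single database host (e.g., a Cloud SQL instance);
# every region's app config uses this same URL.
SHARED_DB_URL = "postgresql://app_user:secret@db.example.com:5432/site"

engine = create_engine(SHARED_DB_URL, pool_pre_ping=True)

def save_comment(author: str, body: str) -> None:
    """A write executed on the UK or NY server lands in the same table."""
    with engine.begin() as conn:
        conn.execute(
            text("INSERT INTO comments (author, body) VALUES (:a, :b)"),
            {"a": author, "b": body},
        )
```

The consistency comes from the single writable database; the web tier can be cloned as many times as you like.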
-
Hi,
I have quite a lot of servers dotted around the UK and USA, so hosting and bandwidth are no big issue. If I host solely in the UK, the ping time from the USA is a whopping 100ms+ and vice versa, so this leads me to hosting in at least both countries, where latency will be 10-20ms and TTFB nice and low.
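For anyone wanting to verify those numbers, here's a rough TTFB check sketched in Python with the requests library - the URL is a placeholder, and response.elapsed stops at the response headers, which is only a fair proxy for time to first byte:

```python
# Rough TTFB check: response.elapsed measures the time from sending the
# request until the response headers are parsed, a reasonable stand-in
# for time-to-first-byte when run from different client locations.
import requests

def ttfb_ms(url: str) -> float:
    # stream=True stops requests from downloading the body up front,
    # so elapsed reflects the first bytes rather than the full transfer.
    with requests.get(url, stream=True, timeout=10) as resp:
        return resp.elapsed.total_seconds() * 1000

if __name__ == "__main__":
    print(f"TTFB: {ttfb_ms('https://www.example.com/'):.0f} ms")
```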
I like the idea of creating and maintaining one major site, as everything will be English-based, and any backlinks will always point to the .com rather than being split across multiple domains. SEO-wise I'm not too bothered; I'll be focusing on speed and giving people the info they're looking for - to me this is more important than the rest.
All servers are cPanel-based, so I will try to find a solution to replicate the sites in real time or at cron-based intervals. This will be the next challenge.
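One option I'm considering is a cron-driven rsync push from a primary server to the mirrors - a rough sketch, with hosts, paths, and SSH key auth all assumed rather than in place yet:

```python
# Rough sketch of a cron-driven sync: push the docroot from the primary
# server to each regional mirror with rsync over SSH. Hosts and paths
# are placeholders; rsync and SSH key auth are assumed to be set up.
import subprocess

PRIMARY_DOCROOT = "/home/site/public_html/"
MIRRORS = ["deploy@uk1.example.com", "deploy@ny1.example.com"]

def sync_mirrors() -> None:
    for host in MIRRORS:
        # -a preserves permissions/timestamps, -z compresses in transit,
        # --delete removes files on the mirror that no longer exist here.
        subprocess.run(
            ["rsync", "-az", "--delete", PRIMARY_DOCROOT,
             f"{host}:/home/site/public_html/"],
            check=True,
        )

if __name__ == "__main__":
    sync_mirrors()  # run from cron, e.g. every 15 minutes
```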
If I can pull this off, it will be great for the other sites I have too.
Regards
Keith
-
Personally, I would use the one domain. And from what you've said, you would prefer it as well.
Thankfully, rankings are on a domain basis and not an IP basis, so there would be no issue in the first scenario. If you are duplicating and synchronizing the servers, you are better off using the one domain because you aren't creating two separate websites with differing content (UK English vs US English).
Do you have the bandwidth or ability to produce separate versions (one per domain) for each area you want to target? If not, you are best off generalizing your website to target all English users instead of en-US, en-GB, etc. You're going to have to weigh your geotargeting goals against your budget.
-
Hi,
Many thanks for your input.
I was planning to use ClouDNS GeoIP to send visitors to the server for their region.
Option one: have one website - www.xyz.com - duplicated across three servers (locations) so everyone sees the same site. This would maintain the backlinks, and whether Google crawls from the USA or the UK it will see one domain, just with three IPs in use. Option two: have www.xyz.com and www.xyz.co.uk as duplicates, declare that in Google Webmaster Tools, and set the languages to en-US and en-GB.
Not sure which is the best solution. www.xyz.com has the most backlinks and DA, whereas www.xyz.co.uk has zero and will be new to the world.
I would rather people generate backlinks to the one domain as well.
Your thoughts are welcome
Regards
Keith
-
GeoDNS works through one of two methods: split DNS or load balancing. The end result is the same: the user is directed to the closest or fastest available server.
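To make the split-DNS idea concrete, here's a toy sketch in Python using the geoip2 library with MaxMind's free GeoLite2 country database - the country-to-IP mapping is invented for illustration, and real GeoDNS providers do this inside the nameserver itself:

```python
# Toy illustration of split-horizon GeoDNS: resolve the client's
# country from its IP, then answer with the A record of the nearest
# server. The IPs below are documentation addresses, not real servers.
import geoip2.database

SERVER_BY_COUNTRY = {
    "GB": "203.0.113.10",   # UK dedicated server
    "US": "198.51.100.20",  # New York server
}
DEFAULT_SERVER = "198.51.100.20"

# Assumes a local copy of MaxMind's free GeoLite2 country database.
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

def resolve(client_ip: str) -> str:
    country = reader.country(client_ip).country.iso_code
    return SERVER_BY_COUNTRY.get(country, DEFAULT_SERVER)
```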
Theoretically, this helps achieve a major goal of technical SEO: great site speed.
With this year's Google Core Web Vitals update, site speed and user experience have been further notched up as ranking factors. To get more technical: LCP (Largest Contentful Paint), the time it takes the largest element on a page to render, and FCP (First Contentful Paint), the time it takes the first legible content to appear on screen, are site-speed signals used in Google's ranking algorithm. By connecting a user to the closest/fastest server available, you can bring down LCP and FCP and thereby improve your rank. The change may not be immediately noticeable, depending on the competitiveness of your keywords and industry. You can measure these and other variables here: https://developers.google.com/speed/pagespeed/insights/
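If you'd rather script those measurements than use the web UI, the PageSpeed Insights v5 API returns the Lighthouse audits as JSON; a small sketch with Python's requests library (the tested URL is a placeholder, and an API key is only needed for heavier use):

```python
# Query the PageSpeed Insights v5 API and pull the LCP and FCP audits
# out of the Lighthouse result for a given URL.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def speed_report(url: str) -> dict:
    resp = requests.get(PSI_ENDPOINT, params={"url": url}, timeout=60)
    resp.raise_for_status()
    audits = resp.json()["lighthouseResult"]["audits"]
    return {
        "LCP": audits["largest-contentful-paint"]["displayValue"],
        "FCP": audits["first-contentful-paint"]["displayValue"],
    }

if __name__ == "__main__":
    print(speed_report("https://www.example.com/"))
```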
In short: No, your SEO won't be negatively impacted, and it will more likely be positively impacted by these optimizations.