Does server (host) location affect local search results?
-
Hey
I was wondering whether the location of your server (host) affects your local search engine results. Suppose I have an e-commerce website in the Netherlands and I want to host my website in the USA or the UK. Will this affect my search engine results in the Netherlands?
-
From Google's Webmaster Tools help on Geotargeting:
"If no information is entered in Webmaster Tools, we'll rely largely on the site's country domain (.ca, .de, etc.). If an international domain (.com, .org, .eu, etc) has been used, we'll rely on several signals, including IP address, location information on the page, links to the page, and any relevant information from Google Places."
More info here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=62399
I recently set the geotargeting of a UK-hosted .com domain to Australia in Google Webmaster Tools and started ranking on google.com.au.
Make sure you've optimised for local as you normally would.
Related Questions
-
Indexed, but not shown in search results
Hi all, we face this problem for www.residentiebosrand.be, which is well programmed, added to Google Search Console, and indexed. Web pages are shown in Google for site:www.residentiebosrand.be. The website has been online for 7 weeks, but still no search results. Could you guys look at the update below? Thanks!
Technical SEO | conversal
-
Localizing URL Paths - Hreflang
Hello, this is a simple question about how URLs should be managed for proper results with hreflang tags. Right now we have a website in English and German, and the hreflang tag is working properly. This is how we currently have it: https://www.memoq.com/ and https://de.memoq.com/. But we will soon change the way we localize our web pages, moving away from the sub-domain structure. There is the possibility of localizing the URL paths, but I was wondering if the hreflang tag would still work in such a case. The new structure would look something like: https://www.memoq.com/why-memoq and https://www.memoq.com/de/warum-memoQ. So my question is: if we localize the keyword in the path of the URL, will the tag still work? Or do the paths need to be in the same language as the English version? Thanks!
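For what it's worth, hreflang only requires that each annotation point to a valid absolute URL for that language version; the language of the path itself doesn't matter to the tag. A minimal sketch for the planned structure, using the URLs given in the question:

```html
<!-- Sketch only, using the example URLs from the question above.
     Placed in the <head> of BOTH language versions; each page must
     list every alternate, including itself. -->
<link rel="alternate" hreflang="en" href="https://www.memoq.com/why-memoq" />
<link rel="alternate" hreflang="de" href="https://www.memoq.com/de/warum-memoQ" />
<link rel="alternate" hreflang="x-default" href="https://www.memoq.com/why-memoq" />
```

The `x-default` line is optional and simply names the fallback version for users whose language doesn't match either annotation.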
Technical SEO | Kilgray
-
How to avoid the "Search instead for" suggestion in Google search results?
Hi, when I search for "Zotey" in Google, the following message is displayed: "Showing results for zotye. Search instead for zotey." Can anyone let me know how to get rid of this conflict asap? Regards, Sivakumar.
Technical SEO | segistics
-
Need Help On Proper Steps to Take To De-Index Our Search Results Pages
So, I have finally decided to remove our search results pages from Google. This is a big dealio, but our traffic has been declining consistently since 2012 and it's the only cause I can think of.

Here's how they got indexed: back in 2012 we put tag links on our product pages, but they linked to our search results pages. So, over time we had hundreds of thousands of search results pages indexed. By tag pages I mean keywords like Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies, each linked to one of our search results pages, i.e. http://oursite.com/Search.html?text=Kitten-Doggies. I really think these indexed pages are causing much of our traffic problems, as there are many more search results pages indexed than actual product pages.

So, my question is: should I remove the links/tags from the product pages (or at least cut them down to the top 8 or so) and add noindex, nofollow to all the search results pages at the same time?

Or should I first noindex, nofollow ALL the search results pages and leave the tags on the product pages, to give Google a chance to follow those tags back to all of the search results pages it has indexed, so it can reach each of them and see the noindex, nofollow? Otherwise, will Google not be able to find these pages at all?

Can someone comment on what might be the best, safest, or fastest route? Thanks so much for any help you might offer me!! Craig
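Whichever order is chosen, the tag itself would be a robots meta tag in the head of every search results page. A sketch, using the URL pattern from the post:

```html
<!-- Sketch for the <head> of each search results page, e.g.
     http://oursite.com/Search.html?text=Kitten-Doggies (pattern from the post).
     The post proposes noindex plus nofollow. Note that Google must still be
     able to recrawl these URLs to see the tag before they drop out of the
     index, so the pages must not be blocked in robots.txt at the same time. -->
<meta name="robots" content="noindex, nofollow" />
```

This is the core of the crawlability question the post raises: the tag only takes effect on pages Google can still reach and recrawl.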
Technical SEO | TheCraig
-
Inurl: search shows results without keyword in URL
Hi there, while doing some research on the indexation status of a client I ran into something unexpected. I have my hypothesis on what might be happening, but would like a second opinion on this. The query 'site:example.org inurl:index.php' returns about 18,000 results. However, when I hover my mouse over these results, no index.php shows up in the URL. So Google seems to think these (then duplicate-content) URLs still exist, but a 301 has changed the actual target URL? A similar thing happens for inurl:page. In fact, all the 'index.php' and 'page' parameters were removed over a year back, so there shouldn't be any of those left in the index by now. The dates next to the search results are 2005, 2008, etc. (i.e. far before 2013). These dates accurately reflect when these forum topics were created. Long story short: are these ~30,000 'phantom URLs' in the index, out of a total of ~100,000 indexed pages, hurting the search rankings in some way? What do you suggest to get them out? Submitting a 100% coverage sitemap (just a few days back) doesn't seem to have had any effect on these phantom results (yet).
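The thread doesn't show the site's actual server configuration, but the usual mechanism behind this pattern is a blanket 301 from the legacy index.php URLs to their clean equivalents, which Google then flushes from the index only as it gradually recrawls each old URL. A hypothetical Apache sketch of such a rule:

```apacheconf
# Hypothetical sketch (the real config is not shown in the thread):
# permanently redirect legacy /index.php URLs to their clean equivalents,
# so each phantom URL returns a 301 when Google eventually recrawls it.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.php [NC]
RewriteRule ^index\.php/?(.*)$ /$1 [R=301,L]
```

With redirects like this in place, the phantom results disappear on Google's recrawl schedule, which for old, rarely-crawled forum pages can take many months; a sitemap listing only the clean URLs doesn't prompt Google to revisit the old ones.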
Technical SEO | Theo-NL
-
Need Third Party Input. Our web host blocked all bots, including Google, and blocked my access, because they believe SEO is slowing down their server.
I would like some third-party input... partly for my sanity and also for my client. I have a client who runs a large online bookstore. The bookstore runs on Magento and the developers are also apparently the web host. (They actually run the servers. I don't know if the servers are sitting under someone's desk or are actually in a data center.) Their server has been slowed down by local and foreign bots. They are under the impression my SEO services are sending spam bots to crawl and slow down their site. To fix the problem they disallowed all bots. Everything: Google, Yahoo, Bing. They also banned my access to the site.

My client's organic traffic instantly took a HUGE hit (almost 50% of their traffic is organic, and over 50% of that comes from Organic + AdWords, nearly all of it from Google). Their keyword rankings are taking a quick dive as well.

Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host. I believe:

1.) You should never disallow ALL robots from your site as a solution for spam. As a matter of fact, most of the bad bots ignore robots.txt anyway. Robots.txt is a way to limit where Google crawls (which is obviously a legitimate technique).

2.) On-site SEO work, link building, etc. are not responsible for foreign bots and scrapers putting a heavy load on the server.

3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening), and since almost half the traffic is organic, the client can expect to lose a large amount of revenue from purchases made by organic visitors once that traffic disappears.

Please give your input and thoughts. I really appreciate it!
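To illustrate point 1, a robots.txt along these lines (bot names and paths are hypothetical, for illustration only) is the usual alternative to a blanket ban: it targets specific misbehaving crawlers while leaving the major search engines free to crawl the store.

```text
# Hypothetical robots.txt sketch -- names and paths are illustrative,
# not taken from the original post.

# Ban one known misbehaving crawler. Note that truly abusive bots
# often ignore robots.txt entirely, so this is best-effort only;
# genuine abuse is better handled at the firewall or server level.
User-agent: BadBot
Disallow: /

# All other crawlers (Google, Bing, etc.): allow everything except
# low-value internal search pages.
User-agent: *
Disallow: /catalogsearch/
```

The contrast with the host's current setup (`User-agent: *` / `Disallow: /`) is the whole argument: that directive tells the well-behaved bots, the ones driving almost half the revenue, to leave, while the bots causing the load ignore it.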
Technical SEO | JoshuaLindley
-
No results with Link Analysis
So I have been working with a domain since November last year that still shows no improvement in its link analysis. I am baffled, because we have gotten them onto the first page of Google for a few of the keywords we are optimizing. Any help with this is greatly appreciated, and I am a noob, so I am definitely open to learning. Thanks in advance to all of you. Domain in question: www.modernportablerefrigeration.com. The domain is currently on a shared server, if that makes any difference. Cordially, Todd Richard [email protected]
Technical SEO | RichFinnSEO
-
Will duplicate content on an ecommerce site cause harm in search results?
First off: I'm an SEO learner, not a professional, so this question isn't on behalf of any client. A new (less than a year old) ecommerce site in a particular sector is now partnering with relevant websites to be their online store. A 'store' link on each partner site will redirect to a dedicated area of the ecommerce domain (on a domain/directory path) carrying that partner's branding. Doing this, though, means duplicating the entire catalogue for each partner that comes on board for the scheme. So the original ecommercesite.com/categories also delivers ecommercesite.com/partner1/categories (with partner 1's branding), ecommercesite.com/partner2/categories (with partner 2's branding), and so on. Won't duplicating the product catalogue across directories cause problems in delivering effective SERPs for the original ecommerce site?
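One common mitigation for exactly this setup, offered here as a sketch using the hypothetical URLs from the question, is a canonical tag on each partner copy pointing back to the original catalogue page:

```html
<!-- Sketch: placed in the <head> of ecommercesite.com/partner1/categories
     (example URL from the question). The canonical declares the partner
     copy a duplicate of the original, asking Google to consolidate
     indexing and ranking signals on the one original URL. -->
<link rel="canonical" href="https://ecommercesite.com/categories" />
```

The trade-off is that the partner-branded copies then generally don't rank on their own; whether that is acceptable depends on whether the partner areas are meant to attract their own organic traffic or only to serve visitors arriving via the partner sites' store links.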
Technical SEO | BeIntermedia