Can a site blocked for US visitors rank well internationally?
-
For regulatory reasons, a stock trading site needs to be blocked to visitors from the United States.
Since most of Google's data centers seem to be located in the US, can this site still rank well in the other countries where it does business, despite being blocked in the US? Do US Google data centers influence only US rankings?
-
In theory, yes: being blocked in the US shouldn't make a difference to your international rankings. The key is making sure Googlebot (which does crawl from the US) is still able to crawl your website.
Since it's a legal requirement, I would suggest just blocking US visitors, but you need Googlebot (and other bots like Roger, Moz's crawler) to be able to crawl your website.
Instead of completely blocking traffic from the US, would it be possible (and I don't know the legal requirements) to detect whether the visitor is from the US and show them a different page? Something like "Sorry, we can't help you, but here are some great resources", with links to some American sites. Hopefully, Google would still crawl your website and would be able to see your sitemap.
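To make that concrete, here is a rough sketch of how the geo-detection could work, assuming a Flask app, the geoip2 package with a local MaxMind GeoLite2-Country database, and a hypothetical us_visitors.html template (none of these are requirements, just one way to do it). US visitors get the alternative page, while a verified Googlebot is always let through so the legal block doesn't stop crawling:

```python
# Minimal sketch (assumptions: Flask, the geoip2 package, a local MaxMind
# GeoLite2-Country database, and a hypothetical "us_visitors.html" template).
import socket

import geoip2.database
import geoip2.errors
from flask import Flask, render_template, request

app = Flask(__name__)
geo_reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # placeholder path

GOOGLEBOT_DOMAINS = (".googlebot.com", ".google.com")


def is_verified_googlebot(ip: str) -> bool:
    """Reverse-DNS check plus forward confirmation, so a spoofed
    Googlebot user agent coming from a US IP doesn't get a free pass."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith(GOOGLEBOT_DOMAINS):
            return False
        # The hostname must resolve back to the same IP to count as verified.
        return socket.gethostbyname(hostname) == ip
    except (socket.herror, socket.gaierror):
        return False


def is_us_visitor(ip: str) -> bool:
    """True if the client IP geolocates to the United States."""
    try:
        return geo_reader.country(ip).country.iso_code == "US"
    except (geoip2.errors.AddressNotFoundError, ValueError):
        return False


@app.before_request
def block_us_visitors():
    ip = request.remote_addr
    if is_us_visitor(ip) and not is_verified_googlebot(ip):
        # "Sorry, we can't help you, but here are some great resources."
        return render_template("us_visitors.html"), 451
```

Note that behind a CDN or load balancer, request.remote_addr will be the proxy's IP rather than the visitor's, so you would need to read the real client IP from a trusted forwarded header. HTTP 451 ("Unavailable For Legal Reasons") is arguably the most fitting status code for this kind of block.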
Sometimes, if it's a legal requirement, you have to do it, even though it may potentially harm your rankings (I have a similar issue with a finance client, where some of the legal requirements go against Google and UX best practices, but they have to be followed in order to trade).
What I would do, once you have blocked US visitors, is monitor Search Console and your server logs to make sure Googlebot is still crawling your website as frequently and as deeply as before.
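For the server-log side of that monitoring, a small script like the sketch below (just an illustration, assuming a standard Apache/Nginx combined-format access log) can tally Googlebot requests per day, so you can compare crawl frequency before and after the block goes live:

```python
# Minimal sketch: count requests whose user agent claims to be Googlebot,
# grouped by day, from an Apache/Nginx combined-format access log.
import re
import sys
from collections import Counter
from datetime import datetime

# e.g. 66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 ...
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[(\d{2}/\w{3}/\d{4}):')


def googlebot_hits_per_day(log_path: str) -> Counter:
    """Tally Googlebot requests per calendar day."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:  # quick filter on the user agent
                continue
            match = LOG_LINE.match(line)
            if match:
                day = datetime.strptime(match.group(2), "%d/%b/%Y").date()
                hits[day] += 1
    return hits


if __name__ == "__main__":
    for day, count in sorted(googlebot_hits_per_day(sys.argv[1]).items()):
        print(f"{day}\t{count}")
```

For stricter checks you could also run the reverse-DNS verification from the earlier sketch against the logged IPs, and compare the daily counts with the Crawl Stats report in Search Console.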
Related Questions
-
Desktop rankings differ from mobile rankings
Hi there! We are working on a client's site for SEO. We have noticed that our client's desktop search results on Google differ from their mobile/tablet search results. On desktop, the client is in the top 4 results on page one of Google, but on mobile/tablet they are at the bottom of page one. We converted the client's site to be responsive 6 months ago, went through Google Webmaster Tools and made all suggested alterations, and all programming is to a high standard. Competitors all seem to keep their current rankings, whilst our client's seem to drop. Any suggestions as to what the issue is would be very much appreciated 🙂 Thanks in advance. Phil.
How is this possible? #2 ranking with NO on-page keywords, no backlinks, no sitemap...
Hi everybody. I have a question ... I'm totally stumped. This question is being asked today (November 16th, 2015), just after Google updated something in their algorithm. Nobody seems to know what they did, and it has something to do with the new "RankBrain" system they're now using. My niche is logo design software (https://www.thelogocreator.com). I had the keyword "logo creator" on the page roughly 7 times. After Google updated, I lost about 10 spots and, as of this writing, I've dropped to #15. So maybe I over-optimized, fine. But I noticed that for the keyword "logo creator", NONE of the top 14 spots actually have "logo creator" in their page title and NONE of them have more than 2 instances (if any) of the keyword "logo creator" on the actual page. So I removed ALL instances of my keyword "logo creator" from my home page, used the Webmaster's Fetch tool, and moved up a few spots instantly. So what the heck? And the #2 spot for that keyword is www.logomakr.com; they have NO words at all on their pages, no blog, no sitemap, and far fewer links than anybody in the top 10. Can anybody reading this shed some light? Marc Sylvester, Laughingbird Software
Is user engagement a ranking signal?
I read that user engagement might be a signal that Google uses, along with links and on-site optimisation, to decide whether a search result goes up or down. What should I believe, and what are the developments in this field?
HTTPS slower site versus non-HTTPS faster site??
Hey all, I know that everyone is going on about HTTPS as a ranking signal (as far as I've read it is not a very important ranking signal, but a minor one), but:
- Site speed is a ranking signal
- HTTPS is now a ranking signal as well
- HTTPS makes sites slower
So, in view of the above, what's better: an HTTPS site that is slower, or a non-HTTPS site that is faster? Thanks!
How can I submit a reconsideration request while not having any manual action?
Hello, my URL is www.BannerBuzz.com. It has been penalized by a Google algorithm update and all my rankings are dropping, so I have found some bad links and disavowed them in Google Webmaster Tools. Now I want to submit a reconsideration request to Google, but I am not able to do so. When I click on "Request reconsideration of your site", it takes me to check any manual action against my website. After checking, it says "No manual web spam actions found", and after that there is no option to submit my request! Can we submit a reconsideration request while not having any manual action against the site? Thanks
Optimized site-wide internal links in footer - a problem?
Hello all - I am looking at a website with 8 heavily keyword-optimized site-wide links in the footer. Yes, there are only 8, but it looks a bit spammy and I'm tempted to remove them. I imagine there's some possibility of a Google penalty too? What would your advice be? Thanks, Luke
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page via an XML sitemap it will have no link juice and will appear very low in search results, if at all. The priority field in XML sitemaps also seems pretty much irrelevant to me: Google determines the priority of a page based on the number of inbound links to it, and if your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content. This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc, and if Google can't trust the priority, changefreq, or lastmod fields they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
Can anyone explain these changes to our Titles in the SERPS?
Hi there, we've been doing well in the SERPs over the past few weeks. Our previous meta title was displayed as "Hunter Original Tall - Buy Original Tall Online Here". However, recently we've seen the title in the SERPs switch over to "Hunter Original Tall - Cloggs.co.uk". This has occurred on several of our product pages which display a particular style of a certain brand, so for example "Ugg Bailey Button - Cloggs.co.uk". Has anyone else experienced these changes, or can anyone explain why this may have happened? There has been no change to the source code, and our titles have proven to have good click-through rates in the past. Any ideas, Mozzers?