Blocking certain countries via IP address location
-
We are a US-based company that ships only to the US and Canada. Two issues have recently arisen from foreign countries (namely Russia) that led us to block access to our store from anyone attempting to interact with it from outside the US and Canada.
1. The first issue we encountered was fraudulent orders originating from Russia (using stolen card data) and then shipping to a US-based international shipping aggregator.
2. The second issue was a consistent flow of Russia-based "new customer" entries.
My question to the Moz community is this: are there any unintended consequences, from an SEO perspective, to blocking the viewing of our store from certain countries?
-
Both answers above are correct and great.
From a strategic point of view, formally blocking Russian IPs has no SEO effect in your case, because, as a business, you don't even need an SEO strategy for the Russian market.
-
Fully agree with Peter. IP blocking is very easy to bypass these days; there are some sophisticated systems that can still detect it, but they are mostly beyond the reach of us mere mortals!
If you block a particular country from crawling your website, it is pretty certain you will not rank in that country (which I guess isn't a problem anyway), but I suspect this would have only a very limited (if any) impact on your rankings in other countries.
We have had a similar issue, here are a couple of ideas.
1. When someone places an order, use a secondary method of validation.
2. With the new customer entries/registrations, make sure you have a good CAPTCHA; most of this sort of thing tends to come from bots, and a CAPTCHA will often fix that problem.
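The first idea above can be sketched as a simple screening rule. This is a minimal illustration only, not production fraud detection; the field names, thresholds, and allowed-country set are assumptions:

```python
# Sketch of a secondary order-validation rule: hold an order for manual
# review when the visitor's IP country is outside the allowed set, or when
# the billing and shipping countries disagree. Field names are assumptions.

ALLOWED_COUNTRIES = {"US", "CA"}

def review_required(order, ip_country):
    """Return True when an order should be held for manual review."""
    if ip_country not in ALLOWED_COUNTRIES:
        return True
    if order["billing_country"] != order["shipping_country"]:
        return True
    return False

# Example: a card billed in Russia, shipping to a US freight forwarder.
suspicious = {"billing_country": "RU", "shipping_country": "US"}
print(review_required(suspicious, ip_country="RU"))  # True
```

A rule like this catches exactly the pattern described in the original question: a foreign card paired with a US shipping aggregator address.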
-
Blocking IPs on geolocation can be dangerous, but you can use the MaxMind GeoIP database:
https://github.com/maxmind/geoip-api-php
You can also implement a GeoIP check on "add to cart" or "new user" as an additional safeguard: when a user is outside the US/CA, require them to fill in a CAPTCHA or simply ignore their request. From a bot's point of view, a visit from a US IP and a visit from a UK IP (for example) will see the same pages; the UK visitor just can't create a new user or add to cart. The HTML code will be 100% the same.
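That gate might look something like the sketch below. The country lookup is injected as a plain function standing in for a real GeoIP query (e.g. against MaxMind's database), so the sketch is self-contained; the endpoint names and return values are assumptions. Note that page views are never gated, so crawlers everywhere see identical HTML:

```python
# Sketch: gate only write actions ("add to cart", "new user") on GeoIP
# country, never page views. `lookup_country` stands in for a real
# GeoIP lookup such as MaxMind's; names here are illustrative.

ALLOWED = {"US", "CA"}

def handle_write_action(ip, lookup_country, action):
    """Run `action` for allowed countries; otherwise demand a CAPTCHA."""
    country = lookup_country(ip)
    if country in ALLOWED:
        return action()            # proceed normally
    return "captcha_required"      # or silently ignore the request

# Toy lookup standing in for the GeoIP database (documentation IPs):
geo = {"203.0.113.5": "US", "198.51.100.7": "RU"}.get

print(handle_write_action("203.0.113.5", geo, lambda: "cart_updated"))   # cart_updated
print(handle_write_action("198.51.100.7", geo, lambda: "cart_updated"))  # captcha_required
```

Because only the cart and registration endpoints check the country, the served HTML stays 100% identical for every visitor, which is the point made above about bots.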
PS: I forgot... VPNs and proxies are cheap these days. I have a few EC2 instances with everything set up just for my own needs. Bad guys can use them too, so think twice about any possible "protection". Note the quotes.