Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design and branding. This is a major headache, and we've had similar things happen often enough in the past that we're considering blocking large ranges of foreign IP addresses from accessing our site via .htaccess.
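For reference, this is roughly the rule we have in mind - just a sketch, assuming Apache 2.4 with mod_authz_core (and AllowOverride configured so .htaccess is honoured). The CIDR ranges below are placeholder documentation ranges, not the actual ranges we'd block:

# Block specific IP ranges, allow everyone else
<RequireAll>
    Require all granted
    # Placeholder ranges - substitute the real ranges to block
    Require not ip 192.0.2.0/24
    Require not ip 198.51.100.0/24
    Require not ip 203.0.113.0/24
</RequireAll>

(On Apache 2.2 the equivalent would be the older Order allow,deny / Deny from directives.)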
Is this something that could cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this, or on any other potential SEO problems we should consider.
Thanks
-
Zee, did you end up implementing this? What were the outcomes?
-
If a bot is crawling from a country whose ranges you've blocked, then obviously it won't be able to reach the site... What kind of "backup" are you looking for?
If you're asking whether I have a geographical list of bot IP addresses for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
Good luck with the whole site design / copyright issue. Any chance you could PM me a link? I'd like to see what they have done... (just curious).
-
Thanks for the reply, SEOKeith, but focusing on "making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem; it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past with foreign visitors, and none of our business comes from foreign countries. Because of that, blocking human visitors from the countries we've had problems with is a potential solution.
The question is whether the solution we're considering could affect the way search engines view our site. Do you have anything to back up your comment that "by blocking large ranges of IP addresses you could end up restricting access to legitimate... bots"?
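If we do go ahead with it, one idea we're exploring is carving out an exception for verified search engine crawlers before the range blocks apply. Again, just a sketch, assuming Apache 2.4 with mod_authz_core / mod_authz_host; the range below is a placeholder:

# Let verified Google crawlers through, then apply the range blocks to everyone else
<RequireAny>
    # "Require host" makes Apache do a double reverse DNS check on the client IP,
    # so this only matches requests whose IP genuinely resolves back to these domains
    Require host googlebot.com google.com
    <RequireAll>
        Require all granted
        # Placeholder range - substitute the real ranges to block
        Require not ip 203.0.113.0/24
    </RequireAll>
</RequireAny>

The reverse DNS check adds a lookup to request processing, but it's the same verification method Google recommends for confirming that a crawler really is Googlebot, so spoofed user agents coming from blocked ranges wouldn't slip through.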
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know that the site harvesting your data is actually in that country? Sure, they might be hosting there, but the boxes that are ripping your content might be in the US, and they could then have web heads in some other random countries serving up the content.
People copying / stealing / cloning your content is pretty common - it happens to a lot of my sites. It's just the way it is; you're not going to be able to stop it, so you might as well just focus on making your site more authoritative.
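If you do decide to block, one thing I'd do first is a dry run: log requests from the candidate ranges for a couple of weeks and look at what you would have turned away before you actually deny anything. A sketch, assuming Apache with mod_setenvif and access to the virtual host config (CustomLog isn't allowed in .htaccess); the range is a placeholder:

# Tag requests from a candidate range and log them separately instead of blocking them
SetEnvIf Remote_Addr "^203\.0\.113\." would_block
CustomLog logs/would_block.log combined env=would_block

If legitimate crawlers or real customers show up in that log, you'll know before the block has cost you anything.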
Related Questions
-
What Can I Do To Improve The SEO of My Site?
We have a website that is ranking okay, but we can't seem to get past #6 or #7 for a specific national keyword, "self storage software". We are working on a more effective back-linking strategy right now, but we are really having a hard time identifying other steps to take. If anyone can help me out and give me some suggestions, I would be very appreciative. Maybe even seeing a competitive analysis from someone else would help catch something that I am not seeing. The website is www.storageunitsoftware.com. Thanks in advance.
Technical SEO | kenturley
-
IP Change
Hello Moz friends! We recently changed servers and subsequently had a change in IP. It's a better and faster server, but we have seen a significant drop in the SERPs. Could this be a result of moving the site? Thanks!
Technical SEO | wearehappymedia
-
SEO trending down after adding content to website
Hi, looking for some guidance. I added about 14 pages of unique content and did all of the on-page SEO work using Yoast - I have 'good' status on all of them. Some of the website architecture was changed, mainly on one page. That being said, we got a significant bump the day I implemented the changes; however, every day thereafter we have had very bad results - worse than we had before, for about 3 days now. I did resubmit the updated sitemap to GWT and I'm showing no crawl errors. Also, I'm curious if my robots.txt file could be the issue. All it contains is:
User-agent: *
Disallow: /wp-admin/
Any insight or advice is greatly appreciated! Thanks for your time.
Technical SEO | swat1827
-
SEO Ultimate and Yoast together
We are using SEO Ultimate and Yoast together on a WordPress website. Does anyone have negative experience using these two plugins together? The main reason to use both is that we could remove /product/ from the slug. We want clean URLs, not URLs like website.com/product/PRO1 or website.com/product-category/CAT2 (standard WordPress); we want website.com/PRO1 or website.com/CAT2. In Yoast it is possible to remove /category/ from the URL, but not /product/. With SEO Ultimate it is possible, so we use them both: SEO Ultimate for the URL removal, and Yoast for its great functions (sitemap, noindex /page2/, noindex media, etc.). But Yoast gives the following note: ‘’WordPress SEO (Yoast) is known to cause conflicts with SEO Ultimate. Please deactivate WordPress SEO if you wish to continue using SEO Ultimate.’’ I was wondering what this could mean for overall SEO performance, and whether there is a way to remove the URLs using other methods so we could remove the SEO Ultimate plugin.
Technical SEO | Happy-SEO
-
Blocking subdomains without blocking sites...
So let's say I am working for bloggingplatform.com, where people can create free sites through my tools, and those sites show up as myblog.bloggingplatform.com. However, that site can also be accessed from myblog.com. Is there a way, separate from editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting it index myblog.com, without inserting any code into the page load? This is a simplification of a problem I am running across. Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending subdomain holders (which we do), I am looking for a way to stop Google from indexing those domains at all (they are used for technical purposes, not for users to find the sites). Thoughts?
Technical SEO | SL_SEM
-
Using Web Applications for SEO
I am in the sign/banner business. For years I have had a Flash-based web application that I developed which allows customers to design their own signs/banners online. With the demise of Flash, I am prompted to begin developing an HTML5-based application to take its place. Developing this software is a rather expensive endeavor, so many local sign shops, which don't sell on the web, don't bother to develop such an application - but what if I gave it to them? I assume a fair number would find great value in such an application, allowing their clients to communicate a design idea without having to drive to the storefront. The application would actually run embedded on my site, thus earning me a link back to my site. The question is this: is this a bad idea? If dozens of sign shops are running my application embedded on their sites, will that help or hurt me? Thanks.
Technical SEO | RocketBanner
-
Blocked by meta-robots but there is no robots file
OK, I'm a little frustrated here. I've waited a week for the next weekly index to take place after changing the privacy setting on a WordPress website so Google can index it, but I still have the same problem: blocked by meta-robots, noindex, nofollow. But I do not see a robots file anywhere, and the privacy setting in this WordPress site is set to allow search engines to index the site. The website is www.marketalert.ca. What am I missing here? Why can't I get the rest of the website indexed, and is there a faster way to test this rather than waiting another week just to find out it didn't work again?
Technical SEO | Twinbytes
-
Buying new domains to help with SEO
Hi, does buying new keyword-related domains and 301 redirecting them to my site have any SEO benefit?
Technical SEO | Socialdude