Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design, and branding. This is a serious inconvenience, and we've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
Is this something that will potentially cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this or if there are any other potential SEO problems to consider.
Thanks
-
Zee, did you implement this? Outcomes?
-
If the bot is in another country and you have blocked the range, it's pretty obvious... What kind of "backup" are you looking for?
If you are asking me whether I have a geographical list of bots for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
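Rather than relying on a geographical list, you can check whether a visitor claiming to be Googlebot really is one using Google's documented reverse-then-forward DNS verification. A minimal Python sketch (the example hostname below is illustrative):

```python
import socket

# Hostnames Google's crawlers reverse-resolve to end in these suffixes.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname: str) -> bool:
    """Check that a reverse-DNS hostname ends in a Google-owned suffix."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP:
    1. reverse-DNS the IP to a hostname,
    2. check the hostname suffix,
    3. forward-resolve the hostname and confirm it maps back to the IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

The forward-confirmation step matters because anyone can set a reverse-DNS record claiming to be `googlebot.com`; only Google controls the forward lookup.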
Good luck with the whole site design/copyright issue. Any chance you could PM me a link? I would like to see what they have done (just curious).
-
Thanks for the reply SEOKeith, but focusing "on making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past from foreign visitors and our business doesn't come from foreign countries. Because of that, blocking actual humans from accessing our site from countries we've had problems with is a potential solution.
The solution we're considering could potentially impact the way search engines view our site and that's the question. Do you have anything to back up your comment about "blocking large ranges of IP addresses you could end up restricting access to legitimate...bots"?
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know the site harvesting your data is in said country? Sure, they might be hosting there, but the boxes ripping your content might be in the US, and they could have web heads in some other random countries serving up the content.
People copying/stealing/cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is, and you're not going to be able to stop it, so you might as well focus on making your site more authoritative.