How do you block incoming links to your site?
-
With the new Google update focusing on link spam and repeated anchor text, what do you do if you have incoming links that you would like to block or make nofollow?
-
I guess the only real way is to use Site Explorer to find the links and then contact each site directly... We have been spammed badly by competitors and Google is penalizing us. I have contacted the sites and am waiting for a response.
-
Yes, as Ryan said, this is the nature of the internet. Just as you can write about anything or anybody, you can create a link to anybody. And just to clarify, the block Ryan is talking about means that if someone linked to you, you could "technically" set up a block so that anybody visiting your website from that link would not be able to access your website; however, it would still get counted as a link from that website to yours.
-
It can't be done as far as what Google sees. Anyone can link to you at any time, and there is nothing you can do about it.
The main positive action you can take is to earn as many good links as possible, so that if you get some bad links pointed at your site they will only represent a small percentage of your total links.
I know people hate this answer, but we have to trust that Google has systems in place to deal with black-hat SEO tactics such as link spamming a competitor's site.
If you want to go to extreme measures, you can block traffic coming from a specific link in your web server settings. On Apache servers (the most common type), you can do it in the .htaccess file. Once again, this is not recommended, but those who are extremely determined to stop the links from sending visitors can do it this way.
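To make that concrete, here is a minimal sketch of the .htaccess approach, assuming an Apache server with mod_rewrite enabled; spammy-example.com is a placeholder for the referring domain you want to block:

# Minimal sketch: refuse visitors arriving via links from one referring domain.
# "spammy-example.com" is a placeholder; replace it with the actual domain.
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://([^/]+\.)?spammy-example\.com [NC]
RewriteRule .* - [F]

Even then, this only returns a 403 error to humans who click through from that site. The link itself still exists on the other site, and Google still counts it, so it does nothing to resolve a link-spam problem.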
Related Questions
-
Drastic Drop in Link Juice
Hi. Back in December we shifted my web domain from gourmetdirect.com to gourmetdirect.co.nz as part of a site-wide revamp. Everything was going along fine until recently, when my linking domains plummeted and external links fell from approximately 6,000 to 600. We still have the .com live for loads of dysfunctional reasons. Can anyone help? I have gone from a top ranker to a no-show, and my contractors are all shaking their heads.
Algorithm Updates | GourmetDirect
-
Are links from directories still good practice?
Ok, so I am new at "link building", though of course I have read furiously about how that philosophy has changed: it's a goal, not so much a process. I am focusing on great content, social sharing, etc. BUT, I see competitors still getting links from some of the directories listed on Moz as "good" directories to be listed in, for example Yellow Pages, Manta, iBegin, HotFrog, etc. Do I have the terminology totally twisted here? Is it still good practice to get a couple of links from these directories, or is this practice completely the wrong thing to do post-Panda and post-Penguin? Thanks, guys!
Algorithm Updates | cschwartzel
-
Microsites for Local Search / Location-Based Sites?
Referring to the SEOmoz webinar on local search presented by Nifty Marketing (http://www.seomoz.org/webinars/be-where-local-is-going), I have a question my client asked us regarding why we broke their locations out into microsites rather than just using subfolders.

Here are the details: the client has one main website in real estate, with 5 branches. Each branch covers about a 50-mile radius, and each also covers a specialized niche in its area. When we created the main site, we incorporated the full list of listings on it. We then created a microsite for each branch, each of which has a page of listings (the same as the main site) that includes a canonical link back to the main site. The reason we created a microsite for each branch is that the searches for each branch are very specific to its location, and we felt that having only a subfolder would take away from the relevancy of the site and its location. Now the location sites rank on the first page for their very competitive, location-based searches.

The client, as we encourage, has had recommendations from others saying this is hurting them, not helping them. My question is this: how can this hurt them when the microsites include a home page specific to the location; a contact page optimized with location-specific information (maps, text, directions, NAP, call to action, etc.); a page of area information about communities, events, etc.; a page of the location's agents; and, of course, the real estate listings (with a canonical back to the main site)? Am I misunderstanding? I understood that if the main site could support the separation of a section into a microsite, this would help local search, and local search is the bread and butter of this client's conversions. AND if you tell me we should go back to having subfolders for each location, won't that seriously hurt our already excellent rankings? The client sees significant visitors from the placement of the location URLs. THANKS!
Darlene
Algorithm Updates | gXeSEO
-
Sitemap link in footer? Is it needed?
Hi, I know a sitemap is important to have, as it tells Google which pages to crawl, and I have an XML sitemap for Google to crawl. However, do I need a sitemap link in the footer? Any thoughts? Is there any harm if I don't include a sitemap link in the footer?
Algorithm Updates | pejman50
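As an illustration for the question above: search engines typically discover an XML sitemap through a webmaster-tools submission or through robots.txt rather than through a footer link. A minimal robots.txt sketch, where the sitemap URL is an assumed placeholder:

# Minimal robots.txt sketch; the Sitemap URL below is a placeholder.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
-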
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results", but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site.

I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities. At that time I could see that Google really had indexed all of those URLs, but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down.

At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" just won't change, even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still be reporting a high index count that doesn't reflect what is currently indexed? Thanks!
Algorithm Updates | CassisGroup
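For readers unfamiliar with the canonical fix described above, it is a single link element in the head of each duplicate URL. A minimal sketch, with example.com and the page path standing in as placeholders:

<!-- Placed in the <head> of each query-string variant of a page; -->
<!-- href points at the preferred version that should be indexed. -->
<link rel="canonical" href="http://www.example.com/category/widgets/" />
-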
Big site SEO: To maintain html sitemaps, or scrap them in the era of xml?
We have dynamically updated XML sitemaps which we feed to Google et al. Our XML sitemap is updated constantly and takes minimal hands-on management to maintain. However, we still have an HTML version (which we link to from our homepage), a legacy from back in the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone. So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
Algorithm Updates | linklater
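For reference, the XML sitemap format under discussion is quite small; here is a minimal sketch of a one-URL sitemap, with the URL and date as placeholder values:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional. -->
  <url>
    <loc>http://www.example.com/sample-page/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
-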
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything; just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...

"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them.

The site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now so that each page URL created by the faceted-nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages when I am already doing what they ask in these instances and telling them to ignore the duplicates and find the content on page X.

So at this point I am thinking about possibly using the robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory
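On the robots.txt idea mentioned above, here is a minimal sketch of what blocking faceted-navigation URLs might look like, assuming (hypothetically) that the filters append query-string parameters named color and size; Googlebot supports the * wildcard used here:

# Hypothetical sketch: keep crawlers out of faceted-filter URLs.
# The parameter names "color" and "size" are invented for illustration.
User-agent: *
Disallow: /*?*color=
Disallow: /*?*size=

One caveat worth weighing: once a URL is blocked in robots.txt, Google will no longer crawl it, so it will also stop seeing the rel=canonical tag on those pages.
-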
Are the latest Ranking Reports counting the new large-format sitelinks as positions?
Received my weekly ranking report this morning and noticed a specific keyword I've been ranking in the 3rd or 4th spot for has dropped a significant number of positions. I tested the results myself, and it appears the manufacturer's sitelinks are being counted as positions. My keyword has me in the 3rd position (although it is much lower on the physical page now because of the new format). I'm really wondering how this will affect organic listings going forward - this new format could be a game changer.
Algorithm Updates | longbeachjamie