Using rel="nofollow"
-
Hello,
Quick question, really: as far as the SERPs are concerned, if I had a site with, say, 180 links on each page - 80 over the suggested limit - would putting rel="nofollow" on 80 of those links be as good as only having 100 links per page?
Currently I have removed the links, but we really need them, as they point to networked sites that we own and are relevant...
But we don't want to look spammy...
An example of one of the sites without the links can be seen here
whereas a site with the links can be seen here
You can see the links we are looking to keep (at the bottom) and why...
Thanks
-
Sorry, by "bigger problems" I just meant the potential link-farm.
The nofollow will remove the SEO risk - you'll still lose a little link-juice to those links, but you won't get penalized down the road for having them. Of course, you won't gain any SEO value from the cross-linking either. At this point, though, I think that's inevitable. The risk is greater than the reward from cross-linking this many domains.
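For reference, nofollow is applied per link via the rel attribute - the URL below is just a placeholder, not one of the actual networked sites:

```html
<!-- A normal footer cross-link that passes link equity: -->
<a href="http://www.example-network-site.com/">Example Network Site</a>

<!-- The same link with rel="nofollow" added, so it won't pass equity
     or count as an endorsement: -->
<a href="http://www.example-network-site.com/" rel="nofollow">Example Network Site</a>
```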
Any other ways to block the links are going to look more suspicious to Google than nofollow (including iFrames). Any I can think of would be best avoided in this scenario.
Any way you can contextually cross-link would create less SEO risk and potentially let you get some ranking value out of the connections. That's why I suggested links at the job listing level. I think that might benefit users a bit more, too. Even then, you don't want to go overboard.
-
Hi Dr Peter
Thanks for the detailed response. A few questions: you say I have bigger problems than the 100 links/page - is that just because the sites are at risk of looking like a link farm, or are there bigger problems still?
I hear you on the fact that links in the footer carry less weight, and on the consolidation aspect - it is something we are working on. In the meantime, though, I would really like to find a way, if possible, to keep some form of cross-connection between the sites. We do have the detail pages, which we don't need to be SEO primed; really we only need the -
SEO primed. You can see the different phrases (search patterns) that we are targeting; each site has hundreds of pages like this that we don't necessarily need primed, as they are only live for 28 days.
Is there an option to include these links either in an iframe in the footer area (for user reference only) or on the detail pages?
Any other options that will work, that will not result in the sites being at risk of looking like a link farm?
I appreciate your insight.
Many thanks
-
You've got bigger problems here than 100 links/page (that's just a guideline) - you're cross-connecting enough sites that it looks like a link farm. Having them all in the footer only adds to the problem, and makes the tactic look lower-quality. I'd definitely no-follow these, as you could potentially be penalized.
The nofollow won't really help with the 180 links - it'll still burn up link-juice. It'll just keep these links from getting you into trouble. Realistically, these links are probably already being devalued by the algorithm.
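If you want to audit how many followed vs. nofollowed links each page template actually carries, here's a rough sketch using only Python's standard library - the sample HTML is made up for illustration:

```python
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Counts <a href> links, split by followed vs. rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return  # anchors without href aren't links
        # rel can hold multiple space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_tokens:
            self.nofollowed += 1
        else:
            self.followed += 1


# Hypothetical footer markup, standing in for a fetched page:
html = """
<footer>
  <a href="/jobs/kent">Kent Jobs</a>
  <a href="http://www.example.com/" rel="nofollow">Partner Site</a>
  <a href="http://www.example.org/" rel="nofollow noopener">Another Partner</a>
</footer>
"""

counter = LinkCounter()
counter.feed(html)
print(counter.followed, counter.nofollowed)
```

Run against your real templates, a count like this makes it easy to see how far over the ~100-link guideline each page sits.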
Practically speaking, being in the footer, these links may not have a ton of value for visitors (if you click-mapped the page, I'm betting the CTR is very low). I wonder if there's a way to integrate them contextually. For example, when someone clicks through to a job listing in Kent, you could have something like "See more jobs in Kent" on that page and link it to: http://www.kentjobsonline.net/.
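In markup, that contextual approach might look something like this - the surrounding classes and copy are just a sketch, only the Kent URL comes from the thread:

```html
<!-- On a Kent job-listing detail page: one relevant, contextual
     cross-link instead of 180 footer links on every page. -->
<p class="related-jobs">
  Looking for more roles in this area?
  <a href="http://www.kentjobsonline.net/">See more jobs in Kent</a>
</p>
```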
From an SEO standpoint, these geo-targeted microsites have lost a lot of value over the past couple of years. I've even seen the strategy run into Panda issues. You may want to re-evaluate down the road and consider consolidating.
-
They are all legitimate links; all 74 sites act as one larger site, or a network of sites. But the thing is, when I removed the links I moved up in the SERPs...
What would be the best way of showing the links to visitors but hiding them from the search engines?
Thanks
-
Quality is MUCH better than quantity
-
You should build the site for functionality, and if there is a legitimate reason to have all those links, then feel free to put them in. I wouldn't overthink this.
-
I think rel=nofollow doesn't mean much nowadays. With the changes at Google, link juice isn't really affected by it anymore. You can check some info in this old post: http://www.seomoz.org/q/do-you-use-nofollow-and-rel-nofollow
But in SEO everything is gray, not black and white. It's safer for you to stick to 100 links per page.