Why are the bots still picking up so many links on our page despite us adding nofollow?
-
We have been working to reduce our on-page links issue. On a particular type of page the problem arose because we automatically link out to relevant content. When we added nofollow to these links it resolved the issue for some pages but not all, and we can't figure out why it was not successful for every one. Can you see any issues?
Example of a page where nofollow did not work for...
http://www.andor.com/learning-academy/4-5d-microscopy-an-overview-of-andor's-solutions-for-4-5d-microscopy
-
ahhh, duh! Dr. Pete shed light on what we should be thinking about here. You're not getting warnings for passing out too much PR but for having too many links. He's right; nofollow will not stop them from being counted. Nofollow stops PR from being passed.
Link equity is a broader concept than PageRank. Link equity considers relevance, authority and trust, link placement, accessibility, any value of relevant outbound links, etc. It sounds as if you need to focus more on how you implement the links on your site.
If you need to reduce links, as mentioned earlier, load them via AJAX from an external file if those links are needed on the page. If they don't offer any value, then remove them. I viewed your page earlier but cannot access it now; the links didn't appear to help the user experience anyway. Often what's good for the user is good for Google.
-
The main issue with too many on-page links is just dilution - there's not a hard limit, but the more links you have, the less value each one has. It's an unavoidable reality of internal site architecture and SEO.
Nofollow has no impact on this problem - link equity is still used up, even if the links aren't followed. Google changed this a couple of years back due to abuse of nofollow for PageRank sculpting.
Unfortunately, I'm having a lot of issues loading your site, even from Google's cache, so I'm not able to see the source code first-hand.
-
I don't see 197 on that page; I only see 42 external followed links. See the screenshot below:
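As a rough sanity check, you can tally followed vs. nofollowed links yourself rather than relying on the tool's number. A minimal sketch using Python's stdlib parser (the sample HTML fragment and hostnames are invented for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts <a href> links, splitting followed external vs. nofollowed."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.followed_external = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        if "nofollow" in (attrs.get("rel") or "").lower():
            self.nofollowed += 1
            return
        host = urlparse(href).netloc
        # Links with no host are relative, i.e. internal
        if host and host != self.site_host:
            self.followed_external += 1

# Hypothetical page fragment for illustration
html = """
<a href="http://vendor-a.example/paper" rel="nofollow">Paper</a>
<a href="http://video-host.example/clip1">Video 1</a>
<a href="http://video-host.example/clip2">Video 2</a>
<a href="/learning-academy/overview">Internal link</a>
"""

counter = LinkCounter("www.andor.com")
counter.feed(html)
print(counter.followed_external, counter.nofollowed)  # → 2 1
```

Running something like this against the live source is a quick way to see whether the crawler's count and your own count even agree.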
-
This suggestion to AJAX the tabs would put the content in a separate file, which would be a great way to guarantee a reduction in on-page links!
Also, the suggestions to clean up those meta tags and the massive ViewState are spot on. A little optimization will go a long way toward ensuring the bots crawl all your pages. If you do have speed issues and crawl errors, it could be that the bots are not getting to subsequent pages to read your nofollows. Just a consideration of the whole pie.
-
Yes, it would nofollow all the links.
To address the mystery, are you sure your other pages have since been crawled? Or is it that you are still getting warnings after subsequent crawls?
-
Whoa! Your view state is HUGE (That's what she said).
I couldn't decode it, but somewhere along the line the programmer didn't turn off session management and, likely, an entire copy of the page is encoded in the view state. This is causing load-speed issues.
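To see how much of the payload the view state is eating, you can pull the `__VIEWSTATE` field out of the page source and measure it. A minimal sketch (the sample markup is invented for illustration, and real pages may order the attributes differently, so treat the regex as a starting point):

```python
import re

def viewstate_size(html: str) -> int:
    """Return the character length of the __VIEWSTATE value, or 0 if absent."""
    m = re.search(r'id="__VIEWSTATE"[^>]*value="([^"]*)"', html)
    return len(m.group(1)) if m else 0

# Hypothetical page fragment: a 50 KB view state blob
sample = '<input type="hidden" id="__VIEWSTATE" value="' + "A" * 50000 + '" />'
print(viewstate_size(sample))  # → 50000
```

If the number that comes back is a large fraction of the total page size, the view state is almost certainly what's dragging down load speed.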
-
Your meta tags are in more trouble than your link count:
<meta id="MetaDescription" name="DESCRIPTION" content="Page Details" />
AND
<meta name="Description" />
I see you are using DNN: what version and what module are you using? There are a ton of things one can do in DNN to make it more SEO-friendly.
-
My suggestion is to try AJAXing the tabs. If the outbound links are more of a concern than the keywords of the link, AJAX loading of the tab content would remove them from consideration. Google won't index content pulled in from an external source.
However, be careful to put a rel="nofollow" on the link that loads the content as you don't want SEs indexing the source.
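A minimal sketch of what such a tab trigger might look like (the file path, class name, and element id are all hypothetical):

```html
<!-- Tab content lives in a separate file, not in the initial HTML -->
<a href="/ajax/related-content.html" rel="nofollow" class="tab-loader">Related content</a>

<!-- Empty panel, filled in by script after the user clicks the tab -->
<div id="tab-panel"></div>
```

The links inside `related-content.html` never appear in the page's initial source, so they drop out of the on-page link count, and the `rel="nofollow"` on the trigger keeps the source file itself out of consideration.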
Do not put a meta nofollow in the head, it will kill all links on the page and seriously mess up your link flow. Your use of rel="nofollow" is correct in the context of the specific link tags.
I wouldn't sweat the sheer number of links - the 100 count is a leftover from the days when spiders only downloaded 100k from the page. It has since risen to the point that the practical limitation of over 100 links is more pressing (i.e., do your visitors actually value and use that many links?)
If each link is valuable and usable, no need to worry. If not, perhaps there is a structural way to reduce the count.
Also, load the footer by AJAX onscroll or on demand. Assuming all of the pages can be found in the top navigation, the bottom links are just exacerbating your issues. Primarily, this section is giving far too much weight to secondary or auxiliary pages.
For instance, your Privacy Policy only needs to be linked to where privacy is a concern (i.e., the contact form). It's good to put it on the home or about pages too if you have a cookie policy.
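A minimal sketch of deferring the footer until it scrolls into view, for modern browsers with `IntersectionObserver` support (the element id and the endpoint URL are hypothetical):

```html
<div id="footer-links"></div>
<script>
  // Fetch the footer markup only when its placeholder scrolls into view
  const target = document.getElementById('footer-links');
  new IntersectionObserver((entries, obs) => {
    if (entries[0].isIntersecting) {
      fetch('/ajax/footer-links.html')
        .then(r => r.text())
        .then(html => { target.innerHTML = html; });
      obs.disconnect();
    }
  }).observe(target);
</script>
```

Because the footer links are absent from the initial HTML, they stop inflating the on-page link count, while visitors who actually scroll down still get them.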
-
Hi Karl,
Would this suggestion not stop crawling to all links on the page?
Also, the issue is we have seen the rel='nofollow' work on other pages and reduce our warnings but then for some pages it has not. This is where the mystery lies.
-
It may be how the nofollow attribute is formatted? It should be rel="nofollow", and yours is rel='nofollow'.
-
Hi James,
Thanks for responding. The issue is that we are still getting a count of 197 on-page links when there are not that many links on the page.
-
What do you mean the nofollow did not work? I noticed on the example page that some of your external links in the papers section are nofollow while the videos are not nofollowed.