Link juice and max number of links clarification
-
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
- 50 PR page
- 10 links on page
- 50 / 10 = 5; 5 * .9 = 4.5 PR goes to each link.
Correct?
If so, and knowing Google stops counting links somewhere around 100, how would it impact the flow to have over 100 links?
I.e.:
- 50 PR page
- 150 links on the page
- .33 * .9 ≈ .30 PR to each link, BUT only for 100 of them.
After that, the juice is just lost?
Also, I assume Google, to the best of its ability, organizes the links in order of importance such that content links are counted before footer links etc.
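The simple model sketched above can be written out in a few lines. This is only the naive even-split model from the question, with the assumed 0.9 damping factor; real PageRank is computed iteratively and almost certainly weights links unevenly:

```python
def juice_per_link(page_pr, num_links, damping=0.9):
    """Naive 'link juice' model: a page's PR is split evenly across
    its outgoing links, then reduced by a damping factor per hop."""
    return page_pr / num_links * damping

print(juice_per_link(50, 10))   # 4.5 PR to each of the 10 links
print(juice_per_link(50, 150))  # ~0.3 PR to each of the 150 links
```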
-
As always in the SEO industry, there's no right answer for any particular case, but I think you've got a really structured approach to it. It would be great to know the results of your experiment. This could be a really good article for the SEOmoz community, so let me know how it goes!
-
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be an additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make it part of my case, however.
-
I think that link #102 may have the same value as link #35; I don't think that adding many links diminishes the value of each one. What I assume, however, is that:
- having many links on one page diminishes the control you have over them, so Google may crawl only some of them and give a different weight to each one. That's why I'd rather put fewer links
- you're right that having more links to your pages increases the chance of those pages ranking in a better position against others. However, as I said before, beware that Google may not crawl all your links all the time. You can achieve the same proportion of importance with fewer links (e.g. 10 links vs 2 is the same ratio as 100 vs 20): same weight, more control, and less internal spam risk.
- be wise when you build your links and try not to use too many anchor-rich links. Even for onsite links, you don't want to let Google think you're trying to over-optimize your page or its backlink profile. Create variations of your anchors and use them all.
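The proportion point above can be checked with a quick sketch (hypothetical numbers, assuming the naive even-split model): 10 links vs 2 gives each group exactly the same share of a page's outgoing weight as 100 vs 20.

```python
def shares(link_counts):
    """Fraction of a page's outgoing weight each group of links
    receives, assuming every link gets an equal share."""
    total = sum(link_counts)
    return [count / total for count in link_counts]

print(shares([10, 2]))    # [0.833..., 0.166...]
print(shares([100, 20]))  # identical proportions, with ten times the links
```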
-
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product, each being a full product page. On some pages there are only 40 links; on others there can be upwards of 1,000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds.
But if Google followed them all, then conceivably it would build a stronger "organic" structure for the catalogue, as important products would get thousands of links and others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above, "Google might choose not to follow or to index all those links." He also mentions them being a spam signal, so I think it's still wise to keep link counts low even if the 100KB limit has been lifted. Clearly there are still ramifications, a concept reinforced by this site's reports and comments.
To my question: from what both of you have said, it doesn't appear there is strong evidence that a very high number of links directly causes an additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts; my focus always starts with the end user. But I'd hate to have a structural item that causes undue damage.
-
The context is a parts page where potentially hundreds of links could be associated with other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it were accelerating the decay of link juice, all the more reason. If not, the links may actually help certain products appear organically stronger (i.e. a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
-
As eyepaq said, the 100-link limit is no longer the case. However, even if Google is able to give value to them all, does it really make sense to have so many links on your page? Are you using fat footers? Don't rely on that structure to give value to your internal pages; if you find 100 links on one page are needed for users to navigate your site, try to restructure it a little and create different categories.
I don't know how much value is lost after 100 links, but you should try to have smaller, themed lists of links, adding a further step to your navigation. Google won't give the same value to those pages, as users won't either.
-
Hi,
You shouldn't count those at all. If you get stuck counting and calculating PR and how much PR is passed from one page to another, you will lose focus on what does matter. This doesn't.
About the 100 links per page: that was a very old technical limitation on Google's side. That is no longer the case.
See more here: http://www.mattcutts.com/blog/how-many-links-per-page/
and a quick two-or-so-minute video from Matt Cutts here: http://www.youtube.com/watch?v=l6g5hoBYlf0
So the bottom line is that you should not count and focus on PR and how much PR is passed. Only look at things as a normal user would and ask yourself: does this page make sense? Does it make sense to have over 100 links on this page?
Not sure if this was the answer you were looking for, but... hope it helps.
Cheers.
-
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100 links.
Does the remaining juice get reallocated or does the page leak at a higher rate?
-
Hi Spry, as you already mentioned, not all links have the same weight; there are navigational links, like those in the footer and the menu, and Google may give different weights to them. Moreover, some value may be reduced, and there are other factors Google uses to weight each link on a page that we don't know about but may assume exist.
So, given that we can only calculate an approximate value of the juice passed from one link to another, I wouldn't rely so much on PR; the time you're spending on these calculations could be given to other tasks. In general, you may assume that the best pages to obtain links from are the pages nearest to the homepage of a site and those with the fewest outgoing links (both internal and external).
Don't rely so much on PR. I've seen so many low-PR pages ranking well and high-PR pages with no rankings that I think you need to consider other parameters that are more important when it comes to link building: age of the domain, authority, topical relevance, etc.
If your calculations are made for onsite optimization, just try to have your main pages higher in your site structure and linked directly from the homepage or from main categories.