First link importance in the content
-
Hi, do you have an opinion on this point, mentioned by Matt Cutts in 2010:
Matt made a point to mention that users are more likely to click on the first link in an article as opposed to a link at the bottom of the article. He said put your most important links at the top of the article. I believe it was Matt hinting to SEOs about this.
http://searchengineland.com/key-takeaways-from-googles-matt-cutts-talk-at-pubcon-55457
I've asked about this in private, and Michael Cottam told me he read a study a year ago indicating that the link juice passed to other pages diminishes the further down the page the link sits. But he can't find it anymore!
Do you remember this study, and do you have the link?
What is your opinion on Matt's point?
-
Thanks for your answers. I think the first link carries more weight for Google, just as it does for the user. Too bad the study can't be found anymore!
-
It also supports Google's "above the fold" algorithm update. Get your relevant content above the fold (links too). Think of the fold as the area of your monitor that you can see without scrolling down the page. That's why the top of page 1 pays the money and value diminishes as you go down the page.
Google ran a series of tests last year where AdWords ads in the right-hand space on the page alternated with the space at the bottom of the page. We had structured our AdWords ads to sit at the top of the page on the right and were pissed off when they moved our ads to the bottom of the page. We wanted our ads to be seen without people having to scroll down the page.
Granted, there are a lot of different monitors, and Webmaster Central has tools for testing how pages look, but consider your own browsing habits.
People tend to take the path of least resistance (and viewer patience is growing shorter and shorter as the months go by).
-
Hi Baptiste
A good question.
Check out an awesome blog post from Rand from back in May 2010, entitled "All Links are Not Created Equal: 10 Illustrations on Search Engines' Valuation of Links"; you'll see that Topic Number 1 provides some great information specific to your question.
I believe that on the whole (as in more often than not, but not always) visitors are more likely to click on the first link as opposed to the second, third...
As the most important content often sits towards the beginning of a page, generally speaking it's logical that the first link would be deemed more important than the second, third... and would therefore pass on more of any available link juice.
Of course, relevance and context also play a part; there is no absolute answer one way or the other.
On a closely related topic of "multiple links", check out these two blog posts here on SEOmoz:
- Results of Google Experimentation - Only the First Anchor Text Counts
- 3 Ways to Avoid the First Link Counts Rule
In summary, "Google does not appear to count multiple links to the same target page from a single page", which I believe is still true today.
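For illustration only, here's a rough sketch (plain Python, using the standard library's html.parser; the class name, markup and URLs are all made up) of how you could list, for each target URL on a page, just the first anchor text a crawler meets in source order, which per the posts above appears to be the anchor that counts:

```python
from html.parser import HTMLParser

class FirstAnchorCollector(HTMLParser):
    """Record only the first anchor text seen for each href, in source order."""

    def __init__(self):
        super().__init__()
        self.first_anchor = {}     # href -> first anchor text encountered
        self._current_href = None
        self._current_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Ignore any later links to a target we have already seen.
            if href is not None and href not in self.first_anchor:
                self._current_href = href
                self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.first_anchor[self._current_href] = "".join(self._current_text).strip()
            self._current_href = None

# Made-up markup with two links to the same target page.
html = """
<p><a href="/widgets">blue widgets</a> are great.
Read more about <a href="/widgets">our widget range</a>.</p>
"""

collector = FirstAnchorCollector()
collector.feed(html)
for href, text in collector.first_anchor.items():
    print(href, "->", text)   # prints: /widgets -> blue widgets
```

Run against your own templates, a check like this makes it easy to spot where a navigation or logo link "uses up" the anchor text for an important target before the in-content link appears.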
I hope that helps,
Regards
Simon
-
It makes sense to me; I would have to agree. When it comes to SEO, logical is the way to go.
Related Questions
-
Did Google Ignore My Links?
Hello, I'm a little new to SEO, but I was featured (around 2 yrs ago) on some MAJOR tech blogs. For some reason, however, my links haven't been picked up for over 2 years - not even in Moz or other link checker services. - By now I should have had an amazing boost from this natural link building, but I'm not sure what happened. These were completely white hat, natural links. The links were added after the article was created though; would this affect things? - Please let me know if you have any advice! - Maybe I need to ping these somehow or something? - Are these worthless? Thanks so much for your help! Here are some samples of the links that were naturally given to http://VaultFeed.com http://thenextweb.com/microsoft/2013/09/13/microsoft-posts-cringe-worthy-windows-phone-video-ads-mocking-apple/ http://www.theverge.com/2013/9/15/4733176/microsoft-says-pulled-iphone-parody-ads-were-off-the-mark http://www.theregister.co.uk/2013/09/16/microsoft_mocks_apple_in_vids_it_quickly_pulls/ http://www.dailymail.co.uk/sciencetech/article-2420710/Microsoft-forced-delete-cringe-worthy-spoof-videos-mocking-new-range-iPhones.html And a LOT more... Not sure if these links will never be valid, or maybe I'm doing something completely wrong? - Is there any way for Google to recognize these now, so they'll be seen by Moz and other sites too? I've done a LOT of searching and there's no definitive advice I've seen for links that were added after the URL was first indexed by Google.
Intermediate & Advanced SEO | DByers0 -
Thin Content to Quality Content
How should I modify content from thin to high-quality content? I realized that the pages where the targeted keywords didn't have much keyword density lost a massive amount of ranking after the last update, whereas all the pages which had the keyword density are ranking well. My concern is that the pages which are ranking well had all the keywords in a single statement, like "Get ABC pens, ABC pencils, ABC colors, etc." at the end of 300 words of content describing ABC, whereas the pages which dropped in rankings had a single keyword repeated just twice in a 500-word article. Can this be the reason for a massive drop? Should I add a single statement like the one on the pages that are ranking well? Is it enough to add just a single line once the page is indexed, or do I need to get fresh content once again along with the keyword sentence I mentioned above?
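For reference, "keyword density" as used above is just the share of the words on the page that the phrase accounts for. Here is a rough sketch of that calculation (Python, with made-up copy); it only illustrates the measurement and is not an endorsement of density as a ranking factor:

```python
import re

def keyword_density(text, phrase):
    """Share of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words)

# Made-up copy of the kind described in the question.
copy = "Get ABC pens, ABC pencils and ABC colors at the ABC store."
print(round(keyword_density(copy, "ABC") * 100, 1))   # prints 33.3 (percent)
```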
Intermediate & Advanced SEO | welcomecure1 -
Links from new sites with no link juice
Hi Guys, Do backlinks from a bunch of new sites pass any value to our site? I've heard a lot of "SEO experts" say that it is an effective link building strategy to build a bunch of new sites and link them to our main site. I highly doubt that... To me, a new site is a new site, which means it won't have any backlinks in the beginning (most likely), so a backlink from that site won't pass much link juice. Right? In my humble opinion this is not a good strategy any more... if you build new sites just for the sake of getting links, that is just wrong. But if you do have some unique content you want to share with others on a particular topic, then you can definitely create a blog, write content and start earning links. And over time, as the domain authority increases, a backlink from that site will become more valuable? I am not an SEO expert myself, so I am eager to hear your thoughts. Thanks.
Intermediate & Advanced SEO | witmartmarketing0 -
Copying my Facebook content to website considered duplicate content?
I write career advice on Facebook on a daily basis. On my homepage users can see the most recent 4-5 posts (using the FB social media plugin). I am thinking of creating a page on my website where visitors can see all my previous FB posts. Would this be considered duplicate content if I copy and paste the info, whereas if I use the Facebook social media plugin it is not considered duplicate content? I am working on increasing the content on my website and feel incorporating FB feeds would make sense. Thank you
Intermediate & Advanced SEO | knielsen0 -
What are your thoughts on using Dripable, VitaRank, or a similar service to build URL links to dilute a link profile?
One of my sites has a very spammy link profile; the top 20 anchors are money keywords. What are your thoughts on using Dripable, VitaRank, or a similar service to help dilute the link profile by building links with bare URLs, "Click Here", "More Info", etc. as the anchor text? I have been building URL links already, but due to the site's age (over 12 years) the number of exact-match anchor text links is very large and would take forever to dilute.
Intermediate & Advanced SEO | 858-SEO0 -
Domain Links or SubDomain Links, which is better?
Hi, I only now found out that www.domain.com and www.domain.com/ are different. Most of my external links are directed to www.domain.com/,
which I understand is considered the subdomain and not the domain. Should I redirect? (And if so, how?)
Should I post new links only to my domain?
Intermediate & Advanced SEO | BeytzNet0 -
First Link Priority question - image/logo in header links to homepage
I have not found a clear answer to this particular aspect of the "first link priority" discussion, so wanted to ask here. Noble Samurai (makers of the Market Samurai SEO software) just posted a video discussing this topic, referencing specifically a use case where, when you disable all the CSS and view the page the way Google sees it, you often find that companies use an image/logo in their header which links to their homepage. In my case, if you visit our site you can see the logo linking back to the homepage, which is present on every page within the site. When you disable the styling and view the site in a linear path, the logo is the first link. I'd love for our first link to our homepage to include primary keyword phrase anchor text. Noble Samurai (presumably SEO experts) posted a video explaining this specifically http://www.noblesamurai.com/blog/market-samurai/website-optimization-first-link-priority-2306 and their suggested code implementations to "fix" it http://www.noblesamurai.com/first-link-priority-templates which use CSS and/or JavaScript to alter the way it is presented to the spiders. My web developer referred me to Google's Webmaster Central: http://www.google.com/support/webmasters/bin/answer.py?answer=66353 where they seem to indicate that this would be attempting to hide text/links. Is this a good or bad thing to do?
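As a rough way to see the problem described above for yourself, the sketch below (plain Python, standard-library html.parser; the class name, markup and homepage path are invented) lists every link to the homepage in the order a crawler meets it in the source, flagging anchors that contain only an image:

```python
from html.parser import HTMLParser

class HomepageLinkOrder(HTMLParser):
    """List links to the homepage in the order a crawler meets them, flagging image-only anchors."""

    def __init__(self, homepage_hrefs):
        super().__init__()
        self.homepage_hrefs = set(homepage_hrefs)
        self.found = []               # (position, anchor description) in source order
        self._inside = False
        self._text = []
        self._has_img = False
        self._position = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href") in self.homepage_hrefs:
            self._position += 1
            self._inside, self._text, self._has_img = True, [], False
        elif tag == "img" and self._inside:
            self._has_img = True

    def handle_data(self, data):
        if self._inside:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._inside:
            text = "".join(self._text).strip()
            self.found.append((self._position, text or ("[image only]" if self._has_img else "[empty]")))
            self._inside = False

# Made-up page: logo link in the header, keyword-rich link later in the copy.
page = """
<div id="header"><a href="/"><img src="logo.png" alt="Acme"></a></div>
<p>Read more about <a href="/">blue widget reviews</a>.</p>
"""

parser = HomepageLinkOrder(homepage_hrefs=["/"])
parser.feed(page)
print(parser.found)   # [(1, '[image only]'), (2, 'blue widget reviews')]
```

If the image-only logo link comes first, the keyword-rich in-content link is the one that risks being discounted under the first-link-priority behaviour discussed in this thread.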
Intermediate & Advanced SEO | dcutt0
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus0
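For anyone in a similar position to the robots.txt question above, here's a rough sketch (Python, standard-library urllib.robotparser; the rules and URLs are invented) of how to check which result URLs a Googlebot-like crawler may fetch under a given set of rules. Note that the stock parser only does simple prefix matching, so Google's * and $ wildcard extensions aren't modelled here:

```python
from urllib.robotparser import RobotFileParser

# Made-up rules of the kind described above: keep crawlers out of
# paginated and re-sorted variants of the search results.
rules = """\
User-agent: *
Disallow: /search/page/
Disallow: /search/sort-by-price/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Made-up URLs to test against those rules.
candidates = [
    "http://www.example.com/search/widgets",                # page 1 of a result set
    "http://www.example.com/search/page/2?q=widgets",       # deeper pagination
    "http://www.example.com/search/sort-by-price/widgets",  # re-sorted variant
]

for url in candidates:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(verdict, url)
```

This only tells you what is crawlable under the rules; it says nothing about how link juice flows or whether the blocked depth makes the site look thin, which are the separate concerns raised in the question.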