Internal Linking: Site-wide vs. Content Links
-
I just watched this video in which Matt Cutts talks about the ancient 100 links per page limit.
I often encounter websites that have massive navigation (elaborate main menu, sidebar, footer, super-footer, etc.) in addition to links in the content area.
My question: do you think Google passes votes (PageRank and anchor text) differently for template links, such as navigation, than for links in the content area? If so, have you done any testing to confirm it?
-
He also said: "We invite and strongly encourage readers to test these themselves."
This is what I'm after: personal opinions from people who have either tested this or experienced the effect first-hand.
-
There is a school of thought that it matters whether links appear at the beginning or the end of the body, and whether they sit inside specific tags. But how would a crawler know that a particular link belongs to a navbar, and that it should carry more (or less) weight than other content links?
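For illustration only (search engines don't publish how they segment pages): a crawler can already distinguish a navbar link from a content link purely from where it sits in the markup, e.g. whether an ancestor element is `<nav>`, `<footer>`, or similar. A minimal sketch with Python's built-in `html.parser`:

```python
from html.parser import HTMLParser

# Elements whose descendants we treat as "template" links (an assumption,
# not anything a search engine has documented).
TEMPLATE_TAGS = {"nav", "footer", "header", "aside"}

class LinkClassifier(HTMLParser):
    """Label <a href> links 'template' or 'content' based on open ancestors."""
    def __init__(self):
        super().__init__()
        self.stack = []   # tags currently open at this point in the parse
        self.links = []   # (href, kind) pairs, in document order

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                kind = "template" if TEMPLATE_TAGS & set(self.stack) else "content"
                self.links.append((href, kind))
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag (tolerates unclosed tags).
            while self.stack.pop() != tag:
                pass

html = """
<body>
  <nav><a href="/home">Home</a><a href="/about">About</a></nav>
  <p>See our <a href="/guide">guide</a>.</p>
  <footer><a href="/contact">Contact</a></footer>
</body>
"""

parser = LinkClassifier()
parser.feed(html)
print(parser.links)
```

Here `/home`, `/about`, and `/contact` come out as template links and `/guide` as a content link. Whether Google weights them differently is exactly the open question in this thread; the point is only that the classification itself is easy.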
-
Rand wrote a blog post a while ago about how not all links on web pages are created equal; you might find it interesting:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
-
Thanks for your input!
It seems your vote goes toward all links being treated equally, regardless of their location or function. Interesting... I have a suspicion that there is, or should be, a difference. Why?
Consider this: Google notices 150 sitewide links that appear on every page. Wouldn't it make sense for Google to treat page-specific links differently from sitewide ones? That would actually improve their ranking system (e.g., 150 boilerplate links would no longer dilute the importance of a page-specific link given in the content).
Thoughts?
-
Many of these massive navigations are built in Flash or JavaScript, so Google can't see them as links; it counts them as a single link, or just a reference to a JavaScript file, and nothing more. That's how sites carry massive navigation without Google seeing it. Alternatively, you can set nofollow on your non-preferred links, and Google will analyze your page differently. And the answer is no: links are links everywhere; the only difference is the attributes the link carries. You can test this with tools like a spider-view simulator. Try one on two pages and you'll see there is no difference.
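You can check Ion's two points yourself on any page's raw HTML: a menu rendered by JavaScript leaves no `<a>` tags in the source for a simple crawler to find, and `rel="nofollow"` links are easy to filter out before counting votes. A rough sketch (the URLs and regex are simplified for illustration, not a robust HTML parser):

```python
import re

# What a naive crawler sees in raw, un-rendered HTML.
raw_html = """
<script src="/menu.js"></script>            <!-- JS menu: invisible below -->
<a href="/pricing">Pricing</a>
<a href="/login" rel="nofollow">Log in</a>
"""

# Grab the attribute string of every <a> tag in the raw markup.
anchors = re.findall(r'<a\s+([^>]*)>', raw_html)

# Keep only links without rel="nofollow".
followed = [a for a in anchors if 'rel="nofollow"' not in a]

print(len(anchors), len(followed))  # 2 anchors, 1 followed
```

The JavaScript-built menu contributes no anchors at all, and only the `/pricing` link survives the nofollow filter, which is the sense in which such navigation is "not seen" by a crawler.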
Best,
Ion