Prevent link juice from flowing to low-value pages
-
Hello there!
Most websites link to low-value pages in their main navigation (header or footer), which makes those pages reachable from every other page.
I'm thinking in particular of "Conditions of Use" or "Privacy Notice" pages, which have no SEO value.
What I would like is to prevent link juice from flowing into those pages while still keeping the links for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Add a "Disallow" rule for those pages in robots.txt?
- Use JavaScript links that crawlers won't be able to follow?
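For reference, here is what the first two options look like in markup. This is only a sketch: the `/privacy` URL is a hypothetical placeholder.

```html
<!-- Option 1: nofollow on the individual link -->
<a href="/privacy" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag, placed in the <head>
     of the low-value page itself -->
<meta name="robots" content="noindex,nofollow">
```

Option 3 would be a plain-text rule in the site's robots.txt file, e.g. `Disallow: /privacy`.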
-
Hmm, good point. I'd never heard that a privacy policy page could be a trust signal. Is there an article somewhere that discusses this?
That said, I only used those two pages as examples; my question was about preventing link juice from flowing to non-SEO pages in general.
Thanks a lot for your answers!
-
Exactly, and what I also try to explain to people is that a privacy-policy-type page is an additional signal for Google when it tries to understand what type of site you are and how trustworthy it is. Why in the world would you noindex something like that?
-
As I understand it, nofollow still dilutes your link juice even though it does not pass PageRank (theoretically).
Google made this change to combat PageRank sculpting in 2009. Here is a post from Rand about it.
Unless something has changed that I am not aware of, you could put the link in an iframe: Google will not see it, nor will it dilute the PageRank you pass out.
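A minimal sketch of that iframe idea, assuming a hypothetical footer-links.html that holds the low-value links:

```html
<!-- The low-value links live in a separate document, loaded via an
     iframe. On the theory above, crawlers would not count them as
     links on the parent page -- worth verifying before relying on it. -->
<iframe src="/footer-links.html" title="Footer links"></iframe>
```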
-
Great suggestions. I've recently combined some pages (login/register, about/contact/ToS/privacy, and a few others) and have been very happy with the results: it removed 8 links from every page.
I'm also thinking about removing some more links from my product pages, to keep as much juice as possible on those pages. They don't need the same navigation as the homepage.
-
It depends on what your purpose is.
If you want them totally blocked from being indexed, then putting the pages in your robots.txt file or using a robots meta tag would work fine.
If you just want to de-emphasize the pages to the search engines, you can use nofollow or JavaScript links in your footer/header.
One thing we have done is combine some of these pages (terms and privacy) into one page to cut down on the total number of links on each page.
You could also leave the privacy page link off most pages (depending on your site) and only link to it from the pages that collect sensitive data (near the form).
I hope this helps. The main thing to remember is that each site is different, so you will have to adjust your tactics depending on precisely what you are trying to accomplish.
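A sketch of what a JavaScript link (as opposed to a crawlable `<a href>`) might look like. The `/terms` URL and `faux-link` class are placeholders, and a real implementation would need more accessibility work than this:

```html
<!-- A JavaScript-driven "link": no href attribute, so there is
     nothing for a crawler to follow. role/tabindex are a minimal
     nod to keyboard users; sketch only. -->
<span class="faux-link" role="link" tabindex="0"
      onclick="window.location.href='/terms'">Terms of Use</span>
```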