Noindex
-
I have been reading a lot of conflicting information on the Link Juice ramifications of using "NoIndex". Can I get some advice for the following situation?
1. I have pages that I do not want indexed on my site. They are lead conversion pages. Just about every page on my site has links to them. If I just apply a standard link, those pages will get a ton of Link Juice that I'd like to allocate to other pages.
2. If I use "nofollow", the pages won't rank, but the link juice evaporates. I get that, so I won't use "nofollow".
3. I have read that "noindex, follow" will block the pages in the SERPs, but will pass Link Juice to them. I don't think that I want this either. If I "dead end" the lead form with no navigation or links, will the juice be locked up on the page?
4. I assume that I should block the pages in robots.txt.
In order to keep the pages out of the SERPs and conserve Link Juice, what should I do? Can someone please give me a step-by-step process, with the reasoning behind each step?
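For reference, the tag-level options being weighed here look roughly like this (the /lead-form path and anchor text are just placeholders):

    <!-- Option 2: rel="nofollow" on each individual link pointing at a conversion page -->
    <a href="/lead-form" rel="nofollow">Request a quote</a>

    <!-- The "noindex + nofollow" combination discussed in the replies below: placed in the
         <head> of the conversion page itself, it keeps that page out of the index and also
         tells Google not to follow any links on it -->
    <meta name="robots" content="noindex, nofollow">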
-
I have a private/login site where all pages are noindex, nofollow. Can I still monitor external site links with Google Analytics?
-
Yes, there is a way to keep them out of the SERPs and restrict them from getting link juice: using noindex + nofollow. But bear in mind that you'll be losing that link juice and impairing its flow throughout your site, besides indicating to Google that you don't "trust" those pages.
A workaround would be consolidating those links.
-
So what you are saying is that there is no way to keep the pages out of the SERPs and restrict them from getting link juice?
This is nuts. My conversion pages will be getting huge amounts of link juice - there are links to them on every page.
I'm not happy about this. Any workarounds?
-
Using robots.txt won't ensure that your pages are kept out of the SERPs, since any external link to those pages could still get them indexed (Google can list a blocked URL without ever crawling it). If you need to make sure, the best option is the noindex meta tag; note that the pages must remain crawlable (i.e. not blocked in robots.txt) for Google to see that tag.
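For illustration, a robots.txt rule like the one below (the /leads/ path is hypothetical) only stops compliant crawlers from fetching those pages; it does not by itself keep the URLs out of the index, and it would prevent Google from seeing a noindex tag on them:

    # Hypothetical sketch: blocks crawling of the lead pages,
    # but does not guarantee they stay out of the SERPs
    User-agent: *
    Disallow: /leads/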
Now, in order not to lose your link juice, you should use "noindex, follow" in your meta tag; that way you're still preventing the pages from being indexed, but you're allowing the juice to flow through them.
If you want to pass as little juice as possible to those pages, try to link to them as little as possible, or consolidate those links on fewer pages throughout your site.
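A minimal sketch of that recommendation, placed in the <head> of each page you want kept out of the index (surrounding markup omitted):

    <head>
      <!-- keeps the page out of the index while still letting value flow through its links -->
      <meta name="robots" content="noindex, follow">
    </head>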
Here's some useful information on the subject:
Google Says: Yes, You Can Still Sculpt PageRank. No You Can't Do It With Nofollow
Link Consolidation: The New PageRank Sculpting
-
Related Questions
-
What's the best way to noindex pages but still keep backlink equity?
Hello everyone, Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any possible link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
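For illustration only, the combination described above would look roughly like this in the <head> of each secondary page (the URL is a placeholder); whether Google honours both signals together is exactly what the question is asking:

    <head>
      <!-- keep this secondary page out of the index -->
      <meta name="robots" content="noindex">
      <!-- and suggest the preferred "main" page as the canonical version (placeholder URL) -->
      <link rel="canonical" href="https://www.example.com/main-page/">
    </head>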
Intermediate & Advanced SEO | | fablau3 -
Are HTML Sitemaps Still Effective With "Noindex, Follow"?
A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?
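For illustration, each of those HTML sitemap pages would carry something like the sketch below (paths are hypothetical): the page itself stays out of the index, while the links on it remain crawlable.

    <head>
      <!-- sitemap page is not indexed, but its links can still be followed -->
      <meta name="robots" content="noindex, follow">
    </head>
    <body>
      <!-- hypothetical slice of one HTML sitemap page: plain crawlable links to inventory pages -->
      <ul>
        <li><a href="/inventory/widget-12345">Widget 12345</a></li>
        <li><a href="/inventory/widget-12346">Widget 12346</a></li>
      </ul>
    </body>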
Intermediate & Advanced SEO | | mothner0 -
Can adding "noindex" help with quality penalizations?
Hello Moz fellows, I have another question about content quality and Panda-related penalization. I was wondering this: if I have an entire section of my site that has been penalized due to thin content, can adding "noindex, follow" to all pages belonging to that section help de-penalize the rest of the site in the short term, while we work to improve those penalized pages, which is going to take a long time? Can that be considered a "short term solution" to improve how the overall site scores in Google's index while we work on those penalized pages, and then, once they are ready, we remove the "noindex" tag? I am eager to know your thoughts on this possible strategy. Thank you in advance to everyone!
Intermediate & Advanced SEO | | fablau0 -
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill. The question is, if I mine the log files and only deindex stuff that Google sends no further traffic to after a year, could this be seen as trying to game the algo or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
Intermediate & Advanced SEO | | AlfredPennyworth0 -
How would you handle this duplicate content - noindex or canonical?
Hello Just trying to look at how best to deal with this duplicated content. On our Canada holidays page we have a number of holidays listed (PAGE A): http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx We also have a more specific Arctic Canada holidays page with different listings (PAGE B): http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx Of the two, the Arctic Canada page (PAGE B) receives a far higher number of visitors from organic search. From a user perspective, people expect to see all holidays in Canada (PAGE A), including the Arctic-based ones. We can tag these to appear on both; however, it will mean that the PAGE B content will be duplicated on PAGE A. Would it be the best idea to set up a canonical link tag to stop this duplicate content causing an issue? Alternatively, would it be best to noindex PAGE A? Interested to see others' thoughts. I've used this (Jan 2011, so quite old) article for reference in case anyone else enters this topic in search of information on a similar thing: Duplicate Content: Block, Redirect or Canonical - SEO Tips
Intermediate & Advanced SEO | | KateWaite0 -
Duplicate on-page content - Product descriptions - Should I Meta NOINDEX?
Hi, Our e-commerce store has a lot of duplicated product descriptions - some of them are default manufacturer descriptions, and some are duplicates because the colour of the product varies, so it's essentially the same product, just in a different colour. It is going to take a lot of man hours to get unique content in place - would a meta NOINDEX on the dupe pages be OK for the moment, which I can then lift once we have unique content in place? I can't 301 or canonicalize these pages, as they are actually individual products in their own right, just with dupe descriptions. Thanks, Ben
Intermediate & Advanced SEO | | bjs20101 -
Noindex, rel=canonical, or no worries?
Hello, SEO pros, We need your help with a case ↓
Introduction: Our website allows individual contractors to create a webpage where they can show what services they offer, write something about themselves and show their previous projects in pictures. All the professions and services are already assigned in our system, so users need to pick a profession and mark all the services they provide, or suggest any we have missed. We have created unique URLs for all the professions and services. We have an internal search field and use autocomplete to direct users to the right page.
Example: PROFESSION Carpenter (URL: /carpenters ) SERVICES Decking (URL: /carpenters/decking) Kitchens (URL: /carpenters/kitchens) Flooring and staircases (URL: /carpenters/flooring-and-staircases) Door trimming (URL: /carpenters/door-trimming) Lock fitting (URL: /carpenters/lock-fitting)
Problem: We want to be found by Google search on all the services and give searchers a list of all carpenters in our database who can provide the service they want to find. We show 15 contractors per page and rank them by recommendations provided by their clients. Our concern is that our results pages may be marked as duplicates, since some of them give the same list of carpenters. All of the best 15 carpenters offer door trimming and lock fitting, so the same 15 are shown on /carpenters, /carpenters/lock-fitting and /carpenters/door-trimming. We don't want to be marked as spammers and lose points on domain trust; however, we believe we provide quality content, since we give searchers what they want to find: contractors who offer the service they need.
Solution? Noindex all service pages to avoid duplicate content being indexed by Google, OR add a rel=canonical tag on service pages pointing to the profession page, e.g. on the /carpenters/lock-fitting page add rel=canonical to /carpenters, OR no worries: allow Google to index all the profession and service pages. The benefit of indexing it all (around 2500 additional pages with different keywords) is greater than tagging service pages with noindex or rel=canonical and losing the opportunity to get more traffic from service titles. We need a solution which would be the best for our organic traffic 🙂 Many thanks for your precious time.
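For illustration, the rel=canonical option described above would put a tag like the following in the <head> of each service page, using the question's own paths (the domain is a placeholder); this sketches the mechanics only, not a recommendation between the three options:

    <!-- on the /carpenters/lock-fitting service page -->
    <link rel="canonical" href="https://www.example.com/carpenters">

Worth noting: rel=canonical is treated as a hint rather than a directive, so if a service page differs enough from its profession page, Google may still index it separately.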
Intermediate & Advanced SEO | | osvaldas0 -
Why do my https pages index while noindexed?
I have some tag pages on one of my sites that I meta noindexed. This worked for the http version, which they are canonicalized to, but now the https:// version is being indexed. The https version is both noindexed and has a canonical to the http version, but they still show up! I even have WordPress set up to redirect all https to http! For some reason these pages are STILL showing in the SERPs, though. Any experience or advice would be greatly appreciated. Example page: https://www.michaelpadway.com/tag/insurance-coverage/ Thanks all!
Intermediate & Advanced SEO | | MarloSchneider0