Risk of Using the "Nofollow" Tag
-
I have a lot of categories (like e-commerce sites do), and many have pages 1-50 per category (a view-all page is not possible). Much of the content on these pages is present on other websites across the web (duplicate listings). I have added quality, unique content to page 1, added "noindex, follow" to pages 2-50, and added rel=next/prev tags to the series.
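For reference, the setup described above would look something like this in the `<head>` of page 2 of a category (the URLs and `?page=` parameter are hypothetical placeholders, not the actual site structure):

```html
<!-- Hypothetical example: <head> of page 2 in a paginated category -->
<!-- Keep this page out of the index, but let Googlebot follow its links -->
<meta name="robots" content="noindex, follow">
<!-- Declare this page's position in the paginated series -->
<link rel="prev" href="https://www.example.com/category/?page=1">
<link rel="next" href="https://www.example.com/category/?page=3">
```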
Questions:
-
By including the "follow" part, Google will read the content and links on pages 2-50 and may think: "we have seen this stuff across the web… low-quality content, and though we see a noindex tag, we will consider even page 1 thin content, because we are able to read pages 2-50 and see the thin content there." So even though I have "noindex, follow", the "follow" part causes the issue (Google concludes the site has a lot of low-quality content). Is this possible, and if I had added "nofollow" instead, would that solve the issue and give page 1 a better chance of looking unique?
-
Why not add "noindex, nofollow" to pages 2-50? That way I ensure Google does not read the content on pages 2-50, and my site may come across as more unique than it would with the "follow" tag. I do understand that in that case (with nofollow on pages 2-50) no link juice flows from pages 2-50 back to the main pages (assuming there are breadcrumbs or other links to the indexed pages), but I consider that of minimal value from an SEO perspective.
-
I have heard that using "follow" is generally lower risk than "nofollow". Does this mean a website with a lot of "noindex, nofollow" tags may hurt its indexed pages because it comes across as a site Google can't trust, since 95% of its pages carry such a tag? I would like to understand what the "risk" factors may be.
thank you very much
-
-
Thx, Alan. Within real estate MLS, if I index all "MLS result pages" (ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/) I will have about 5,000 such pages (I mean 5,000 category pages, with each category often having more than one page). I have added unique, quality content on page 1 of about 300 such MLS result pages, and I have added rel=next/prev. For the other 4,700 pages I currently have "noindex, follow".
Question: is it OK to have such a large number of pages with "noindex, follow", or do I run the risk that Google thinks, "hmmm… though we do not index these, there seems to be a lot of crap on this website… let us lower the ranking even for the quality pages"? Would I simply be better off letting everything index? I am concerned that if I let those pages index, they will dilute the value of my high-quality pages. I am thinking it might be ideal to completely delete those low-relevancy pages from my website (so Google can see my site's value), but users looking to buy real estate would then not see as many listings as on other websites, and that could be a concern.
Any insight appreciated. thx
-
If you use nofollow, then every link pointing to those pages will throw away its link juice; you don't want that.
"Follow" means that link juice will flow through the links back to your indexed pages. Telling Google not to index is doing them a favour, as they don't want duplicates, so I don't think there is any concern. -
It is possible it could be seen that way, yes, but it's generally unlikely. Before you get too far "into" nofollowing links etc., I wanted to make you aware of it.
With the tag, what you're essentially saying is "these pages are all very similar; this is the first one and this is the last one." Google is pretty clever, more so than most people give it credit for. If your site is about real estate, it will know your listings may be seen elsewhere. For example, in the UK we have Rightmove and Zoopla; they both list properties found elsewhere, but they also have value in other aspects of their sites, which is why they work. So as long as your site is not just the duplicate pages, and you provide worthy content in other areas, you should generally be fine. Make the site really helpful for the user and the rest falls into place. You can also take the time to look at how those sites have solved the same problem.
Regarding the 3,000 pages: if you can get some unique content on there, fantastic, but I know it's not always easy. Your original question was about the risk of nofollow; there is no risk with it, so now it's really your choice with the noindex tag. I imagine you can leave it on, but you may risk not being all you can be. I would suggest taking a look at your competitors and other similar sites to get an idea of what they do in this situation.
you might find this answer helpful which is on the same subject - http://moz.com/community/q/real-estate-mls-listings-does-google-consider-duplicate-content
-
http://www.honoluluhi5.com/moana-pacific-i-2901-kakaako-condo-for-sale-201417440/ - I have 3,000+ such property pages, which are shared among real estate firms across the web. Currently I have "noindex, follow". Would you remove that tag and just let the pages index?
-
I am using rel=next/prev. So maybe I should just drop the "noindex, follow" part, though many experts recommend using that tag. However, the issue with these things (rel=next/prev or "noindex, follow") is that Google will read the pages and may think, "hmm… we've seen these real estate listings on many other sites, and we therefore consider this low-quality content…"
But are you saying not to use noindex-type tags because it could be interpreted as sculpting?
-
You want to use the pagination tag like the canonical tag: it will let you index the pages (sort of) while avoiding duplicate content. Noindexing a site is a bit of a waste of SEO effort when there are other solutions, so I'd leave that as a last-ditch effort. If you have unique content on the pages, that's better than none (even if it sits low on the page).
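As a sketch of the distinction being drawn here (the example.com URLs are placeholders): rel=next/prev tells Google a set of pages belongs to one series, while rel=canonical points a duplicate variant at a single preferred URL:

```html
<!-- Pagination: page 2 of a series declares its neighbours -->
<link rel="prev" href="https://www.example.com/condos/?page=1">
<link rel="next" href="https://www.example.com/condos/?page=3">

<!-- Canonical: a duplicate variant points at the preferred URL -->
<link rel="canonical" href="https://www.example.com/condos/">
```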
What you don't want to do is make it look like you're trying to manipulate your link juice / PageRank internally too much.
-
ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you scroll down you will see a lot of high-quality, unique content, including aerial photos which are my company's own. I have 300+ pages like that: unique and very high quality. I am in the process of reducing the size of the map by 75% and moving the unique content much higher up the page, since I fear it is currently placed too low and that could impact ranking.
Also, I currently have "noindex, follow" on pages 2 to n, since all those real estate listings are duplicate content shared across 100+ real estate companies on the web. I am thinking maybe I should make pages 2 to n "noindex, nofollow" so Google does not waste time reading those pages.
Any thoughts highly appreciated... thanks very much
-
I think you've got a bit lost there. Once you add noindex, it makes no difference whether you have nofollow or not. Even if you have bad content, by noindexing most of your site it's almost like you've got a one-page site. I really recommend taking the time to write some content; it pays off down the line and doesn't take as long as you think.
Matt Cutts has said most of the internet is duplicate content, so don't over-analyze it. Links etc. can make a fairly large impact; as long as the bulk of your website is unique and authoritative, you will be on a good road.
-
Noindex and nofollow are nearly the same thing (okay, take that comment with a heap of salt).
-
Link juice would matter: Google is ignoring that part of your site, as you've told it not to index it, so any link juice going that way is just going into a black hole.
-
I think you heard wrong; nofollow is safer than follow because it's like saying "I don't endorse this link", so it doesn't transfer link juice, which reduces any risk. But remember, trying to manipulate link juice on your site is a risky game, and most of the time you will come off worse than if you had just written some content for your products.
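To be clear on terms, the nofollow being discussed can live in two places: on an individual link, or page-wide in a robots meta tag (both shown here as generic examples with placeholder URLs):

```html
<!-- Link-level: "I don't endorse this link"; no juice passes through it -->
<a href="https://www.example.com/some-page/" rel="nofollow">Some page</a>

<!-- Page-level: Googlebot won't index this page or follow any of its links -->
<meta name="robots" content="noindex, nofollow">
```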
I would take a look over here if you needed more reasons not to - https://www.mattcutts.com/blog/pagerank-sculpting/
"Q: Does this mean “PageRank sculpting” (trying to change how PageRank flows within your site using e.g. nofollow) is a bad idea?
A: I wouldn’t recommend it" -