Noindex junk pages with inbound links?
-
I recently came across what is to me a new SEO problem.
A site I consult with has some thin pages with a handful of ads at the top, some relevant local content sourced from a third party beneath that...
and a bunch of inbound links to said pages. Not just any links, but links from powerful news sites. My impression is that said links are paid (sidebar links, anchor text... nice number of footprints.)
Short version: They may be getting juice from these links. A preliminary lookup for one page's keywords in the title finds it top 100 on Google. I don't want to lose that juice, but do think the thin pages they link to can incur Panda's filter. They've got the same blurb for lots of [topic x] in [city y], plus the sourced content (not original...).
So I'm thinking about noindexing said pages to avoid Panda filters.
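If we do go the noindex route, it comes down to a meta robots tag in the head of each thin page. As a sanity check after deployment, here's a minimal Python sketch (the sample page and function names are made up for illustration) that verifies a page's HTML actually carries the directive:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Collect robots directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives.update(d.strip().lower() for d in content.split(","))

def is_noindexed(html: str) -> bool:
    parser = NoindexDetector()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head><body>thin page</body></html>'
print(is_noindexed(page))  # True
```

Using "noindex, follow" rather than plain "noindex" keeps the pages out of the index while still letting crawlers follow links out of them.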
Also, as a future pre-emptive measure, I'm considering figuring out what they did to get these links and aiming to have them removed if they were really paid for. If it was a biz dev deal, I'm open to leaving them up, but that possibility seems unlikely.
What would you do? One of the options I laid out above or something else? Why?
p.s. I'm asking this on my blog (seoroi.com/blog/) too, so if you're up for me to quote you (and link to your site), do say so. You aren't guaranteed to be quoted if you answer here, but it's one of the easier ways to get a good quality link.
p.p.s. Related note: I'm looking for intermediate to advanced guest posts for my blog, which has 2000+ RSS subs. Email me at gab@ my site if you're interested. You can also PM me here on SEOmoz, though I don't log in as frequently.
-
These links likely aren't bringing much if any traffic, so it's a moot point here, imho.
-
Sorry if I was unclear. My thinking was that a high bounce rate probably indicates that many visitors don't find the content relevant. If the inbound links you mentioned are bringing lots of traffic to your pages but people are just bouncing right off the site, the value of those links is greatly diminished. If this is the case, I don't think the pages are worth keeping. If people are actually staying on the site after landing on the page, then I would focus on improving those pages and not worry as much about how they find the pages.
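To make that reasoning concrete: bounce rate here is just the share of sessions that leave after a single pageview. A toy sketch (the session data is invented):

```python
def bounce_rate(sessions):
    """Share of sessions that viewed exactly one page before leaving.
    `sessions` is a list of pageview counts, one entry per session."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pageviews in sessions if pageviews == 1)
    return bounces / len(sessions)

# Hypothetical sessions landing on one of the thin pages:
sessions = [1, 1, 4, 1, 2, 1, 1]
print(f"{bounce_rate(sessions):.0%}")  # 71%
```

If most of the referred traffic looks like this, the links are sending visitors who don't want what the page offers, which is the scenario where the links' value is diminished.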
-
I don't see the connection to bounce rate. Do you mean click traffic or search traffic?
-
I would also be interested to know what people think about this. We have an issue where a few years ago, an SEO firm produced a few dozen "articles" for our site which consisted entirely of keyword-stuffed junk with lots of hidden internal links to other relevant parts of the site. Each page has thousands of links to it from link farms and junk directories.
I suspect that there are actually many legitimate, reputable websites out there who suffer from this problem. Any website with many thousands of pages might very easily conceal the remnants of old, poorly-executed SEO efforts for years, particularly if the people making the SEO decisions are unaware of the difference between black hat/white hat practices. With the release of the farmer update, this could be a big problem.
For our situation, we wrestled with whether we should noindex the pages, remove them and implement a 301 redirect to something more relevant, or just leave them as they are. For now we have left the junk pages alone; only a couple of the pages rank within the first 50 results for their targeted keyword, and the pages receive very little traffic. However, if the pages you are talking about get a lot of traffic with a very high bounce rate, I would probably try something else.
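The rough triage described in this thread can be sketched as a function. The thresholds below are purely illustrative assumptions, not anything Google publishes:

```python
def triage(monthly_visits, bounce_rate, ranks_top_50):
    """Rough triage for a suspect thin page; thresholds are illustrative only."""
    if monthly_visits < 10 and not ranks_top_50:
        return "leave alone or noindex"      # little to lose either way
    if bounce_rate > 0.8:
        return "noindex or 301 to something more relevant"
    return "improve the content and keep indexed"

print(triage(monthly_visits=5, bounce_rate=0.9, ranks_top_50=False))
# leave alone or noindex
```

The point is just that traffic, rankings, and engagement together decide the action, not any one signal on its own.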
Related Questions
-
Paginated Pages Page Depth
Hi Everyone, I was wondering how Google counts the page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep.

Using Moz's blog as an example, is https://mza.seotoolninja.com/blog?page=2 treated as being on the same level in terms of page depth as https://mza.seotoolninja.com/blog? If so, is it the <link rel="prev" href="https://site.com/blog" /> and <link rel="next" href="https://site.com/blog?page=3" /> code that helps Google recognize this? Or does Google treat page depth the same way DeepCrawl shows it, with the blog posts on page 2 being +1 in page depth compared to the ones on page 1, for example?

Thanks, Andy
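For what it's worth, the click depth a crawler reports is normally the shortest-path distance from the homepage, which can be modeled as a breadth-first search over the link graph. A small sketch with a made-up site structure:

```python
from collections import deque

def page_depths(links, start="/"):
    """Breadth-first click depth from `start`; `links` maps URL -> outlinks."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:        # first discovery = shortest path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Hypothetical site: homepage links to /blog, which paginates onward.
links = {
    "/": ["/blog"],
    "/blog": ["/blog?page=2", "/post-a"],
    "/blog?page=2": ["/blog?page=3", "/post-b"],
}
depths = page_depths(links)
print(depths["/post-b"])  # 3
```

Under this model, a post only reachable from page 2 really is one level deeper than a post linked from page 1, which matches what DeepCrawl reports; whether Google weights it the same way is the open question.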
Intermediate & Advanced SEO | AndyRSB
Internal search pages (and faceted navigation) solutions for 2018! Canonical or meta robots "noindex,follow"?
There seems to be conflicting information on how best to handle internal search results pages. To recap - they are problematic because these pages generally result in lots of query parameters being appended to the URL string for every kind of search, whilst the title, meta description and general framework of the page remain the same - which is flagged in Moz Pro Site Crawl as duplicate meta descriptions/h1s etc.

The general advice these days is NOT to disallow these pages in robots.txt anymore - because there is still value in their being crawled for all the links that appear on the page. But in order to handle the duplicate issues, the advice splits into two camps on what to do:

1. Add a meta robots tag with "noindex,follow" to the page
Intermediate & Advanced SEO | SWEMII
This means the page will not be indexed with all its myriad queries and parameters, and so takes care of any duplicate meta/markup issues - but any other links from the page can still be crawled and indexed = better crawling and indexing of the site; however, you lose any value the page itself might bring.
This is the advice Yoast recommends in 2017: https://yoast.com/blocking-your-sites-search-results/ - who are adamant that Google just doesn't like or want to serve this kind of page anyway...

2. Just add a canonical link tag - this will ensure that the search results page is still indexed as well.
All the different query string URLs, and the array of results they serve - are 'canonicalised' as the same.
However, this seems a bit duplicitous, as the results in the page body could all be very different. Also, all the paginated results pages would be 'canonicalised' to the main search page - which we know Google states is not a correct implementation of the canonical tag:
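To illustrate what option 2 amounts to: canonicalising every parameterised search URL to the bare search page is essentially stripping the query string. A quick sketch (example.com is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for_search(url):
    """Illustrative only: point every parameterised search URL at the bare
    search page, mirroring option 2 above. Drops query string and fragment."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_for_search("https://example.com/search?q=red+shoes&page=3"))
# https://example.com/search
```

Seeing it written out makes the objection obvious: a search for "red shoes" on page 3 serves completely different results from the bare search page, yet both claim to be the same document.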
https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html - this picks up on an older discussion here from 2012:
https://mza.seotoolninja.com/community/q/internal-search-rel-canonical-vs-noindex-vs-robots-txt
Where the advice was leaning towards using canonicals, because the user was seeing a percentage of inbound traffic into these search result pages - but I wonder if that is still the case? As that older discussion is now 6 years old, I'm just wondering if there is any new approach, or how others have chosen to handle internal search. I think a lot of the same issues occur with faceted navigation, as discussed here in 2017:
https://mza.seotoolninja.com/blog/large-site-seo-basics-faceted-navigation
Internal links to preferential pages
Hi all, I have a question about internal linking and canonical tags. I'm working on an ecommerce website which has migrated platform (Shopify to Magento), and the website design has been updated to a whole new look.

Due to the switch to Magento, the developers have managed to change the internal linking structure to product pages. The old set up was that category pages (on URLs domain.com/collections/brand-name) for each brand would link to products via the following URL format: domain.com/products/product-name. This product URL was the preferential version that duplicate product pages generated by Shopify would have their canonical tags pointing to. This set up was working fine.

Now what's happened is that the category pages have been changed to link to products via dynamically generated URLs based on the user journey. So products are now linked to via the following URLs: domain.com/collection/brand-name/product-name. These new product pages have canonical tags pointing back to the original preferential URLs (domain.com/products/product-name). But this means that the preferential URLs for products are now NOT linked to anywhere on the website apart from within canonical tags and within the website's sitemap.

I'm correct in thinking that this definitely isn't a good thing, right? I've actually noticed Google starting to index the non-preferential versions of the product pages in addition to the preferential versions, so it looks like Google is perhaps ignoring the canonical tags, as there are so many internal links pointing to non-preferential pages and no on-site links to the actual preferential pages.

I've recommended to the developers that they change this back to how it was, where the preferential product pages (domain.com/products/product-name) were linked to from collection pages. I just would like clarification from the Moz community that this is the right call to make. Since the migration to the new website & platform we've seen a decrease in search traffic, despite all redirects being set up, so I feel that technical issues like this can't be doing the website any favours at all. If anyone could help out and let me know if what I suggested is correct, that would be excellent. Thank you!
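One cheap way to catch this kind of mismatch is to diff the set of canonical targets against the set of internally linked URLs - any canonical target that no on-site page links to is suspect. A sketch with hypothetical URLs standing in for the ones described above:

```python
def orphaned_canonicals(internal_links, canonical_targets):
    """URLs declared as canonical but never linked from any on-site page.
    Both arguments are sets of URL paths."""
    return canonical_targets - internal_links

internal_links = {
    "/collection/brand-a/widget",   # dynamically generated product URLs
    "/collection/brand-b/gadget",
}
canonical_targets = {
    "/products/widget",             # preferential URLs per the canonical tags
    "/products/gadget",
}
print(sorted(orphaned_canonicals(internal_links, canonical_targets)))
# ['/products/gadget', '/products/widget']
```

When every canonical target shows up as orphaned, as here, the canonical tags are asking Google to prefer pages the site itself treats as unimportant - consistent with Google choosing to ignore them.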
Intermediate & Advanced SEO | Guy_OTS
Disavow Links & Paid Link Removal (discussion)
Hey everyone, We've been talking about this issue a bit over the last week in our office, and I wanted to extend the idea out to the Moz community and see if anyone has some additional perspective on it. Let me break down the scenario:

We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO-directory links placed by a previous vendor. Recently, we made a connection to a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed. The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.

**The Issue: **We can pay this ninja $800+ to have him/her remove the links from his directory network, and hope it does the trick. When talking about scaling this tactic, we run into some ridiculously high numbers when you talk about providing this service to multiple clients.

**The Silver Lining: **The disavow links file. I'm curious what the effectiveness of creating this around the 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.

The Debate: Is putting a disavow file together a better alternative to paying for crappy links to be removed? Are we actually solving the bad link problem by disavowing, or just patching it? Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering there has been no manual penalty assessed)?
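For reference, the disavow file itself is just a plain-text list: '#' comment lines, 'domain:' prefixes for whole domains, and bare URLs for individual pages. A small generator sketch (the example domains are invented):

```python
def build_disavow_file(domains, urls=()):
    """Emit disavow-file syntax: '#' comments, 'domain:' prefixes for
    whole domains, bare URLs for individual pages; one entry per line."""
    lines = ["# Disavow file - low-quality directory links"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

print(build_disavow_file(
    domains={"spammy-directory.example", "link-farm.example"},
    urls={"http://okay-site.example/bad-page.html"},
))
```

Since directory networks tend to link from many pages, disavowing at the domain: level is usually more practical than listing individual URLs.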
Intermediate & Advanced SEO | Etna
Home Page or Internal Page
I have a website that deals with personalized jewelry, and our main keyword is "Name Necklace".
Intermediate & Advanced SEO | Tiedemann_Anselm
3 months ago I added a new page: http://www.onecklace.com/name-necklaces/ and since then Google indexes only this page for my main keyword, not our home page.
Because the page is new and doesn't have many links to it, our rank is not great. I'm considering removing this page (301 to the home page), because I think it would be better if Google indexed our home page for this keyword. I'm not sure this is a good idea, but I know our home page has a lot of good links, so maybe our rank would be higher. Another thing: because Google indexes this internal page for the keyword, it looks as if our home page has no main keyword at all. BTW, before I added this page, Google indexed our main page for this keyword. Please advise...
Unnatural Inbound Links Warning in GWT
Hi all, A bit of a long question so apologies in advance, but please bear with me... My client has received an 'Unnatural Inbound Links' warning and it is now my task to try and resolve it through a process of:

1. Highlighting the unnatural links
2. Requesting that the links be removed (via webmaster requests)
3. Possibly using the Disavow Tool
4. Submitting a Reconsideration Request

So I downloaded my client's link profile from both OSE and GWT in CSV format and compared - the number of links returned was considerably higher in GWT than in OSE...? So I set about going through the links, first filtering them into order so that I could see blocks of links from the same URL, and highlighted in colours:

Red - Definitely needs to be removed
Orange - Suspect, need to investigate further
Yellow - Seems to be OK but may revisit
Green - Happy with the link, no further action

So to my question: is it 'black & white' - a case of 'good link' v 'bad link' - or could there be some middle ground? (Am I making this process even more confusing than it actually is?) As an example, here are some 'Orange' URLs:

http://www.24searchengines.com/ (not the exact URL, as it goes to the travel section, which is my client's niche) - this to me looks spammy and I would normally 'paint it red' and look to remove it. However, when I go to the 'contact us' page (http://www.24searchengines.com/texis/open/allthru?area=contactus) and follow the link to remove from the directory, it takes me here: http://www.dmoz.org/docs/en/help/update.html - DMOZ???

My client has a whole heap of these types of links:

http://www.25searchengines.com/
http://www.26searchengines.com/
http://www.27searchengines.com/
http://www.28searchengines.com/

...and many many more!! Here is another example:

http://foodys.eu/
http://foodys.eu/2007/01/04/the-smoke-ring-bbq-community/

...plus many more...
My client is in the 'cruise niche', and as there is a 'cruise' section on that site, I'm not sure whether this constitutes a good, bad or indifferent link! Finally, prior to me working with this client (1 month), they moved their site from a .co.uk to a .com domain and redirected all links from the .co.uk to the .com (according to GWT, over 16k have been redirected). A lot of these 'spammy' links were to the .co.uk and have thus been redirected - should I even consider removing the redirection, or will that have severe consequences? Apologies for the long (long) post, I know I'm heading in the right direction but some assurance wouldn't go amiss! 🙂 Many thanks Andy
Intermediate & Advanced SEO | TomKing
On Page vs Off Page - Which Has a Greater Effect on Rankings?
Hi Mozzers, My site will be migrating to a new domain soon, and I am not sure how to spend my time. Should I be optimizing our content for keywords, improving internal linking, and writing new content - or should I be doing link building for our current domain (or the new one)? Is there a certain ratio that determines rankings which can help me prioritize these to-dos?, such as 70:30 in favor of link-building? Thanks for any help you can offer!
Intermediate & Advanced SEO | Travis-W
Are there any negative effects to using a 301 redirect from a page to another internal page?
For example, from http://www.dog.com/toys to http://www.dog.com/chew-toys. In my situation, the main purpose of the 301 redirect is to replace the page with a new internal page that has a better optimized URL. This will be executed across multiple pages (about 20). None of these pages hold any search rankings but do carry a decent amount of page authority.
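One thing worth checking before flipping the switch on ~20 redirects: make sure no new 301 points at a URL that is itself redirected, since chains waste a hop of the authority you're trying to pass. A quick sketch (the URLs are illustrative):

```python
def find_redirect_chains(redirects):
    """Return source URLs whose 301 target is itself redirected (a chain).
    `redirects` maps old URL path -> new URL path."""
    return sorted(src for src, dst in redirects.items() if dst in redirects)

redirects = {
    "/toys": "/chew-toys",
    "/chew-toys": "/dog-chew-toys",   # chain: /toys -> /chew-toys -> /dog-chew-toys
}
print(find_redirect_chains(redirects))  # ['/toys']
```

Any source the check flags should be re-pointed directly at the final destination so each old URL resolves in a single hop.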
Intermediate & Advanced SEO | Visually