Should I block non-informative pages from Google's index?
-
Our site has about 1,000 pages indexed, and the vast majority of them are not useful and/or contain little content. Some of these are:
- Galleries
- Pages of images with no text except for navigation
- Popup windows that contain further information about something but have no navigation, and sometimes only a couple of sentences
My question is whether or not I should add a noindex meta tag to these pages.
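To be concrete, by a noindex meta tag I mean the standard robots directive in each page's <head>:

    <meta name="robots" content="noindex">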
I think it would help, because right now the ratio of quality to low-quality pages is poor.
I'm apprehensive, though: if I block more than half my site from Google, won't Google see that as suspicious or as a bad practice?
-
To the spiders, would the content in the lightbox be considered part of the page?
-
I would discriminate between these pages on the basis of income or search engine traffic rather than their informativeness.
I have semi-informative pages that pull lots of traffic and make lots of money, and informative pages that make next to nothing.
-
More a technical answer than an SEO-specific one, but you could place the popup content in a lightbox, similar to your gallery items, with a script like http://fancyapps.com/fancybox/, Colorbox, etc. These allow you to lightbox on-page content in addition to just photos.
That way you could have the price table displayed in the page for non-JavaScript clients, let the lightbox script show it when clicked, and not have to worry about popup blockers or having the popup content live at a separate URL. A rough sketch of the idea follows.
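A minimal sketch, assuming jQuery and the Fancybox 2 plugin are already loaded (the class name and the #price-list id are placeholders):

    <!-- Without JavaScript this is a plain same-page anchor, so the
         table below stays reachable and crawlable as normal content. -->
    <a class="show-prices" href="#price-list">View price list</a>

    <!-- The price table lives in the page itself, not at a popup URL. -->
    <div id="price-list">
      <table>
        <!-- ...price rows... -->
      </table>
    </div>

    <script>
      // Hide the inline block only when JS is running, then let Fancybox
      // open it in a lightbox; the plugin treats an href starting with
      // "#" as inline content.
      $(function () {
        $("#price-list").hide();
        $(".show-prices").fancybox();
      });
    </script>

This degrades gracefully: spiders and non-JS users see the table as on-page content, everyone else gets the lightbox, and no separate popup page ever enters the index.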
-
I know PR sculpting is most commonly done with nofollows, but the same core principle holds: you shouldn't try to steer the spiders out of fear that you're "diluting" the site's value. Doing it with noindex is just as bad as doing it with nofollow, if not worse.
-
When it comes to popups, keep in mind that some users' popup blockers might prevent them from even loading. As it stands, I don't think it matters much whether you noindex these price-list pages or not. You certainly could, since they're not going to appear in any search results and they're not going to attract links.
I would play with ways to improve the user experience, but putting the large tables on the page probably isn't the way to do it. A better option, I think, would be to let the user (somewhere above the fold) select the type (plain/patched/etc.), quantity, and other variables. They would then get a price quote (as on the bottom of the page), along with a button to continue the checkout process or otherwise move to the next step. I'd also display the original price per item crossed out, the phrase "bulk discounts" somewhere close by, and then the new price per item.
Telling people what they need to do next (it took me a while to find where to buy) and simplifying the pricing at the same time could help a lot. I also noticed that the price quote on the contact page seems to be loading inside the same cramped frame.
-
Hi there,
Sorry I didn't see this when I posted. PR sculpting generally refers to the practice of using internal nofollows - which I'm not a fan of either, not least because it doesn't work. I also agree that pages that users could find useful should generally remain in the index.
-
Thanks for that great information. This is a good example of what I'm talking about:
http://www.stadriemblems.com/scouting/neckerchiefs/index.htm
Under "Plain Neckerchief" click on "view pricelist" or "color chart"
So, you think a better practice would be to just include that price list on the same page instead?
-
Hi Marisa,
To determine which pages should be noindexed, first ask yourself whether a user would want to land on the URL in question. Second, is the URL receiving traffic as an organic landing page right now? Third, does the content serve a purpose for the user? Does it need to exist?
If the answer to all of the above questions is "no," then go ahead and noindex the page. If you answer "yes" to any of them, some evaluation is in order: can you add content, improve the navigation and appearance, or otherwise make the page more useful rather than noindexing it?
Generally you can enhance gallery pages for search engines and users by labeling/captioning the images and making sure the alt text is in order. On category pages, add some content, label the products, and give visitors a clear next action.
Do the popups contain useful, non-repeated, or important info? If so, can that content be placed somewhere on the page instead? The only case where I would use a popup and noindex it is when the popup content is optional and duplicated - for example, the often-seen "What's This?" link that explains a field or term repeated across the site, where each instance creates a new URL.
I've never heard of anyone running into problems with Google for noindexing too much stuff. You're essentially just telling them that the page is not good for users to find. You will, however, tend to improve organic traffic and user experience by making each page useful and adding an appropriate amount of content.
Hope that helps,
Carson
-
I'm not a fan of this (commonly called PageRank sculpting). First, you're trying to tell Google what to index and what to ignore. Second, how do you know those pages have no value? What if I found an image in your gallery and linked to it from my blog? Now you're missing out on link juice. It might not be viewed as suspicious, but it won't help your site any.