Best practices for robots.txt -- allow one page but not the others?
-
So, we have a page like domain.com/searchhere, and its search results are being crawled (they shouldn't be); the result URLs look like domain.com/searchhere?query1. If I block /searchhere?, will that also block crawlers from the single page /searchhere? I still want that page to be indexed.
What is the recommended best practice for this?
-
SEOmoz used to use Google Search for the site. I am confident Google has a solid method for keeping their own results clean.
It appears SEOmoz recently changed their search widget. If you examine the URL you shared, you'll notice none of the search results actually appear in the HTML of the page. For example, load the view-source URL and perform a find (Ctrl+F) for "testing", the subject of the search. There are no matches. Since the results are not in the page's HTML, they would not get indexed.
-
If Google is viewing the search result pages as soft 404s, then yes, adding the noindex tag should resolve the problem.
-
And because Google can currently crawl these search result pages, a number of soft 404 pages are popping up. Would adding a noindex tag to these pages fix the issue?
-
Thanks for the links and help.
How does SEOmoz keep search results from being indexed? They don't block search results with robots.txt, and it doesn't appear that they add the noindex tag to the search result pages (ex: view-source:http://www.seomoz.org/pages/search_results#stq=testing&stp=1).
-
Yeah, but Ryan's answer is the best one if you can go that route.
-
Hi Michelle,
The concept of crawl efficiency is highly misunderstood. Are all your site's pages being indexed? Is new content or changes indexed in a timely manner? If so, that would indicate your site is being crawled efficiently.
Regarding the link you shared, you are on the right track but need to dig a bit deeper. On the page you shared, find the discussion related to robots.txt. There is a link which will lead you to the following page:
https://developers.google.com/webmasters/control-crawl-index/docs/faq#h01
There you will find a more detailed explanation along with several examples of when not to use robots.txt.
robots.txt: Use it if crawling of your content is causing issues on your server. For example, you may want to disallow crawling of infinite calendar scripts. You should not use the robots.txt to block private content (use server-side authentication instead), or handle canonicalization (see our Help Center). If you must be certain that a URL is not indexed, use the robots meta tag or X-Robots-Tag HTTP header instead.
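To make the header option Google mentions concrete, here is one hypothetical way to send X-Robots-Tag for query-string URLs, assuming Apache 2.4 with mod_headers enabled (the condition and directive are an illustration, not the only way to do it):

```
# Hypothetical Apache 2.4 sketch: send a noindex header for any URL
# that carries a query string (e.g. /searchhere?query1), while the
# bare /searchhere page gets no such header and remains indexable.
<If "%{QUERY_STRING} != ''">
    Header set X-Robots-Tag "noindex"
</If>
```

Unlike a robots.txt Disallow, this still lets Google crawl the pages, which is exactly what allows it to see the noindex directive and drop them from the index.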
SEOmoz offers a great guide on this topic as well: http://www.seomoz.org/learn-seo/robotstxt
If you desire to go beyond the basic Google and SEOmoz explanation and learn more about this topic, my favorite article related to robots.txt, written by Lindsay, can be found here: http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
-
Hi Ryan,
Wouldn't that cause issues with crawl efficiency?
Also, webmaster guidelines say "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
-
Thank you. Are you sure about that?
-
What about using the canonical URL tag? You could add a canonical link element to the /searchhere?query result pages pointing back to /searchhere.
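For illustration, a canonical link element like this would go in the head of each result page (the domain and paths are just the examples from the question):

```html
<!-- On domain.com/searchhere?query1 and other search result pages -->
<link rel="canonical" href="http://domain.com/searchhere" />
```

One caveat: rel="canonical" is a consolidation hint, not a directive, so it does not guarantee the result pages drop out of the index the way noindex does.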
-
The best practice would be to add the noindex tag to the search result pages but not the /searchhere page.
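A minimal sketch of that noindex tag, placed in the head of each search result page (but not on /searchhere itself):

```html
<!-- Only on domain.com/searchhere?query... pages, never on /searchhere -->
<meta name="robots" content="noindex">
```

Note that for the tag to work, the result pages must remain crawlable: if robots.txt blocks them, Google can never fetch the page and see the noindex.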
Typically speaking, the best robots.txt file is a blank one. The file should only be used as a last resort with respect to blocking content.
-
What you outlined sounds to me like it should work. Disallowing /searchhere? shouldn't disallow the top-level search page at /searchhere, but should disallow all the search result pages with queries after the ?.
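As a sanity check, robots.txt Disallow values are matched as prefixes against the URL path plus query string, so the two cases from the question can be modeled with a small sketch (this is a simplified model that ignores the * and $ wildcard extensions):

```python
def is_blocked(path_and_query, disallow_prefix="/searchhere?"):
    # A Disallow rule matches from the start of the URL path, so the
    # prefix "/searchhere?" only matches URLs that actually contain
    # the "?" -- the bare /searchhere page falls outside the rule.
    return path_and_query.startswith(disallow_prefix)

print(is_blocked("/searchhere"))         # False: the search page itself stays crawlable
print(is_blocked("/searchhere?query1"))  # True: query-string result pages are blocked
```

This matches Ryan's reading: the rule with the trailing "?" catches every result URL while leaving the top-level search page alone.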