Should I noindex the site search page? It is generating 4% of my organic traffic.
-
I've read some recommendations to noindex site search URLs.
I checked in analytics that our site search URL generates about 4% of our total organic search traffic (but less than 2% of sales). My reasoning is that site search pages may create duplicate content issues and may prevent the more relevant product or category pages from ranking instead.
Would you noindex this page or not?
Any thoughts?
-
One other thing to think about - do you have another method for the bots to find/crawl your content?
We robots.txt all of our /search result pages - I agree with Everett's post that they are thin content and ripe for duplication issues.
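For reference, a minimal sketch of that kind of robots.txt rule (the exact path depends on your own URL structure; /search here is just an example):

```
# Block crawling of internal search result pages for all crawlers
User-agent: *
Disallow: /search
```

Note that a prefix rule like this also catches query-string variants such as /search?q=widgets.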
We list all content pages in sitemap.xml and have a single "browse content" section that is paginated. We use rel="next" and rel="prev" to help the bots walk through each page.
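The pagination markup described above would look roughly like this in the head of page 2 of the browse section (URLs are hypothetical):

```html
<!-- On /browse?page=2: point crawlers to the neighboring pages in the series -->
<link rel="prev" href="https://www.example.com/browse?page=1">
<link rel="next" href="https://www.example.com/browse?page=3">
```

The first page carries only rel="next" and the last page only rel="prev".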
References
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744
Personally, I think Maile's video is really great and you get to see some of the cool artwork in her house.
http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
Important to note: if you do set up pagination and you add any other filters or sort options within it, nofollow those links and noindex the resulting pages, since you want only one route through your pagination for Googlebot to travel. Also, make sure each page has a unique title and description; I just add "Page N" to the standard blurb for each page, and that usually takes care of it.
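A sketch of what that advice might look like in practice (page names, URLs, and wording are all hypothetical):

```html
<!-- On a filtered/sorted variant such as /browse?page=2&sort=price:
     keep it out of the index and don't pass equity through it -->
<meta name="robots" content="noindex, nofollow">

<!-- On the plain paginated page /browse?page=2:
     unique title and description, with "Page N" appended -->
<title>Browse Widgets - Page 2</title>
<meta name="description" content="Browse our full catalog of widgets - Page 2.">
```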
If you close one door on your search pages, you can open another one using pagination!
Cheers!
-
Since numerous search results pages are already in the index, then yes, you want to use the noindex tag instead of a robots.txt disallow. The noindex tag will gradually lead to the pages being removed from the SERPs and the cache.
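There are two common ways to serve that noindex directive; both are equivalent. A sketch, assuming an Apache server (with mod_headers enabled) and a hypothetical /search path:

```apache
# Option 1: a meta tag in each search result page's <head>:
#   <meta name="robots" content="noindex, nofollow">

# Option 2: an X-Robots-Tag HTTP header set server-side,
# which also works for non-HTML responses:
<LocationMatch "^/search">
    Header set X-Robots-Tag "noindex, nofollow"
</LocationMatch>
```

Either way, the pages must remain crawlable (not blocked in robots.txt) so that Google can actually see the directive.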
-
Mike, Everett,
Thanks a lot. Will go ahead and noindex. Our navigation path is easy to crawl.
So should I add "noindex, nofollow" in a meta tag or in the X-Robots-Tag header? We have thousands of site search pages already in the Google index, so I understand the X-Robots-Tag or meta tag is preferred to using robots.txt, right?
-
This was covered by Matt Cutts in a blog post way back in 2007, but the advice is still the same, as Mike has pointed out. Search results can be considered thin content and are not particularly useful to users, so you can understand why Google wants to avoid showing search result pages within its own search results. Certainly I block all search results in robots.txt for all our sites.
You may lose 4% of your search traffic in the short term, but in the long term it could mean that you gain far more.
-
The Google Webmaster Guidelines suggest you should "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
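One caveat worth keeping in mind with that guideline: a robots.txt disallow only blocks crawling, it does not remove pages already in the index, which is why the noindex tag was recommended earlier in the thread for already-indexed pages. If you do go the robots.txt route, you can sanity-check which URLs your rule actually blocks with Python's standard urllib.robotparser (the URLs below are hypothetical):

```python
# Sketch: verify which URLs a robots.txt disallow rule actually blocks.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Search result pages (including query-string variants) are blocked...
print(parser.can_fetch("*", "https://www.example.com/search?q=widgets"))  # False
# ...but normal category/product pages remain crawlable.
print(parser.can_fetch("*", "https://www.example.com/category/widgets"))  # True
```

This is handy before deploying a rule site-wide, since an overly broad Disallow pattern can accidentally block pages you want crawled.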