Should you use robots.txt to block pages on your site that have low-quality content or aren't contributing much, so that when Google crawls your site the best-performing content has a higher chance of being indexed?
-
I'm really not sure what best practice is for this.
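For context, a quick way to sanity-check what a robots.txt rule would actually block is Python's standard-library robots.txt parser. This is just a sketch with a hypothetical domain, paths, and rule:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a thin-content section of the site.
robots_txt = """\
User-agent: *
Disallow: /thin-content/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blocked URL cannot be crawled at all, so Google never sees its content,
# its internal links, or any noindex tag on the page itself.
print(parser.can_fetch("Googlebot", "https://www.example.com/thin-content/page1"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/products/widget"))     # True
```

Note that blocking crawling is not the same as blocking indexing: a URL disallowed in robots.txt can still end up indexed (without a snippet) if other sites link to it.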
-
Thank you for your answer John!
-
I would definitely not block these pages. You want to block as few pages as possible.
1. These pages can be used to boost internal linking by linking to your important pages.
2. Google crawls thousands of pages; it will likely crawl all of your important and unimportant pages anyway.
3. You can de-prioritize these pages in the XML sitemap, telling crawlers that there are more important pages to crawl.
4. If these are similar pages, use the URL parameter tool in Search Console to indicate that a page might be a filtered version of a more important page.
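Point 3 above could look like the following in an XML sitemap. This is only a sketch with hypothetical URLs, and the `<priority>` value is a hint to crawlers rather than a directive:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Important page: high priority hint -->
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <priority>1.0</priority>
  </url>
  <!-- Thin page: de-prioritized in the sitemap rather than blocked -->
  <url>
    <loc>https://www.example.com/thin-content/page1</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```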
-
Hi,
Yes, you can block such pages in robots.txt. I would also like to let you know that if you don't want some pages indexed, you can use the meta robots noindex tag.
I would go for noindex in your case.
Hope this helps.
Thanks
-
Is it possible to beef up those lower-quality pages with better content? If they are important main content pages, I would imagine you would want to improve them.
However, if you were going to block them from indexing, I would recommend a noindex meta tag within the head of those pages.
Hope that helps some.
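Assuming the tag meant above is the meta robots noindex tag, it goes in each page's head like this. Keep in mind the page must remain crawlable (i.e. not disallowed in robots.txt) for Google to see the tag at all:

```html
<head>
  <!-- Tells crawlers not to index this page; its links can still be followed -->
  <meta name="robots" content="noindex, follow">
</head>
```

For non-HTML resources, the same signal can be sent as an `X-Robots-Tag: noindex` HTTP response header.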