Reason for robots.txt file blocking products on category pages?
-
Hi
I have a website with thousands of products. On the category pages, all the products are linked to with a "?cgid" parameter in the URL. But "?cgid" is also blocked in the robots.txt file for some reason, so I'm thinking it's stopping all my products getting crawled by Google.
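For what it's worth, the blocking rule in our robots.txt looks roughly like this (I'm quoting from memory, so the exact pattern may differ slightly):

    User-agent: *
    Disallow: /*?cgid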
Am I right here? Is there any reason why a website would want to block so many URLs? I'm only here a week and the site's getting great traffic, so I don't want to go breaking it!!!
Thanks
-
Thanks again AL123al!
I would be concerned about my internal linking because of this problem. I've always wanted to keep important pages within 3 clicks of the homepage. My worry here is that while a user can reach these products within 3 clicks of the homepage, they're blocked to Googlebot.
So the product URLs are only getting discovered through the sitemap, which would be hugely inefficient? I think I have to decide whether opening up these pages would improve my linking structure enough for Google to crawl the product pages, or whether that's outweighed by the extra pages it would have to crawl and the crawl budget that would waste.
-
Hello,
The canonical product URLs will be getting crawled just fine, as they are not blocked in the robots.txt. Without understanding your problem completely, I think the people before you were trying to stop all the duplicate parameter URLs being crawled and leave Google to crawl only the canonicals - which is what you want.
If you remove that rule from robots.txt, then Google will crawl everything, including the parameter URLs. This will waste crawl budget, so it's better that Google is only crawling the canonicals.
Regarding the sitemap: being present in the sitemap helps Googlebot decide what to prioritise crawling, but it won't stop it finding other URLs if there is good internal linking.
-
Thanks AL123al! The base URLs (www.example.com/product-category/ladies-shoes) do seem to be getting crawled here and there, and some are ranking, which is great. But I think the only place they can be discovered is the sitemap, which has over 28,000 URLs on one page (another thing I need to fix)!
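The fix I have in mind for the sitemap is to break that single file into a sitemap index pointing at smaller, category-level sitemaps - something along these lines (the filenames here are just hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-ladies-shoes.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-mens-shoes.xml</loc>
      </sitemap>
    </sitemapindex>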
So if Googlebot gets to a parameter URL through the category pages (www.example.com/product-category/ladies-shoes?cgid...) and sees it's blocked, I'm guessing it can't see that the page is important to us (from the website hierarchy) or see the canonical tag, so I'm presuming it's seriously damaging our power to get products ranked.
In Screaming Frog, 112,000 URLs get crawled and 68% are blocked by robots.txt. 17,000 of them are URLs which contain "?cgid", which I don't think is too much for Googlebot to crawl, and the website has pretty good authority, so I think we get a pretty deep crawl.
So I suppose what I really want to know is: will removing "?cgid" from the robots.txt file really damage the site? In my opinion, I think it'll really help.
-
It looks like the product URLs are having a ?cgid parameter appended - there may be other stuff attached to the end of each URL too, like the example below:
e.g. www.example.com/product-category/ladies-shoes?cgid-product=19&controller=product etc
but the canonical URL is www.example.com/product-category/ladies-shoes
These products may have a canonical pointing to the base URL, which means there won't be any problem with duplicates being indexed. So all well and good.
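If that's the case, each parameter URL would carry a tag like this in its <head> (reusing your example URL, so purely illustrative):

    <link rel="canonical" href="https://www.example.com/product-category/ladies-shoes" />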
Except... Google still has to crawl each of these parameter URLs to find the canonical. On a huge website this means crawl budget is being consumed by unnecessary crawling of these parameterised URLs.
You can tell Google not to crawl the parameter URLs in Search Console (at least you can in the old version). But you can also stop Google crawling these URLs unnecessarily by blocking them in robots.txt, if you are sure the parameters don't change how the page appears in search.
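A typical robots.txt rule for this - I'm assuming the ?cgid pattern from your examples, so check the live file for the exact syntax - would look something like:

    User-agent: *
    # Block any URL containing the ?cgid parameter
    Disallow: /*?cgid

The * wildcard matches anything between the site root and the parameter, so every parameterised version is excluded from crawling while the clean canonical URLs stay open.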
So, long story short, that is why you may be seeing the URLs with parameters blocked in robots.txt. The canonical URLs will be getting crawled just fine, since they don't have any parameters and so aren't being blocked.
Hope that makes sense?
-
Yes, it's in the robots.txt, that's the problem. Someone had to have put it in there deliberately, but I've no idea why they would.
-
Did you check your robots.txt file? Or check whether a plugin is creating the problem?