Have I constructed my robots.txt file correctly for sitemap autodiscovery?
-
Hi,
Here is my robots.txt:

User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml

# Directories
Disallow: /sendfriend/
Disallow: /catalog/product_compare/
Disallow: /media/catalog/product/cache/
Disallow: /checkout/
Disallow: /categories/
Disallow: /blog/index.php/
Disallow: /catalogsearch/result/index/
Disallow: /links.html

I'm using Magento and want to make sure I have constructed my robots.txt file correctly, with the sitemap autodiscovery line included. (Note: the bare "Directories" line was just a label, so I've made it a # comment; an unprefixed line like that isn't valid robots.txt syntax.)
Thanks!
-
Hey, thanks for the response. There are about 14,000 URLs in the sitemap; that shouldn't be enough to freeze a browser, so please would you try again?
http://www.bedsite.co.uk/sitemaps/sitemap.xml
I know what you mean about the allow-all.
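For anyone else hitting the browser-freeze problem: you can count a sitemap's entries without rendering it at all. A rough stdlib-only Python sketch, shown here against a tiny inline sample rather than the live URL:

```python
import io
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_source):
    """Count <url> entries; iterparse streams, so even a 14,000-URL file stays cheap."""
    count = 0
    for _, elem in ET.iterparse(xml_source, events=("end",)):
        if elem.tag == NS + "url":
            count += 1
            elem.clear()  # release the parsed element to keep memory flat
    return count

# Tiny two-URL stand-in for the real sitemap file
sample = io.BytesIO(b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.bedsite.co.uk/</loc></url>
  <url><loc>http://www.bedsite.co.uk/beds.html</loc></url>
</urlset>""")
print(count_sitemap_urls(sample))  # 2
```

To run it against the real file you'd pass `urllib.request.urlopen(sitemap_url)` as `xml_source` instead of the inline sample.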
-
Also, Google's robots.txt testing tool is the best place to get your question answered. From Google: "The Test robots.txt tool will show you if your robots.txt file is accidentally blocking Googlebot from a file or directory on your site, or if it's permitting Googlebot to crawl files that should not appear on the web." You can find it here.
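If you'd rather check the rules programmatically, Python's standard-library `urllib.robotparser` applies the same longest-match logic; a minimal sketch using a shortened copy of the rules from the question:

```python
from urllib.robotparser import RobotFileParser

# A shortened copy of the rules from the question
rules = """\
User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml
Disallow: /checkout/
Disallow: /catalogsearch/result/index/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under the "*" group, so /checkout/ is blocked for it
print(rp.can_fetch("Googlebot", "http://www.bedsite.co.uk/checkout/cart/"))    # False
print(rp.can_fetch("Googlebot", "http://www.bedsite.co.uk/some-product.html")) # True

# site_maps() (Python 3.8+) confirms the Sitemap line was picked up
print(rp.site_maps())
```

The URLs being tested are made-up examples; swap in any path you want to verify against the live file with `rp.set_url(...)` plus `rp.read()`.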
-
The robots.txt looks fine. I always add an allow-all; even knowing it's not necessary, it makes me feel better, lol.
The problem you have is with the sitemap itself. How big is it? I can't tell how many links you have, because it locks up every time I open it in both Chrome and Firefox.
I also tried a tool that's designed to pull sitemaps the way the SERPs do, and it freezes up as well.
How many links do you have?
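For reference, the sitemaps.org protocol caps a single sitemap file at 50,000 URLs and 50 MB uncompressed, so a file of this sort would only need splitting well beyond that. If it ever does, the usual approach is to break it up and point the robots.txt Sitemap line at a sitemap index instead; a sketch (the child file names here are hypothetical, just for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.bedsite.co.uk/sitemaps/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.bedsite.co.uk/sitemaps/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Search engines fetch the index, then each child sitemap listed in it, so only the index needs to appear in robots.txt.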