XML Sitemap instruction in robots.txt = Worth doing?
-
Hi fellow SEOs,
Just a quick one: I was reading a few guides on Bing Webmaster Tools and found that you can use the robots.txt file to point crawlers/bots to your XML sitemap (they don't look for it by default).
I was just wondering whether it would be worth creating a robots.txt file purely for the purpose of pointing bots to the XML sitemap?
I've submitted it manually to Google and Bing Webmaster Tools, but I was thinking more of the other bots (e.g. Mozbot, the SEOmoz bot).
Any thoughts would be appreciated!
Regards,
Ash
-
Thanks for the answer and the link, John!
Regards,
Ash
-
I think it's worth it as it should only take a few minutes to set up, and it's good to have a robots.txt, even if it's allowing everything. Put a text file named "robots.txt" in your root directory with:
<code>User-agent: *
Disallow:
Sitemap: http://www.yourdomain.com/non-standard-location/sitemap.xml</code>
Read more about robots.txt here: http://www.seomoz.org/learn-seo/robotstxt.
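If you want to confirm that crawlers can actually read the directive once it's live, Python's standard library can parse a robots.txt body and report any Sitemap entries. A minimal sketch (the domain is a placeholder, and `site_maps()` requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt body like the one above (the domain is a placeholder).
robots_txt = """\
User-agent: *
Disallow:
Sitemap: http://www.yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() returns the listed sitemap URLs, or None if there are none.
print(parser.site_maps())

# With an empty Disallow, any bot may crawl any path.
print(parser.can_fetch("Mozbot", "/any-page"))
```

In practice you would point `RobotFileParser.set_url()` at the live `http://www.yourdomain.com/robots.txt` and call `read()` instead of parsing a string.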
-
It is not going to make any difference. Time is better spent fixing the website's crawling and indexing issues.
Related Questions
-
Should I submit an additional sitemap to speed up indexing
Hi all, Wondered if there was any wisdom on this that anyone could impart my way? I'm moving a set of pages from one area of the site to another - to bring them up the folder structure, and so they generally make more sense. Our URLs are very long in some cases, so this ought to help with some rationalisation there too. We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible. In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker (or to reaffirm that indexing from the initial parse)? The site is thousands of pages. Any benefits / disadvantages anyone could think of? Any thoughts very gratefully received.
Intermediate & Advanced SEO | ceecee0
-
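For the question above: the sitemap protocol allows multiple sitemap files, and a URL may appear in more than one, so a small supplementary sitemap listing just the moved pages is a valid way to flag them for recrawling. A minimal sketch (URLs and the date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-path/page-1/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/new-path/page-2/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

Submitting this alongside the main sitemap in Webmaster Tools also gives you per-file indexing stats for the moved set.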
Pages blocked by robots.txt in Webmaster Tools
A mistake was made in the software. How can I solve the problem quickly? Help me.
Intermediate & Advanced SEO | mihoreis0
-
Best practice for disallowing URLS with Robots.txt
Hi Everybody, We are currently trying to tidy up the crawling errors which are appearing when we crawl the site. On first viewing, we were very worried to say the least: 17,000+. But after looking closer at the report, we found the majority of these errors were being caused by bad URLs featuring:
Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
Color - for example: "?color=91"
Price - for example: "?price=650-700"
Order - for example: "?dir=desc&order=most_popular"
Page - for example: "?p=1&standards=704"
Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question now is, as a novice at working with robots.txt: what would be the best practice for disallowing URLs featuring these from being crawled? Any advice would be appreciated!
Intermediate & Advanced SEO | centurysafety0
-
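A hedged sketch of what such rules might look like, using the paths from the examples above. Note the `*` wildcard is supported by Googlebot and Bingbot but is not part of the original robots.txt standard, so test the rules in Webmaster Tools before deploying:

```
User-agent: *
Disallow: /*?color=
Disallow: /*?price=
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /directory/currency/switch/
Disallow: /customer/account/login/
```

For parameter URLs like these, URL parameter handling in Webmaster Tools or `rel="canonical"` on the filtered pages are often better options, since Disallow only stops crawling and does not remove already-indexed URLs.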
Have a Robots.txt Issue
I have a robots.txt file error that is causing me loads of headaches and is making my website fall off the SE grid. On Moz and other sites it's saying that I blocked all websites from finding it. Could it be as simple as I created a new website and forgot to re-create a robots.txt file for the new site, or that it was trying to find the old one? I just created a new one. Google's Search Console still shows severe health issues found in the property, and that robots.txt is blocking important pages. Does this take time to refresh? Is there something I'm missing that someone here in the Moz community could help me with?
Intermediate & Advanced SEO | primemediaconsultants0
-
Need help creating sitemap
Hello, The details of my question are sitemap related. Below is the background info:
We are an ecommerce site with around 4,000 pages and 20,000 images.
We don't have a sitemap implemented on our site yet.
I have checked a lot of sitemap tools out there, like G-SiteCrawler, XML Sitemap, A1 Sitemap Builder etc., and I tried to create sitemaps with them, but they all give different results. The major links are all there, but the results start to vary for level 2, level 3 links and so on.
No matter how much I read up on sitemaps, the more confused I get. I have read lots of SEOmoz articles on sitemaps, and due to my limited SEO and technical knowledge, the extra information in these articles gets more confusing. I also just read an article on SEOmoz saying that instead of having one sitemap, having multiple smaller sitemaps is a very good idea, especially if we are adding lots of new products (which we are).
Now my question: having understood the immense value of a sitemap (and having had it very poorly implemented before), how can I make sure that I get a very good sitemap (both XML and HTML)? I do not want to just repeat old mistakes by having a poorly implemented sitemap for our site. I am hoping that one of the professionals out there can help me make and implement the sitemap, or at least point me in the right direction.
Intermediate & Advanced SEO | kannu10
-
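Rather than reconciling the differing outputs of third-party crawlers, a site with a known product catalogue can generate its sitemap directly from its own URL list. A minimal sketch of the XML generation step (URLs are placeholders, and real use would feed in the catalogue's canonical URLs):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page  # required element per URL
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/products/widget-1/",
]))
```

The protocol caps each file at 50,000 URLs, so a 4,000-page site fits comfortably in one file; if you later split by section, list the parts in a sitemap index file and submit that instead.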
Sitemap.xml
Hi guys, I read the SEOmoz article about sitemap.xml dated 2008. Just wanted to check views on:
Is it worthwhile using the 'priority' field?
What if everything is set to 100%?
Any tips for using priority?
Many thanks in advance! Richard
Intermediate & Advanced SEO | Richard5550
-
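On the priority question: per the sitemaps.org protocol, priority is a relative value from 0.0 to 1.0 (default 0.5) used only to compare pages within your own site, so setting everything to 1.0 conveys no information at all. A sketch of how it is meant to be used (URLs are placeholders):

```xml
<url>
  <loc>http://www.example.com/</loc>
  <priority>1.0</priority>
</url>
<url>
  <loc>http://www.example.com/archive/old-post/</loc>
  <priority>0.3</priority>
</url>
```

The spec is explicit that priority does not influence how your pages rank against other sites, only how crawlers may order URLs within yours.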
How to format a video sitemap for GWT using Vimeo iframe code
I noticed a thread on this forum about using the old embed code from Vimeo for videos, although I can't see this option in Vimeo in 2013. My question is: I have iframe-embedded videos from Vimeo. How do I format a Google video sitemap accordingly? Or do I need a custom media player instead?
Intermediate & Advanced SEO | robertrRSwalters0
-
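For the video sitemap question above: Google's video sitemap extension supports embedded players via `<video:player_loc>`, so a custom media player should not be required. A hedged sketch (the page URL, thumbnail, and Vimeo video ID are all placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/page-with-video/</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/video.jpg</video:thumbnail_loc>
      <video:title>Example video</video:title>
      <video:description>Short description of the video.</video:description>
      <video:player_loc>http://player.vimeo.com/video/12345678</video:player_loc>
    </video:video>
  </url>
</urlset>
```

`<loc>` is the page hosting the iframe, while `<video:player_loc>` points at the Vimeo player URL that the iframe's `src` attribute uses.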
Looking for reassurance on this one: Sitemap approach for multi-subdomains
Hi All: Just looking for a bit of "yeah it'll be fine" reassurance on this before we go ahead and implement. We've got a main accommodation listing website under www.* and a separate travel content site using a completely different platform on blog.* (same domain, different sub-domain). We pull snippets of content from blog.* into www.* using a feed, and we have cross-links going both ways, e.g. links to find accommodation in blog articles and links to blog articles from accommodation listings. Look-and-feel wise they're fully integrated; the blog.* site is a tab under the main nav. What I'd like to do is get Google (and others) to view this whole thing as one site, and attribute any SEO benefit of content on blog.* pages to the www.* domain. Make sense? So, having done a bit of reading, here's what I've come up with:
Separate sitemaps for each, both located in the root of the www site: www.example.com/sitemap-www and www.example.com/sitemap-blog
robots.txt in the root of the www site to have a single sitemap entry: Sitemap: www.example.com/sitemap-www
robots.txt in the root of the blog site to have a single sitemap entry: Sitemap: www.example.com/sitemap-blog
Submit both sitemaps to Webmaster Tools.
Does this sound reasonable? Any better approaches? Anything I'm missing? All input appreciated!
Intermediate & Advanced SEO | AABAB0
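The two robots.txt files described in that plan might look like this (the domain is a placeholder). Per the sitemaps.org cross-submission rules, a robots.txt Sitemap directive may point to a sitemap on a different host, which is what allows blog.* to reference a sitemap file hosted on www.*:

```
# http://www.example.com/robots.txt
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap-www.xml

# http://blog.example.com/robots.txt
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap-blog.xml
```

Each sitemap file should then contain only URLs for its own host: www.* URLs in sitemap-www.xml and blog.* URLs in sitemap-blog.xml.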