Adding your sitemap to robots.txt
-
Hi everyone,
Best practice question:
When adding your sitemap to your robots.txt file, do you add the whole sitemap at once, or do you list the different subcategory sitemaps (products, posts, categories, etc.) separately?
I'm very curious to hear your thoughts!
-
Just add the sitemap index file to your robots.txt and let search engines figure it out from there. You basically just want to point them to your sitemaps, and they can do that from the sitemap index alone, so there's no real need to list all of them individually.
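To illustrate (with a placeholder domain and filename - swap in your own), the whole thing comes down to a single `Sitemap:` directive pointing at the index file:

```text
# robots.txt - example using a placeholder domain
User-agent: *
Disallow:

# Point crawlers at the sitemap index; it references the individual sitemaps
Sitemap: https://www.example.com/sitemap_index.xml
```

Note that the `Sitemap:` directive takes an absolute URL, and it can go anywhere in the file - it isn't tied to a `User-agent` group.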
-
From a crawlability point of view, it does not matter. Search engines have no more problems crawling multiple sitemap files than they do crawling one very large XML sitemap file.
An advantage of splitting out your XML sitemaps is that if your site is very large, you are less likely to run into the 50 MB / 50,000 URL limit. If the site is quite small, you obviously won't benefit from this.
If you use multiple sitemaps, you may already know that you don't have to list them all in robots.txt. You can use a sitemap index file to point to your subcategory sitemaps (e.g. posts.xml, products.xml, etc.). Any modifications to the 'child' XML sitemaps don't need to be reflected in robots.txt - you only need to remember to add/remove them from the XML index file and Google/Bing Search Console.
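For illustration, a minimal sitemap index file pointing at two 'child' sitemaps looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to one child sitemap file -->
  <sitemap>
    <loc>https://www.example.com/posts.xml</loc>
    <lastmod>2023-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/products.xml</loc>
    <lastmod>2023-01-10</lastmod>
  </sitemap>
</sitemapindex>
```

Each child sitemap is itself subject to the 50 MB / 50,000 URL limit; when a child sitemap is added or removed, only this index file needs updating.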
Since many site applications automatically generate XML sitemaps grouped by posts, categories and products etc., we find it's easier to use this default configuration - and simply add the sitemap index URL to robots.txt.