Using one robots.txt for two websites
-
I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
User-agent: *
Disallow:
Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
Is this OK? I thought you needed one robots.txt per website, each providing the URL for its own sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?
-
Hi @gpainter,
Thanks for your help. I can't see anything specific in that link that says you can't have two sitemaps in one robots.txt. Where it mentions the sitemap, it does say "You can specify multiple sitemap fields", although I'm not sure whether that means multiple Sitemap lines, or multiple sitemap URLs under a single Sitemap field?
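For what it's worth, the "multiple sitemap fields" wording appears to mean multiple Sitemap lines, each a complete field on its own line. A minimal sketch using the URLs from the question:

User-agent: *
Disallow:
# two independent Sitemap fields in one file
Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap

Each Sitemap line is read as a separate field, so listing two is not the same as putting two URLs under one field.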
-
@ciehmoz Hey I've replied to the other thread too.
The best option here is to use a separate robots.txt file for each website.
You could only have used the same robots.txt file if the other site were on the same subdomain, since crawlers fetch robots.txt from the root of each host.
Don't forget to include the corresponding sitemap in each new robots.txt file. Hope this works out, cheers.
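A minimal sketch of that setup, assuming the sitemap paths from the original question:

# served at https://www.siteA.org/robots.txt
User-agent: *
Disallow:
Sitemap: https://www.siteA.org/sitemap

# served at https://www.siteB.com/robots.txt
User-agent: *
Disallow:
Sitemap: https://www.siteB.com/sitemap

Each crawler requests robots.txt from the root of the host it is crawling, so each domain serves its own copy.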
-
Hey @ciehmoz
Just replied to your other thread: you will need one robots.txt per site. Referring to two sitemaps in one robots.txt will confuse Google.
Info here - https://developers.google.com/search/docs/advanced/robots/robots_txt
Good Luck
Related Questions
-
Robots.txt Syntax for Dynamic URLs
I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page= in the URL. Which is the proper syntax?
Disallow: ?Page=
Disallow: ?Page=*
Or something else?
Technical SEO | btreloar
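For reference, Google's robots.txt rules support the * wildcard, so a rule along these lines would be one way to match any URL containing that parameter (a sketch to verify in a robots.txt tester; matching is case-sensitive, so ?page= would need its own rule):

# matches any URL whose path or query string contains ?Page=
Disallow: /*?Page=
-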
Website Cached Version
Hi all, why is my full content not appearing in the text-only (cached) version: http://webcache.googleusercontent.com/search?q=cache:zakoopi.com&es_sm=93&strip=1 Original website link: http://www.zakoopi.com/ How can I resolve this issue?
Technical SEO | Obbserv
-
One-Pager and SEO
We're building a page that is going to feature over 31 people as difference makers in their field. We're unveiling one a day for an entire month. The very early mockup of the page has name, pic, some bio info, and a link to open up a new window with the full bio. I would love to have all of the bio content for all of the people on the page (and indexable), but I'm not sure how to do that while still being able to hide the full bios until they are expanded. Anybody have any tips that are SEO-friendly, and/or examples of a page that is built like this and ranks well? Thanks!
Technical SEO | spackle
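One illustrative pattern (not from the thread, and the names are placeholders): keep each full bio in the initial HTML and collapse it with a disclosure element, so the text is in the source even while visually hidden:

<details>
  <summary>Jane Doe, difference maker</summary>
  <!-- placeholder bio: present in the HTML, so crawlable while collapsed -->
  <p>Full bio text here.</p>
</details>

The same idea works with a CSS/JS toggle, as long as the bio text ships in the page source rather than loading in a new window.
-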
Which address do I use for citations
Hello, when I created my Google Places listing, I entered my address, and once the listing was activated I noticed that Google Places was displaying a shortened abbreviation of my address. So my question: when creating citations for my listing, do I use the address Google Places generated, or the long version of my address? I've heard that when it comes to creating citations, the address needs to be identical across the board. I hope this makes sense. Thanks!
Technical SEO | fbbcseo
-
Robots.txt Download vs Cache
We made an update to the robots.txt file this morning, after the initial download of the robots.txt file. I then submitted the page through Fetch as Googlebot to get the changes in ASAP. The cache timestamp on the page now shows Sep 27, 2013 15:35:28 GMT, which would put the cache timestamp at about 6 hours ago. However, the Blocked URLs tab in Google WMT shows the robots.txt last downloaded 14 hours ago, and it is therefore showing the old file. This leads me to believe that, for robots.txt, the cache date and the download time are independent. Is there any way to get Google to recognize the new file other than waiting this out?
Technical SEO | Rich_A
-
When to use canonical urls
I will be the first to admit I am never really 100% sure when to use canonical URLs. I have a quick question, and I am not really sure if this is a situation for a canonical or not. I am looking at my friend's building website, and there are issues with which pages are ranking. Basically, their homepage targets the building refurbishment keyword for their location, but for some reason an internal page is ranking for that keyword, and it is not mentioned at all on that page. Would this be a time to add the homepage URL as a canonical on the ranking page (using the Yoast plugin) to tell Google that the homepage is the preferred page? Thanks Paul
Technical SEO | propertyhunter
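For illustration only (placeholder URL, not the actual site), a canonical set through Yoast is output as a link tag in the head of the ranking page, naming the homepage as the preferred URL:

<!-- hypothetical tag on the internal page -->
<link rel="canonical" href="https://www.example.com/" />

Worth keeping in mind that rel=canonical is a hint intended for duplicate or near-duplicate pages, so search engines may ignore it where the two pages differ substantially.
-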
Question about construction of our sitemap URL in robots.txt file
Hi all, This is a Webmaster/SEO question. This is the sitemap URL currently in our robots.txt file: http://www.ccisolutions.com/sitemap.xml As you can see it leads to a page with two URLs on it. Is this a problem? Wouldn't it be better to list both of those XML files as separate line items in the robots.txt file? Thanks! Dana
Technical SEO | danatanseo
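Both forms are generally valid: a single Sitemap line pointing at a sitemap index file (which is what the described page sounds like), or one Sitemap line per file. A sketch with hypothetical child-sitemap names:

# option 1: one line pointing at a sitemap index
Sitemap: http://www.ccisolutions.com/sitemap.xml

# option 2: one line per file (file names are hypothetical)
Sitemap: http://www.ccisolutions.com/sitemap-products.xml
Sitemap: http://www.ccisolutions.com/sitemap-pages.xml
-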
Robots.txt - What is the correct syntax?
Hello everyone, I have the following link: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I want to prevent Google from indexing everything related to "view=send_friend". The problem is that it's giving me duplicate content, and the content of the links has no SEO value of any sort. My problem is how I disallow it correctly via robots.txt. I tried this syntax: Disallow: /view=send_friend/ However, after a crawl on request, the 200+ duplicate links that contain view=send_friend are still present in the CSV crawl report. What is the correct syntax if I want to prevent Google from indexing everything related to this kind of link?
Technical SEO | teleman
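For reference, under the wildcard matching that Google documents for robots.txt, a rule like the following would be one way to catch the parameter wherever it appears in the query string, since the original attempt only matches paths that literally start with /view=send_friend/ (a sketch worth verifying with a robots.txt tester):

# matches any URL containing view=send_friend anywhere after the host name
Disallow: /*view=send_friend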