Help with Robots.txt On a Shared Root
-
Hi,
I posted a similar question last week asking about subdomains but a couple of complications have arisen.
Two different websites I am looking after share the same root domain, which means they will have to share the same robots.txt. Does anybody have suggestions for separating the two in the same file without complications? It's a tricky one.
Thank you in advance.
-
Okay so if you have one root domain you can only have one robots.txt file.
The reason I asked for an example is in case there was something you could put in the robots.txt to differentiate the two.
For example if you have
thisdomain.com and thatdomain.com
However, if "thatdomain.com" uses a folder called shop ("thatdomain.com/shop"), then you could prefix all of that site's robots.txt entries with /shop, provided that "thisdomain.com" doesn't use a shop folder. Then all the /shop entries would only apply to "thatdomain.com". Does this make sense?
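A minimal sketch of what that shared file could look like. The paths here (/private, /shop/checkout) are invented for illustration, not taken from either actual site:

```
# One robots.txt served at the shared root of both domains
User-agent: *

# Paths that only exist on thisdomain.com
Disallow: /private

# thatdomain.com's content all lives under /shop, so these rules
# can only ever match its URLs
Disallow: /shop/checkout
```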
Don
-
It's not so much that one is a subdomain, it's that they are as different as Google and Yahoo yet they share the same root. I wish I could show you but I can't because of confidentiality.
The 303 wasn't put in place by me, I would have strongly suggested another method. I think it was set up so that both websites could be controlled from the same login but it's opened a can of worms for SEO.
It's not that I don't want the two separate robots files; the developer insists it has to be this way.
-
Can you provide an example of the way the domains look, specifically where the root pages are?
Additionally, if you are 303-redirecting one of the domains to the other, why do you want two different robots.txt files? The one being 303'd will always redirect to the other, won't it?
Depending on the structures you can create one robots.txt file that deals with 2 different domains provided there is something unique about the root folders.
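A quick sketch of why that works, using Python's standard-library robots.txt parser. The domain names and paths are hypothetical stand-ins for the two sites: a rule prefixed with /shop can only ever match URLs that actually live under /shop, so it is a no-op on the site that has no such folder.

```python
from urllib.robotparser import RobotFileParser

# The one shared robots.txt, as a list of lines.
shared_rules = [
    "User-agent: *",
    "Disallow: /shop/checkout",  # only thatdomain.com has a /shop folder
]

rp = RobotFileParser()
rp.parse(shared_rules)

# thatdomain.com serves pages under /shop, so this URL is blocked:
blocked = rp.can_fetch("*", "https://thatdomain.com/shop/checkout")
# thisdomain.com never uses /shop, so none of its URLs match the rule:
allowed = rp.can_fetch("*", "https://thisdomain.com/checkout")
print(blocked, allowed)  # False True
```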
-
Thanks for your help so far.
The two different websites have different domain names but share the same root, because that's how it was built in Typo3. I don't know the developer's justification for the 303; it's something I wish we could change.
I'm not sure if there are specific tags you can put in the sole robots.txt to differentiate the two; I have read a few conflicting arguments about how to do it.
-
Okay, so if you're using a 303 then you're saying the content you want for site X is actually located at site Y. Which means you do not have two different subdomains, so there is no need for two robots.txt files, and your developer is correct that you can't use two. Since one site is pointing to the other, you only have one subdomain.
However, a 303 is in general a poor way to do a redirect and should likely be a 301, but I would have to understand why the 303 is being used to say that with 100% certainty. See a quick article about 303 here.
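If you want to check for yourself what status code a redirect is returning, you can request the URL and read the raw status. Here is a self-contained sketch that stands up a throwaway local server answering with each code (the /old-301 and /old-303 paths are made up; in practice you would point the request at the real domain):

```python
import http.client
import http.server
import threading

class RedirectDemo(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = permanent move (what SEO usually wants);
        # 303 = "see other", intended for redirecting after a form POST.
        self.send_response(301 if self.path == "/old-301" else 303)
        self.send_header("Location", "/target")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectDemo)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    # A fresh connection per request, so we see the raw status
    # without any redirect-following.
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", path)
    status = conn.getresponse().status
    conn.close()
    return status

s301, s303 = status_of("/old-301"), status_of("/old-303")
print(s301, s303)  # 301 303
server.shutdown()
```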
Hope this answers the question,
Don
-
It's Fasthosts. The developer is certain that we can't use the two separate robots files. The second website has been set up on a 303.
-
What host are you using?
-
The developer of the website insists that they have to share the same robots.txt; I'm really not sure why he's set it up this way. I am beyond befuddled with this!
-
The subdomain has to be separated from the root in some fashion. I would assume, depending on your host, that there is a separate folder for the subdomain's files. Otherwise it would be chaos. Say you installed forums on your forum subdomain and an e-commerce store on your shop subdomain... which index.php page would be served?
There has to be some separation, review your file manager and look for the sub-domain folders. Once found you simply put a robots.txt into each of those folders.
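A throwaway sketch of that layout. The folder names and rules below are invented; on a real host these folders would be the subdomain document roots you find in your file manager:

```python
from pathlib import Path
import tempfile

# Stand-in for the hosting account's web root.
root = Path(tempfile.mkdtemp()) / "site"

# One folder per subdomain, each getting its own robots.txt.
rules = {
    "forum": "User-agent: *\nDisallow: /admin/\n",
    "shop": "User-agent: *\nDisallow: /checkout/\n",
}
for sub, robots in rules.items():
    (root / sub).mkdir(parents=True, exist_ok=True)
    (root / sub / "robots.txt").write_text(robots)

found = sorted(p.relative_to(root).as_posix() for p in root.rglob("robots.txt"))
print(found)  # ['forum/robots.txt', 'shop/robots.txt']
```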
Hope this helps,
Don