Two Robots.txt files
-
Hi there
Can somebody please help me? One of my client's sites has two robots.txt files (please see below). One file blocks a few folders, and the other blocks all search engines completely. Our tech team tells me that, for technical reasons, they are using the second one, which is placed inside the server where search engines are unable to see it.
www.example.co.uk/robots.txt - blocks a few folders
www.example.co.uk/Robots.txt - blocks all search engines
I hope someone can give me the help I need on this one.
Thanks in advance!
Cheers,
Satla -
Hi Satla,
You mentioned that one robots.txt file is placed inside the server and search engines are unable to see it. If search engines can't see a robots.txt file, then what is the use of that file?
AFAIK it must be placed under the root directory, and there is no way to keep two files with the same name there. So you should have only one robots.txt, and it should be placed under the root directory.
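For reference, a single robots.txt at the site root can do everything the legitimate first file needs. A minimal sketch (the folder names here are hypothetical, not taken from the client's actual file):

```
# https://www.example.co.uk/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Everything not listed under a Disallow rule stays crawlable, so there's no need for a second file.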
Hope this helps.
Thanks
-
Hi Satla,
You're going to need to get rid of that second version ASAP. The official standard for the robots.txt file name is all lower case, so that's most likely the one bots are seeing. But to err on the side of caution, I'd remove any possibility of a "Disallow: /" and delete that Robots.txt version.
Some servers are case sensitive, so you could run into issues here as well.
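To see in practice how differently the two files would be treated, you can feed each set of rules to Python's standard-library robots.txt parser. The URLs and rules below are illustrative, based on the poster's description, not the client's actual files:

```python
from urllib.robotparser import RobotFileParser

# Rules like the legitimate file: block a few folders only
folder_block = """User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

# Rules like the second file: block everything
full_block = """User-agent: *
Disallow: /
"""

def allowed(robots_txt: str, url: str) -> bool:
    """Return True if a generic bot may fetch `url` under these rules."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("*", url)

print(allowed(folder_block, "https://www.example.co.uk/products/"))  # True
print(allowed(folder_block, "https://www.example.co.uk/private/x"))  # False
print(allowed(full_block, "https://www.example.co.uk/products/"))    # False
```

Whichever of the two files the server actually returns for /robots.txt is the one that decides between these outcomes, which is why the disallow-all version is so dangerous to leave lying around.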
-
Hi,
I don't see any good reason for having two files. There should be just one, where you specify everything you'd like done.
If the tech team doesn't want to fix it and leave just one file, it may be because they are lazy, or because of some other issue, e.g. deleting one file breaks the whole site. Here are two Moz articles about the robots.txt file:
What is Robots.txt? - Moz Learn
Learn About Robots.txt with Interactive Examples - Moz Blog
Take into account that the file name must be in lower case. I've never seen it any different, and servers are usually case sensitive with filenames.
Hope it's helpful.
GR