Removing robots.txt on WordPress site problem
-
Hi, I am a little confused. I ticked the box in WordPress to allow search engines to crawl my site (I previously asked them not to), but Google Webmaster Tools is telling me I still have a robots.txt blocking them, so I am unable to submit the sitemap.
I checked the source code and the robots instruction has gone, so I'm a little lost. Any ideas, please?
-
Hi,
I edited the robots.txt file for my website http://debtfreefrombankruptcy.com yesterday to allow search engines to crawl my site. However, Google isn't recognizing the new file and is still saying that my sitemap is blocked from search. Here is a link to the file itself:
http://www.debtfreefrombankruptcy.com/robots.txt
The Blocked URLs tester said that the file allows Google to crawl the site, but in actuality it still isn't recognizing the new file. Any advice would be appreciated. Thanks!
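As a cross-check on tools like the Blocked URLs tester, Python's standard library ships a robots.txt parser. A minimal sketch - the rules below are stand-ins for whatever your live robots.txt actually serves:

```python
from urllib.robotparser import RobotFileParser

# Stand-in rules; to test your live site, swap these for the
# contents of http://yoursite.com/robots.txt.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot falls back to the "*" group when no group names it directly.
print(parser.can_fetch("Googlebot", "http://example.com/"))            # True
print(parser.can_fetch("Googlebot", "http://example.com/wp-admin/x"))  # False
```

If this says a URL is allowed but Webmaster Tools says it's blocked, Google is most likely still working from a cached copy of your old file.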
-
I can help you out, as this issue DROVE ME NUTS.
1. I didn't have a robots.txt (yet)
2. I had Yoast installed
3. I'm pretty sure it created a robots.txt even though one doesn't exist in my root (.com/here)
4. My Google Webmaster Tools shows this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /category//*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /?
Disallow: /?
Allow: /wp-content/uploads
Allow: /assets
Create a robots.txt:
1. Log in to WordPress
2. Click SEO in your side toolbar (Yoast WordPress plugin settings)
3. Go to Edit Files under SEO (in the side toolbar)
And now you have the option to edit your robots.txt file.
-
Hi Sophia
I just checked and see your homepage indexed in google.co.uk with a cache date of April 26th. You should be all set!
-Dan
-
Quick update - by amending the robots.txt file and switching the sitemap plugin over to Yoast, I finally got the sitemap to index without robots.txt warnings, although the home page of the site was not indexed - 'oh dear'. 5 out of the 7 pages in the sitemap were indexed by Google, so it's a start, but there's some more investigating to be done on my side.
-
Dan,
Can't thank you enough! The sitemap request is still pending in Google - maybe I sent too many requests. But it's time to sit back and wait for the good news, hopefully. Thanks again.
-
Hi Sofia
I just ran the same validator on your sitemap and it went through fine - see screenshot
What I meant was that you should just make sure Google Webmaster Tools accepts the sitemap as valid - if so, there's no need to run it through a 3rd-party validator. Apologies if I didn't state it clearly!
Let me know, but from what I can see it looks good!
-Dan
EDIT - Looking more closely, it looks like you ran the homepage through the validator - you should actually enter the sitemap address itself in the validator: http://containerforsale.co.uk/sitemap.xml
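If the 3rd-party validator keeps crashing, a basic well-formedness check can also be done locally. A minimal sketch with Python's standard xml.etree, run against a made-up one-URL sitemap rather than the live file:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up sitemap document; a real check would fetch
# http://containerforsale.co.uk/sitemap.xml and parse its bytes.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://containerforsale.co.uk/</loc>
  </url>
</urlset>"""

# fromstring raises ParseError on malformed XML, which is exactly
# the kind of problem the crashed validator was trying to report.
root = ET.fromstring(sitemap)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
locs = [u.findtext(ns + "loc") for u in root.findall(ns + "url")]
print(locs)  # ['http://containerforsale.co.uk/']
```

This only proves the XML parses; it doesn't validate against the sitemap schema, but a parse error here means the file would fail any validator.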
-
Hi Dan,
I followed the above advice and switched to the Yoast-generated sitemap, but after testing on http://www.xml-sitemaps.com/validate-xml-sitemap.html I got the following result - no idea what it means, but it looks nasty...
Schema validating with XSV 3.1-1 of 2007/12/11 16:20:05
Schema validator crashed
The maintainers of XSV will be notified, you don't need to send mail about this unless you have extra information to provide.
If there are Schema errors reported below, try correcting them and re-running the validation.
Target: http://containerforsale.co.uk
(Real name: http://containerforsale.co.uk
Server: Apache/2.2.22 (Unix) mod_ssl/2.2.22 OpenSSL/0.9.8e-fips-rhel5 mod_bwlimited/1.4)
The target was not assessed
Low-level XML well-formedness and/or validity processing output
Warning: Undefined entity raquo in unnamed entity at line 16 char 83 of http://containerforsale.co.uk
Warning: Undefined entity nbsp in unnamed entity at line 160 char 10 of http://containerforsale.co.uk
Error: Expected ; after entity name, but got = in unnamed entity at line 274 char 631 of http://containerforsale.co.uk
-
Sofia
You are using Yoast SEO plugin for WordPress, so use the XML sitemap within Yoast. You don't need a separate plugin for the XML sitemap. And yes, within Yoast turn the sitemap on.
Hope that helps!
-Dan
-
Indeed, thanks everyone - it's really appreciated!
I have updated the robots.txt as indicated and resubmitted the sitemap, but it looks like Google still has problems with my site, since the error warning for robots is still there after the processing is done.
Quick question - I am using a plugin called Google XML Sitemaps which has the following tick-box option:
'Add sitemap URL to the virtual robots.txt file. The virtual robots.txt generated by WordPress is used. A real robots.txt file must NOT exist in the blog directory!'
Should this box be ticked or un-ticked, please? FYI, I currently don't have the box ticked.
-
Thanks guys for all the responses and helping!
Three things to try
1. Fix robots.txt
Sofia - I just checked your robots.txt now and it reads:
User-agent: * Disallow: Sitemap: http://containerforsale.co.uk/sitemap.xml.gz
- with the sitemap on the same line as Disallow - I'd check on that and make sure it's on a separate line.
- ALSO, you don't need the .gz on the sitemap file - just sitemap.xml
2. Re-submit Sitemap
- RESUBMIT your sitemap to Webmaster Tools and make sure it's valid.
3. Submit URL to Webmaster Tools (only last resort)
This is only a last-resort scenario - you shouldn't have to do this on the homepage if everything is correct.
- go to Fetch as Googlebot -> run the fetch -> then Submit URL
- do this for the homepage
- see article on google blog for reference
Let us know if you're all set, thanks!
-Dan
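Putting points 1 and 2 together, the corrected file would read (a sketch based on the snippet above, with each directive on its own line and the .gz dropped):
User-agent: *
Disallow:
Sitemap: http://containerforsale.co.uk/sitemap.xml
An empty Disallow: under User-agent: * means "nothing is blocked", which is what you want here.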
-
Ok thanks Brent, I changed to
User-agent: *
Disallow:
Sitemap: http://containerforsale.co.uk/sitemap.xml.gz
Guess I will just have to wait for Google to refresh now...
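While waiting, the served file's formatting can be sanity-checked locally. A minimal sketch in plain Python that flags two directives fused onto a single line - the symptom reported earlier in this thread - using made-up sample strings:

```python
import re

# A well-formed robots.txt has one directive per line. This sketch flags
# lines where a second "Field:" appears mid-line, e.g. when
# "Disallow: Sitemap: http://..." is served as a single line.
FIELDS = ("user-agent", "disallow", "allow", "sitemap")

def fused_directives(robots_txt):
    """Return the lines that appear to contain more than one directive."""
    bad = []
    for line in robots_txt.splitlines():
        body = line.split("#", 1)[0]  # ignore comments
        hits = [f for f in FIELDS if re.search(r"(?i)\b%s\s*:" % f, body)]
        if len(hits) > 1:
            bad.append(line.strip())
    return bad

broken = "User-agent: *\nDisallow: Sitemap: http://containerforsale.co.uk/sitemap.xml"
fixed = "User-agent: *\nDisallow:\nSitemap: http://containerforsale.co.uk/sitemap.xml"

print(fused_directives(broken))  # flags the fused Disallow/Sitemap line
print(fused_directives(fixed))   # []
```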
-
Yes, the URLs being blocked are includes from your WordPress install.
-
Thanks for the heads up.
The warning just says '7 URLs blocked by robots.txt' - I have seen this issue posted on the WordPress boards by others, but with no real insight into solutions.
Perhaps I should try your idea of changing the robots.txt file to this:
User-agent: *
Disallow:
-
Well there is a robots.txt file. You can view it here: http://containerforsale.co.uk/robots.txt
What warnings are you getting in your sitemap submission area? It appears to look alright: http://containerforsale.co.uk/sitemap.xml But I tried to validate it and got a 504 Gateway Time-out error. http://www.xml-sitemaps.com/index.php?op=validate-xml-sitemap&go=1&sitemapurl=http%3A%2F%2Fcontainerforsale.co.uk%2Fsitemap.xml&submit=Validate
-
It's weird - the front-page warning on Google Webmaster Tools for robots has disappeared now, but I still have the warnings in the sitemap submission area. My host suggests I just wait a bit longer for Google to update, because he said the same as you - that there doesn't seem to be any robots.txt file.
-
Doesn't appear to be blocked, so maybe it has something to do with your /wp-includes/ directory.
Change the robots.txt file to this:
User-agent: *
Disallow:
-
Hey Guys,
Thanks for your replies... the domain is http://containerforsale.co.uk. My host told me to look in the public_html folder for the robots.txt file and just delete it, but I can't see it in there?
My host said he found a tester site and it doesn't report any issues:
http://www.searchenginepromotionhelp.com/m/robots-text-tester/robots-checker.php
This is the display I get from http://containerforsale.co.uk/robots.txt
User-agent *
Disallow: /wp-admin/
Disallow: /wp-includes/
-
Hi Sofia,
Two things you need to consider when troubleshooting this:
The actual robots.txt file (located in the root directory of your site), and the meta-robots tags in the head section of your HTML. When you say you checked the source code and the robots instructions were missing, I think you were talking about the meta-robots tags in the actual HTML of your site.
Webmaster Tools is probably referring to the actual robots.txt file in your domain's root path, which would differ entirely and not be visible by checking the HTML on your site. Like Nakul and Brent said, if you'll let us know your site's URL and paste the content of your robots.txt file here, I'm sure one of us can help you resolve the problem fairly quickly.
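For the meta-robots half of that check, here's a minimal sketch using Python's standard html.parser, run on a made-up sample page rather than a live site:

```python
from html.parser import HTMLParser

# robots.txt and the meta-robots tag are two separate controls.
# This sketch inspects the second one; on a real site you'd fetch
# the page source instead of using this sample string.

class MetaRobotsFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

page = '<html><head><meta name="robots" content="noindex,nofollow"></head><body></body></html>'
finder = MetaRobotsFinder()
finder.feed(page)
print(finder.directives)  # ['noindex,nofollow']
```

An empty list here means the page itself isn't asking to be de-indexed, so any blocking must be coming from robots.txt.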
Thanks!
Anthony
-
Copy whatever you have in your robots.txt file here and we will tell you the issue.
SEOmoz has a great article about Robots.txt files here: http://www.seomoz.org/learn-seo/robotstxt
-
The robots.txt would probably not be part of the WordPress configuration. 'Allow indexing' is controlled via meta data by the WordPress architecture.
I would look for something like this in yourdomain.com/robots.txt:
Disallow: /
or something like that. If that does not help, PM me your site URL and I would be glad to look it up for you.
-
Related Questions
-
Website URL, Robots.txt and Google Search Console (www. vs non www.)
Hi MOZ Community,
Technical SEO | | Badiuzz
I would like to request your kind assistance on domain URLs - www. vs non-www. Recently, my team moved to a new website where a 301 redirection has been done.
Original URL: https://www.example.com.my/ (with www.)
New URL: https://example.com.my/ (without www.)
Our current robots.txt sitemap: https://www.example.com.my/sitemap.xml (with www.)
Our Google Search Console property: https://www.example.com.my/ (with www.)
Questions:
1. How should I standardize these so that the Google crawler can effectively crawl my website?
2. Do I have to change my website URLs back to (with www.), or do I just need to update my robots.txt?
3. How can I update my Google Search Console property to reflect accordingly (without www.)? I cannot see the options in the dashboard.
4. Are there any to-dos, such as canonicalization, needed, or should I wait for Google to automatically detect and change it, especially in the GSC property?
Really appreciate your kind assistance. Thank you,
Badiuzz
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?
Technical SEO | | vetofunk
# robots.txt
# This file is to prevent the crawling and indexing of certain parts of your site by web crawlers and spiders run by sites like Yahoo! and Google. By telling these "robots" where not to go on your site, you save bandwidth and server resources.
# This file will be ignored unless it is at the root of your host:
# Used: http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
disallow: /?cat=
disallow: /?price=
disallow: /?flavor=
disallow: /?dir=
disallow: /?mode=
disallow: /?list=
disallow: /?limit=5
disallow: /?limit=10
disallow: /?limit=15
disallow: /?limit=20
disallow: /*?limit=25
-
Clarification regarding robots.txt protocol
Hi,
Technical SEO | | nlogix
I have a website with more than 1,000 URLs, and all of them are already indexed in Google. Now I am going to stop all the available services on my website, and I have removed all the landing pages; only the home page is available. So I need to remove all the indexed URLs from Google. I have already used the robots.txt protocol for removing URLs, but I guess it is not a good method for adding a bulk amount of URLs (nearly 1,000) to robots.txt. So I just wanted to know: is there any other method for removing indexed URLs?
Please advise.
-
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed, and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards were still valid.
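For what it's worth, Google and Bing still honor the * (match any sequence of characters) and $ (end of URL) wildcards in robots.txt, though not every crawler does. A sketch of the syntax, using made-up paths:
User-agent: *
Disallow: /*?
Disallow: /*.pdf$
Disallow: /filters/
The first rule blocks any URL containing a query string, the second blocks PDFs wherever they live, and the third is an ordinary prefix rule for comparison.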
Technical SEO | | mkhGT
-
Will a sitemap generated in Yoast for a combined WordPress/Magento site map the entire site?
Hi, For an ecommerce site that's been developed via a combination of WordPress and Magento and has Yoast installed, will the sitemap (and other Yoast features) map (and apply to) the entire site, or just the WordPress aspects? In other words, does one need to do anything else to have a full sitemap for a combined Magento/WordPress site, or will Yoast cover it all? This link seems to suggest it should be fine, but I'm seeing if anyone else has encountered this and had problems, or if it's straightforward: http://fishpig.co.uk/wordpress-integration/docs/plugins.html
cheers, dan
Technical SEO | | Dan-Lawrence
-
How do you handle Wordpress sitemaps within your site?
I have a regular sitemap on my site, and I also have a WordPress site installed within it that we use for blog/news content. I currently have an auto-sitemap generator installed in WordPress which automatically updates the sitemap and submits it to the search engines each time the blog is updated. The question I have (which I think I know the answer to, but I just want to confirm) is: do I have to include all of the articles within the blog in the main site's sitemap, despite the WordPress sitemap having them in there already? If I do include the articles in the main website's sitemap, they would also be in the WordPress sitemap, which is redundant. Redundancy is not good, so I just want to make sure.
Technical SEO | | iresqkeith
Robots.txt file getting a 500 error - is this a problem?
Hello all! While doing some routine health checks on a few of our client sites, I spotted that a new client of ours - whose website was not designed or built by us - is returning a 500 Internal Server Error when I try to look at the robots.txt file. As we don't host or maintain their site, I would have to go through their head office to get this changed, which isn't a problem, but I just wanted to check whether this error will actually be having a negative effect on their site, and whether there's a benefit to getting this changed? Thanks in advance!
Technical SEO | | themegroup
Robots.txt File Redirects to Home Page
I've been doing some site analysis for a new SEO client, and it has been brought to my attention that their robots.txt file redirects to their homepage. I was wondering: Is there a benefit to setting up your robots.txt file to do this? Will this affect how their site gets indexed? Thanks for your response! Kyle
Site URL: http://www.radisphere.net/
Technical SEO | | kchandler