Robots.txt file issue on WordPress site
-
I'm facing an issue with the robots.txt file on my blog.
Two weeks ago I did some development work on the blog and added a few pages to the robots.txt file. Now my complete site seems to be blocked. I have checked and updated the file, but the issue persists.
The search result shows: "A description for this result is not available because of this site's robots.txt – learn more."
Any suggestions for overcoming this issue?
-
What the problem is with your site I can't say, because we don't have the URL.
But having said that, why block pages at all? It's rare that you genuinely need to. I remember reading that Google sees blocked resources as a spam signal; by itself that's not a problem, but mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use a meta noindex, follow tag instead.
You want the links to be followed so that link juice flows back out through them to indexed pages; otherwise you will be pouring link juice away via any links pointing to noindex pages.
-
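As a sketch of the page-level alternative described above, the directive goes in each page's head as a robots meta tag:

```html
<!-- Placed in the <head> of each page you want kept out of the index.
     "noindex" keeps the page out of search results, while "follow"
     still lets crawlers pass link equity through its outbound links. -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow rule, this lets the page be crawled, so search engines can actually see and obey the directive.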
Hi
You are most likely still seeing "A description for this result is not available..." in Google because they may not have re-cached the page yet.
If the site is crawlable with a tool like Screaming Frog SEO Spider, or accessible via a header checker like http://urivalet.com/, then it's also accessible to Google; you just need to wait for them to re-crawl.
Let us know if you need more help!
-
The most likely lines in your robots.txt to cause this:
User-agent: *
Disallow: /
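If it helps to verify, Python's standard-library robots.txt parser can show exactly what those two lines do to a crawler. A minimal sketch (example.com and the paths are placeholders, no network access needed):

```python
from urllib.robotparser import RobotFileParser

# Feed the suspect rules straight to the parser.
blocked_everything = RobotFileParser()
blocked_everything.parse([
    "User-agent: *",
    "Disallow: /",  # a bare "/" blocks every URL on the site
])

# A version that blocks only a specific page instead.
blocked_some = RobotFileParser()
blocked_some.parse([
    "User-agent: *",
    "Disallow: /private-page/",
])

print(blocked_everything.can_fetch("Googlebot", "https://example.com/blog/"))    # False
print(blocked_some.can_fetch("Googlebot", "https://example.com/blog/"))          # True
print(blocked_some.can_fetch("Googlebot", "https://example.com/private-page/"))  # False
```

Pasting your own robots.txt lines into `parse()` is a quick way to confirm whether a rule blocks more than you intended.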
-
Hi there
Check your robots.txt settings in WordPress:
Admin Dashboard > Settings > General > Privacy -- what is the setting there?
I would also read up on robots.txt and see if anything stands out to you. Then take a look at your XML sitemap and submit it to Google Webmaster Tools and Bing Webmaster Tools.
Like the others, I would like to see your robots.txt file as a live example. This shouldn't be anything major!
-
It's very easy to block your entire site from Google with robots.txt. Are you using Yoast SEO as your SEO plugin? With it, there shouldn't really be any need to block anything yourself.
Drop your URL here and we can take a look.
-Andy
-
Could you post the URL to your robots.txt file or post the content of the file itself?
-
If you wish to block your pages from search engines, you should use noindex rather than Disallow in robots.txt. In many cases, a robots.txt file does more harm than good.
What is your URL?