Help needed with robots.txt regarding WordPress!
-
Here is my robots.txt from Google Webmaster Tools. Below are examples of the pages being blocked, and I am not sure which of these rules to remove in order to unblock blog posts from being searched.
http://ensoplastics.com/theblog/?cat=743
http://ensoplastics.com/theblog/?p=240
These category pages and blog posts are blocked, so do I delete the Disallow: /? rule? I am new to SEO and web development, so I am not sure why the developer of this robots.txt file would block pages and posts in WordPress. It seems to me that the whole reason to have a blog is so it can be searched and gain more exposure for SEO purposes.
Is there a reason I should block any pages in WordPress?
Sitemap: http://www.ensobottles.com/blog/sitemap.xml
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
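For what it's worth, Google treats * in these rules as a wildcard, so Disallow: /*? matches any URL containing a query string, which is exactly why the ?cat= and ?p= URLs above are blocked. A rough sketch of that matching logic (google_rule_matches is a hypothetical helper written for illustration, not part of any library):

```python
import re

def google_rule_matches(rule: str, path: str) -> bool:
    """Approximate Google's robots.txt matching: '*' matches any
    run of characters, '$' anchors the end, and the rule is
    otherwise a prefix match against the URL path (plus query)."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# "Disallow: /*?" matches anything with a query string...
print(google_rule_matches("/*?", "/theblog/?p=240"))      # True
print(google_rule_matches("/*?", "/theblog/?cat=743"))    # True
# ...but leaves "pretty" permalink paths alone.
print(google_rule_matches("/*?", "/theblog/some-post/"))  # False
```

Note that plain rules like Disallow: /page/ are prefix matches, so they also block everything underneath that path.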
-
I've just looked at the home pages of the two sites, and they are pretty much the same apart from substituting plastics with bottles. I'm not an expert, but I would have thought Google might treat this as duplicate content.
In my opinion, I would concentrate on one of the sites, say plastics, and have the bottle-specific content as a subsection. I'm not sure how the sites rank, so that may be easier said than done.
As for the sitemap/robots question: if you continue with two sites, then I would recommend generating a new sitemap for the copied site.
-
So basically this site was duplicated, and apparently the robots.txt file was duplicated along with it. There is no sitemap for the blog on the Enso Plastics site, so I am not sure how to proceed at this point. Should I just create a new robots.txt file for ensoplastics and replace this one? Or do I edit this one and create a sitemap for my blog?
-
Well, that is a problem, isn't it? Like I said, I am new to a lot of this and I didn't develop either site; this robots.txt file is pointing to the wrong sitemap, so I am going to change that.
However, I am guessing I may also need to change some of the rules so that WordPress content is no longer blocked.
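A minimal sketch of what the edited rules might look like, assuming the goal is to keep admin and plugin areas blocked while letting posts and category URLs be crawled (the sitemap URL is hypothetical until one is actually generated for the plastics blog):

```
Sitemap: http://ensoplastics.com/theblog/sitemap.xml

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /trackback
Disallow: /feed
```

The key change is dropping Disallow: /?, Disallow: /*?, and Disallow: /page/, which are the rules blocking the query-string posts and category pages.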
-
I'm a bit confused. You reference ensoplastics.com at the top and then show the robots.txt from ensobottles.com.
Are they using the same robots.txt content? The sites use different URL naming: ensobottles uses rewritten URLs, whereas the other site uses ?p= query strings.
-
Related Questions
-
Role of Robots.txt and Search Console parameters settings
Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has url parameters disallowed in Robots.txt is it redundant to edit settings in Search Console parameters to anything other than "Let Googlebot Decide"?
Technical SEO | LivDetrick
-
Clarification regarding robots.txt protocol
Hi,
I have a website with more than 1000 URLs, all of which are already indexed in Google. Now I am going to stop all the services available on the website, and I have removed all the landing pages; only the home page remains. So I need to remove all the indexed URLs from Google. I have already used the robots.txt protocol for removing URLs, but I guess it is not a good method for adding a bulk amount of URLs (nearly 1000) to robots.txt. So I just wanted to know: is there any other method for removing indexed URLs?
Please advise.
Technical SEO | nlogix
-
Need Joomla help: site rankings dropped since upgrade
Hi, we're really having problems here with www.in2town.co.uk. Our site was always in the top ten for a large number of important keywords, but since our upgrade from Joomla 1.5 to Joomla 3.0 a month ago, we have vanished from a lot of the search engines for important keywords, and I am concerned that we could be doing something wrong. Two of our important keywords are "lifestyle magazine" and "gastric band hypnotherapy", but we are nowhere to be seen for these. The site is an old one; we have been around for many years and have always ranked well until now.
We use K2, and I am wondering if we have set things up wrong. For example, this article comes under the Trip Advisor category: http://www.in2town.co.uk/trip-advisor/t ... -term-stay. When you delete the article name and keep just the category part, you get a list of the articles, and I am wondering if this is damaging my site: http://www.in2town.co.uk/trip-advisor. Any help in understanding why my site has dropped in rankings would be great.
Technical SEO | ClaireH-184886
-
Canonical needed after noindex
Hi, do you need to point a canonical from a subpage to the main page if you have already marked the subpage noindex? Since Google is not indexing it, do we still need the canonical, and does it pass any juice?
Technical SEO | razasaeed
-
Robots.txt anomaly
Hi, I'm monitoring a site that's had a new design relaunch and a new robots.txt added. Over the week since launch, Webmaster Tools has shown a steadily increasing number of blocked URLs (now at 14), yet the robots.txt file has only 12 lines with the Disallow command. Could this be happening because one line can refer to more than one page/URL? They all look like single URLs, for example:
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
etc. And is it normal for Webmaster Tools' reporting of robots.txt-blocked URLs to steadily increase in number over time, as opposed to the URLs being identified straight away? Thanks in advance for any help/advice/clarity on why this may be happening. Cheers, Dan
Technical SEO | Dan-Lawrence
-
301 Redirect Help
How would you 301 redirect an entire folder to a specific file within the same domain? Scenario: www.domain.com/folder to www.domain.com/file.html. Thanks for your input.
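One common way to do this on Apache is mod_alias's RedirectMatch directive, sketched here with the placeholder paths from the question:

```
# In .htaccess or the vhost config: send /folder and everything
# under it to the single target file with a permanent redirect.
RedirectMatch 301 ^/folder(/.*)?$ http://www.domain.com/file.html
```

The (/.*)? part makes the rule match /folder itself as well as every path beneath it.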
Technical SEO | dhidalgo1
-
Robots.txt query
Quick question: if this appears in a client's robots.txt file, what does it mean? Disallow: /*/_/ Does it mean no pages can be indexed? I have checked, and there are no pages in the index, but it's a new site too, so I'm not sure if this is the problem. Thanks, Karen
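For what it's worth, under wildcard handling like Google's, /*/_/ only blocks URLs whose path contains a /_/ segment; it does not block the whole site. A rough illustration (the regex translation of * is an assumption about how such wildcards expand, and the sample paths are made up):

```python
import re

# Translate the robots pattern "/*/_/" into a regex: "*" becomes ".*",
# everything else stays literal; robots rules are prefix matches.
pattern = re.compile(re.escape("/*/_/").replace(r"\*", ".*"))

print(bool(pattern.match("/assets/_/img/logo.png")))  # True: has a /_/ segment
print(bool(pattern.match("/products/widget.html")))   # False: unaffected
```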
Technical SEO | Karen_Dauncey
-
Allow or Disallow First in Robots.txt
If I want to override a Disallow directive in robots.txt with an Allow directive, do I put the Allow before or after the Disallow? Example: Allow: /models/ford///page* Disallow: /models////page
Technical SEO | irvingw