Help needed with robots.txt regarding WordPress!
-
Here is my robots.txt from Google Webmaster Tools. These are the pages that are being blocked, and I am not sure which of these rules to get rid of in order to unblock the blog posts from being searched.
http://ensoplastics.com/theblog/?cat=743
http://ensoplastics.com/theblog/?p=240
These category pages and blog posts are blocked, so do I delete the /? rules? I am new to SEO and web development, so I am not sure why the developer of this robots.txt file would block pages and posts in WordPress. It seems to me that the whole reason someone has a blog is so it can be searched and get more exposure for SEO purposes.
Is there a reason I should block any pages contained in WordPress?
Sitemap: http://www.ensobottles.com/blog/sitemap.xml
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
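For anyone wondering why URLs like /theblog/?p=240 get caught: Googlebot treats * in a Disallow path as a wildcard, so Disallow: /*? blocks any URL containing a query string. Here's a rough Python sketch of that matching logic (my own illustration, not Google's actual implementation) that you can use to sanity-check URLs against these rules:

```python
import re

# Googlebot-style rule matching (simplified sketch):
# '*' matches any run of characters, and a rule matches
# as a prefix of the URL path + query string.
def rule_matches(rule, url_path):
    pattern = re.escape(rule).replace(r"\*", ".*")
    # a trailing '$' in a rule anchors the match to the end of the URL
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The Googlebot rules from the robots.txt above
rules = ["/*/trackback", "/*/feed", "/*/comments", "/?", "/*?", "/page/"]

def is_blocked(url_path):
    return any(rule_matches(r, url_path) for r in rules)

print(is_blocked("/theblog/?p=240"))    # blocked by /*?
print(is_blocked("/theblog/?cat=743"))  # blocked by /*?
print(is_blocked("/theblog/my-post/"))  # not blocked
```

Running URLs through something like this makes it obvious that the /*? rule (not the trackback/feed ones) is what's hiding the posts and category pages.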
-
I've just looked at the home pages of the two sites and they are pretty much the same, apart from substituting plastics with bottles. I'm not an expert, but I would have thought Google might think this is duplicate content.
In my opinion I would concentrate on one of the sites, say plastics, and have the bottle-specific stuff as a subsection. I'm not sure how the sites rank etc., so that may be easier said than done.
As for the sitemap/robots question: if you continue with two sites, then I would recommend generating a new sitemap for the copied site.
-
So basically this site was duplicated, and apparently the robots.txt file was duplicated with it. There is no sitemap for the blog created for the enso plastics site, so I am not sure how to proceed at this point. Should I just create a new robots.txt file for ensoplastics and replace this one? Or do I edit this one and go create a sitemap for my blog?
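If you do end up creating a fresh sitemap for the ensoplastics blog, the file itself is simple; a WordPress sitemap plugin will generate something along these lines (the URLs and date below are just placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://ensoplastics.com/theblog/?p=240</loc>
    <lastmod>2011-10-01</lastmod>
  </url>
  <url>
    <loc>http://ensoplastics.com/theblog/?cat=743</loc>
  </url>
</urlset>
```

Once it exists, point the Sitemap: line in ensoplastics.com's robots.txt at it and submit it in Webmaster Tools.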
-
Well, that is a problem, isn't it? Like I said, I am new to a lot of this and I didn't develop either site; this robots.txt file is pointing to the wrong sitemap. So I am going to change that.
However, I am guessing I may also need to change some of the rules so that it stops blocking WordPress content.
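For what it's worth, here is one way the rules could look after dropping the query-string blocking; just a sketch to adapt, and the Sitemap URL assumes you create one on ensoplastics.com:

```text
Sitemap: http://ensoplastics.com/theblog/sitemap.xml

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /feed
Disallow: /comments
```

This keeps the admin/system directories and the trackback/feed/comment URLs blocked while letting Googlebot reach ?p= posts and ?cat= category pages.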
-
I'm a bit confused. You reference ensoplastics.com at the top and then show the robots.txt from ensobottles.com.
Are they using the same robots.txt content? The sites use different URL naming: ensobottles uses rewrites, whereas the other site uses ?p=.