Best WordPress robots.txt Sitemap Practice?
-
Alright, my question comes directly from this article by SEOmoz:
http://www.seomoz.org/learn-seo/robotstxt

Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps to robots.txt. Does that mean I erase everything in the robots.txt right now and replace it with this?

<code>User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml</code>

I ask because WordPress comes with some default disallows like wp-admin, trackback, and plugins. I have also read other questions, but was wondering if this is the correct way to add a sitemap to a WordPress robots.txt:
http://www.seomoz.org/q/robots-txt-question-2
http://www.seomoz.org/q/quick-robots-txt-check
http://www.seomoz.org/q/xml-sitemap-instruction-in-robots-txt-worth-doing

I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Hmm, that doesn't sound right. Here is my current robots.txt:
<code>User-agent: *
Disallow:
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /comments</code>

Do I erase everything and change it to the following?
<code>User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml</code>
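If it helps to sanity-check the file before deciding, Python's standard-library robots.txt parser can confirm that several <code>Sitemap:</code> lines are picked up even when the WordPress disallows are kept. This is just a sketch using the example.com URLs from above; <code>site_maps()</code> needs Python 3.8+.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that keeps the WordPress disallows
# AND adds the sitemap locations, rather than erasing everything.
robots_txt = """\
User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Both sitemap lines are reported, independent of any User-agent group
print(rp.site_maps())

# The disallow rules still block crawling of wp-admin for all agents
print(rp.can_fetch("*", "http://www.example.com/wp-admin/options.php"))
```

Since <code>Sitemap:</code> lines are not tied to a <code>User-agent</code> group, they can sit anywhere in the file, and listing one per sub-site is valid — so there should be no need to erase the default disallows just to add them.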
-
This is a duplicate question and was answered in another thread.
Thanks!