Best WordPress Robots.txt Sitemap Practice?
-
Alright, my question comes directly from this article by SEOmoz
http://www.seomoz.org/learn-seo/robotstxt
Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps. Does that mean I should erase everything in the robots.txt right now and replace it with the following?
<code>User-agent: *
Disallow:

Sitemap: http://www.example.com/none-standard-location/sitemap.xml</code>
I ask because WordPress comes with some default disallows like wp-admin, trackback, and plugins. I have also read these other questions, but was wondering if this is the correct way to add a sitemap to a WordPress robots.txt:
http://www.seomoz.org/q/robots-txt-question-2
http://www.seomoz.org/q/quick-robots-txt-check
http://www.seomoz.org/q/xml-sitemap-instruction-in-robots-txt-worth-doing
I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Hmm, that doesn't sound right. My current robots.txt is:
<code>User-agent: *
Disallow:
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /comments</code>
**Erase everything and change it to this?**
<code>User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml</code>
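As a side note for anyone wanting to sanity-check a robots.txt like the one above before uploading it: Python's standard library can parse it locally and report both the crawl rules and the Sitemap lines. This is just a verification sketch, not an SEO recommendation; the rules and URLs below are placeholders based on the question, not the asker's real file.

```python
import urllib.robotparser

# A candidate robots.txt (placeholder rules/URLs modeled on the question).
robots_txt = """User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes

Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallowed path is blocked for all user agents:
print(rp.can_fetch("*", "http://www.example.com/wp-admin/options.php"))  # False

# A normal content URL is still crawlable:
print(rp.can_fetch("*", "http://www.example.com/blog/hello-world/"))  # True

# Both Sitemap directives are picked up (site_maps() needs Python 3.8+):
print(rp.site_maps())
```

This confirms that Sitemap lines sit outside the User-agent/Disallow blocks, so adding them does not change which URLs are blocked or allowed.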
-
This is a duplicate question and was answered in another thread.
Thanks!