XML Sitemap Issue or not?
-
Hi Everyone,
I submitted a sitemap in Google Webmaster Tools and received a warning listing 38 issues.
Issue: URL blocked by robots.txt.
Description: Sitemap contains URLs which are blocked by robots.txt.
Example: the URLs given were ones we don't want indexed: Sitemap: www.example.org/author.xml
Value: http://www.example.org/author/admin/
My concern is that the number of URLs indexed is pretty low, and I know for a fact that a bad robots.txt can hurt you, especially if it blocks URLs that need to be indexed. The blocked URLs appear to be ones we don't want indexed anyway, but the report doesn't display all the URLs that are blocked.
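For reference, here is a quick sketch of how that blocking can be reproduced locally with Python's standard-library robots.txt parser (the rule and URLs below are just illustrative, not our real robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules mirroring the warning: /author/ is disallowed.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /author/",
])

# URLs as they might appear in the sitemap (hypothetical).
for url in [
    "http://www.example.org/some-post/",
    "http://www.example.org/author/admin/",
]:
    status = "blocked" if not rp.can_fetch("*", url) else "allowed"
    print(url, "->", status)
```

Running each sitemap URL through a check like this shows exactly which entries trip the WMT warning, even though the report only shows a sample.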
Do you think I have a major problem, or is everything fine? What should I do? How can I fix it?
FYI: we use WordPress for our website.
Thanks
-
Hi Dan
Thanks for your answer. Would you really recommend using the plugin instead of just uploading the XML sitemap directly to the website's root directory? If so, why?
Thanks
-
Lisa
I would honestly switch to the Yoast SEO plugin. It handles the SEO (and robots.txt) a lot better, as well as the XML sitemaps, all within that one plugin.
I'd check out my guide for setting up WordPress for SEO on the moz blog.
Most WP robots.txt files will look like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

And that's it.
You could always just try changing yours to the above setting first, before switching to Yoast SEO - I bet that would clear up the sitemap issues.
Hope that helps!
-Dan
-
Lisa, try checking manually which URLs are not getting indexed in Google. Make sure you do not have any nofollow tags on those pages. If all the pages are connected / linked together, then Google will crawl your whole site eventually; it's just a matter of time.
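One way to do that manual check is to look at each page's robots meta tag. A minimal sketch with Python's standard-library HTML parser (the page source below is hypothetical; in practice you would fetch each URL and feed its HTML in):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

# Hypothetical page source containing a blocking directive.
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)
```

Any page whose directives include noindex or nofollow is a candidate for why it (or the pages it links to) isn't getting indexed.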
-
Hi
When generating a sitemap, xml-sitemaps.com detects 46 URLs, but when I add the sitemap to WMT only 12 get submitted and 5 are indexed, which is really worrying me. This might be because of the XML sitemap plugin that I installed; maybe something is wrong with my settings (docs 1 & 2 attached).
I am kind of lost, especially since SEOmoz hasn't detected any URLs blocked by robots.txt.
It would be great if you could tell me what I should do next.
Thanks
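As a sanity check on the plugin's output, the sitemap file itself can be parsed locally to see exactly which URLs it contains and compare that count against what WMT reports. A minimal sketch with hypothetical URLs:

```python
import xml.etree.ElementTree as ET

# A minimal sitemap fragment (hypothetical URLs); in practice, load the
# sitemap file your plugin generated.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.org/</loc></url>
  <url><loc>http://www.example.org/about/</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs in sitemap")
```

If the file the plugin generated really contains fewer URLs than the 46 that xml-sitemaps.com found, the plugin settings (excluded post types, categories, etc.) are the place to look.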
-
The first question I would ask is how big the difference is. If there is a large gap between the number of pages on your site and the number indexed by Google, then you have an issue. The blocked pages might be the ones linking to the pages that have not been indexed, which would cause problems. Try removing the nofollow on those pages, then resubmit your sitemap and see if that fixes the issue. Also double-check your sitemap to make sure you have correctly added all the pages to it.
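A simple way to quantify that gap is a set comparison between the URLs in your sitemap and the ones reported as indexed (the lists below are hypothetical; in practice export them from your sitemap and from WMT or a site: query):

```python
# Hypothetical URL lists for illustration.
sitemap_urls = {
    "http://www.example.org/",
    "http://www.example.org/about/",
    "http://www.example.org/contact/",
}
indexed_urls = {
    "http://www.example.org/",
}

# Sitemap URLs that Google has not (yet) indexed.
not_indexed = sorted(sitemap_urls - indexed_urls)
print(len(not_indexed), "sitemap URLs not yet indexed")
for url in not_indexed:
    print(url)
```

The resulting list is the set of pages worth checking for nofollow tags, robots.txt blocks, or missing internal links.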