Not All Submitted URLs in Sitemap Get Indexed
-
Hey Guys,
I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least when I check in Webmaster Tools. There is a difference of about 20% between the submitted and indexed URLs. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right?
So I checked every single page in the sitemap manually by putting site:"URL" into Google, and every single page from the sitemap shows up. So in reality every page should be indexed, but why does Webmaster Tools show something different?
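In case it helps anyone, here's a rough sketch of how that manual check can be scripted: it pulls the URLs out of a standard XML sitemap and prints the site: queries to paste into Google. The inline sitemap and URLs below are made-up examples; in practice you'd read your real sitemap.xml instead.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, as defined by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# A tiny inline sitemap for illustration; point this at your real
# sitemap.xml file (or fetch it over HTTP) instead.
EXAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

for url in sitemap_urls(EXAMPLE_SITEMAP):
    # Paste each of these into Google to spot-check indexing by hand.
    print("site:" + url)
```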
Thanks for your help on this
Cheers
-
Thanks Dan, but I have registered the right URL (http).
However, today the submitted URLs are showing as 100% indexed again (I changed nothing). Really crazy.
Cheers,
Heiko
-
This can happen if you don't have the correct version of your URL registered in Webmaster Tools, so that's something to check.
-
Hi There
One thing to check - do you have the exact version of the domain registered in Webmaster Tools? That is, www or non-www, and http or https? This has to be exact; Webmaster Tools considers them all different sites, and you can get limited data if the wrong one is registered.
That would be the biggest cause of a discrepancy. If that's not the case, Webmaster Tools data can often lag behind, or differ from, the index. I would go with what you see in actual Google searches as the "final answer", though.
-
I get the same thing. Nobody on here seems to know the answer (I asked a similar question in the last week or so). If the pages are there when you do a manual search, then I wouldn't sweat it. I've taken the view that it's not worth worrying about!
Good luck Amelia
-
I haven't changed the sitemap in the last 4 months. At the beginning the numbers matched exactly, so submitted and indexed URLs were the same. But this week I noticed that about 20% are no longer showing as indexed. That confused me, but the manual check showed that everything is OK.
However, I would just like to know why there is this difference in Webmaster Tools...
Cheers
-
That's clear, but it has nothing to do with my original question. I just wanted to know why Webmaster Tools doesn't display the right number of indexed pages from the sitemap. It would be the easiest way to notice when pages get de-indexed for whatever reason.
-
Hi there
This is pretty common. Google sometimes shows different numbers in Webmaster Tools than what actually appears in the index. When did you submit your sitemap?
Here are some reasons that Google may not index all of your pages.
Check your robots.txt to be sure, but give the indexing number in WMT a bit of time to update. The good news is that you are seeing your pages in search, so that's a positive.
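If you want to test your sitemap URLs against robots.txt directly, Python's built-in robots.txt parser works for a quick sanity check. The rules and URLs below are just made-up examples; normally you'd load your live robots.txt instead.

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Normally you'd point this at the live file with
# rp.set_url("https://example.com/robots.txt") and rp.read();
# here we feed example rules in directly for illustration.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Any sitemap URL that comes back False here is blocked from crawling.
print(rp.can_fetch("Googlebot", "https://example.com/page"))       # allowed
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # blocked
```

Loop that check over every URL from your sitemap and you'll know in seconds whether robots.txt is the culprit.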
I would also check whether you have any duplicate or thin content on the website, dynamic URLs in your sitemap, how deep your pages go (especially important because of crawl budgets), and your website's canonical tag situation.
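For the crawl-depth point, counting path segments in your sitemap URLs gives a rough proxy you can eyeball (just a sketch with made-up URLs, not a real crawl-depth measurement, which depends on internal linking):

```python
from urllib.parse import urlparse

def path_depth(url):
    """Count non-empty path segments as a rough depth proxy."""
    return len([seg for seg in urlparse(url).path.split("/") if seg])

# Made-up URLs for illustration; feed in your own sitemap URLs.
for u in ["https://example.com/", "https://example.com/blog/2014/01/post"]:
    print(path_depth(u), u)
```

Anything sitting four or five segments deep is worth a closer look from a crawl-budget perspective.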
These are some things I would look into. Hope this helps! Good luck!
-
A sitemap does not ensure you are in the index; it just informs the search engine about your site.
In fact, Bing suggests you only put hidden pages and important pages in the sitemap.
IMO they are overrated unless you have something special to inform them of, or a very large site; they will find it by crawling your site normally.