Not All Submitted URLs in Sitemap Get Indexed
-
Hey Guys,
I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least when I check in Webmaster Tools: there is a difference of about 20% between the submitted and indexed counts. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right?
So I checked every single page in the sitemap manually by searching for site:"URL" in Google, and every single page shows up. In reality every page should therefore be indexed, so why does Webmaster Tools show something different?
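Checking every URL by hand gets tedious for a large sitemap. A small script can at least pull the URL list out of the sitemap file so the site: queries can be run systematically (a minimal sketch; example.com and the inline sitemap are placeholders, not your real data):

```python
import xml.etree.ElementTree as ET

# The sitemaps.org protocol puts every element in this namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    # Paste each of these queries into Google to spot-check indexing.
    print("site:" + url)
```

In practice you would read the XML from your live sitemap file instead of the inline sample.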
Thanks for your help on this
Cheers
-
Thanks Dan, but I have registered the right URL version (http).
However, today the submitted URLs show as 100% indexed again (I changed nothing). Really crazy.
Cheers,
Heiko
-
This can happen if you don't have the correct version of your URL registered in Webmaster Tools, so that's something to check.
-
Hi There
One thing to check: do you have the exact version of the domain registered in Webmaster Tools? That is, www or non-www, and http or https? This has to match exactly; Webmaster Tools considers them all different sites, and you can get limited data if the wrong one is registered.
That would be the most common cause of a discrepancy. If that's not the case, Webmaster Tools data can often lag behind, or simply differ from, the live index. I would treat what you see in actual Google searches as the "final answer", though.
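To make the check concrete: the four variants below are what Webmaster Tools treats as four separate sites. A quick sketch that enumerates them for a given URL (example.com is a placeholder):

```python
from itertools import product
from urllib.parse import urlsplit, urlunsplit

def property_variants(url):
    """All four http/https x www/non-www versions of a URL.
    Webmaster Tools treats each of these as a different site."""
    parts = urlsplit(url)
    host = parts.netloc[4:] if parts.netloc.startswith("www.") else parts.netloc
    return [urlunsplit((scheme, prefix + host, parts.path, "", ""))
            for scheme, prefix in product(("http", "https"), ("", "www."))]

print(property_variants("http://example.com/"))
# ['http://example.com/', 'http://www.example.com/',
#  'https://example.com/', 'https://www.example.com/']
```

Only the variant that matches your canonical URLs exactly will report full sitemap data.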
-
I get the same thing. Nobody here seems to know the answer (I asked a similar question in the last week or so). If the pages are there when you do a manual search, I wouldn't sweat it; I have taken the view that it's not worth worrying about!
Good luck Amelia
-
I haven't changed the sitemap in the last 4 months. At the beginning the numbers matched exactly, so submitted and indexed URLs were the same. But this week I noticed that about 20% are no longer indexed. That confused me at first, but the manual check showed that everything is fine.
However, I would just like to know why there is this difference in Webmaster Tools...
Cheers
-
That's clear, but it has nothing to do with my original question. I just want to know why Webmaster Tools doesn't display the right number of indexed pages from the sitemap. It would be the easiest way to notice when pages get de-indexed for whatever reason.
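If you can get the two URL lists side by side (submitted in the sitemap vs. actually seen in the index, e.g. from manual site: checks), the de-indexed pages fall out of a simple set difference. A sketch with placeholder URLs:

```python
def missing_from_index(submitted_urls, indexed_urls):
    """URLs submitted in the sitemap but not seen in the index."""
    return sorted(set(submitted_urls) - set(indexed_urls))

submitted = ["http://example.com/", "http://example.com/a", "http://example.com/b"]
indexed   = ["http://example.com/", "http://example.com/a"]

# The one URL that was submitted but never showed up in the index:
print(missing_from_index(submitted, indexed))  # ['http://example.com/b']
```

Re-running this whenever the Webmaster Tools count drops tells you exactly which pages to investigate, which is the information the dashboard itself doesn't give.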
-
Hi there
This is pretty common. The numbers Google shows in Webmaster Tools and what actually appears in the index can differ. When did you submit your sitemap?
Here are some reasons Google may not index all of your pages.
Check your robots.txt to be sure, but give the indexing number in WMT a bit of time to update. The good news is that you are seeing your pages in search, so that's a positive.
I would also check whether you have any duplicate or thin content on the website, dynamic URLs in your sitemap, how deep your pages sit (especially important because of crawl budgets), and your site's canonical tag setup.
These are some things I would look into. Hope this helps! Good luck!
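One of those checks, robots.txt, can be verified locally with Python's standard library before blaming Webmaster Tools. The rules below are placeholders standing in for your own robots.txt:

```python
import urllib.robotparser

# Stand-in robots.txt content; in practice, fetch your site's real file.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Anything under /private/ is blocked, so it can be crawled but not fetched.
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/public.html"))        # True
```

Running your sitemap URLs through can_fetch quickly shows whether any of the "missing" pages are simply disallowed.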
-
A sitemap does not ensure you are in the index; it just informs the search engine about your site.
In fact, Bing suggests you only put hidden pages and important pages in a sitemap.
IMO sitemaps are overrated unless you have something special to inform the engines of, or a very large site; otherwise they will find your pages by crawling your site normally.
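For reference, when a sitemap is worth having, a minimal valid file per the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
</urlset>
```

Only loc is required per URL; lastmod and the other optional tags are hints that engines may or may not use.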