Only half of the sitemap is indexed
-
I have a website with high domain authority and high-quality content and a blog. I've resubmitted the sitemap half a dozen times. Search Console gets halfway through and then stops. Does anyone know any reason for this?
I've seen the usual responses of 'Google is not obligated to crawl you', but this site has been fully crawled in the past. It's very odd.
Does anyone have any ideas why it might stop halfway - or does anyone know a testing tool that might illuminate the situation?
-
Hi Andrew
Here are a few things to check or rule out:
- Are those pages accessible to crawlers (not blocked by robots.txt, etc.)? See the sketch at the end of this answer for a quick way to bulk-check this.
- Are they also internally linked? (i.e. crawl with Screaming Frog, starting at the homepage, and see if they turn up.)
- Is the page actually indexed (search the URL in Google) but just not showing up in Search Console?
- How long are you waiting before resubmitting? Also, does it literally get halfway down the list, or do you mean 50% of the URLs are not indexed?
Overall, I would just submit the sitemap once - you don't need to keep resubmitting. I would rather do some cross-checks to make sure the URLs are accessible (crawlable), and maybe even already indexed, just not showing in the report. Usually there's some other issue with the URL besides a sitemap issue - and like you mentioned, I'm not sure how long you're waiting, but it can indeed take weeks for URLs to show up.
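If it helps, here's a rough sketch of that bulk check (Python standard library only; the sitemap URL is a placeholder, and it assumes a plain <urlset> sitemap rather than a sitemap index). It pulls every URL out of the sitemap, tests each one against robots.txt the way Googlebot would, and flags anything that doesn't come back with a 200. Treat it as a quick cross-check aid, not a full crawl like Screaming Frog would give you.

```python
import urllib.error
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder - swap in your own sitemap
USER_AGENT = "Googlebot"  # test the rules as Google's crawler would see them

# Pull every <loc> URL out of the sitemap (assumes a plain <urlset>, not a sitemap index).
with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

# Load the site's robots.txt so each URL can be tested against it.
site = urlparse(SITEMAP_URL)
robots = urllib.robotparser.RobotFileParser()
robots.set_url(f"{site.scheme}://{site.netloc}/robots.txt")
robots.read()

# Flag anything blocked by robots.txt or not returning a normal 200.
for url in urls:
    if not robots.can_fetch(USER_AGENT, url):
        print(f"BLOCKED by robots.txt: {url}")
        continue
    req = urllib.request.Request(url, method="HEAD", headers={"User-Agent": USER_AGENT})
    try:
        with urllib.request.urlopen(req) as r:
            if r.status != 200:
                print(f"HTTP {r.status}: {url}")
    except urllib.error.HTTPError as e:
        print(f"HTTP {e.code}: {url}")
```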
-
Related Questions
-
What kind of impact does a 404 in a sitemap have on ranking?
We recently had a site update where our robots.txt file disallowed our sitemap for about two weeks. When we found the problem and resubmitted the sitemap to Google Search Console, it found a 404 error. Does this have any impact on ranking or visibility if we are still recovering from the disallow?
Algorithm Updates | GaryBlanchard
-
Google Search Console Not Indexing Pages
Hi there! I have a problem that I was hoping someone could help me with. In Google Search Console, my website does not seem to be indexed well. In fact, even after rectifying problems that Moz's on-demand crawl has pointed out, it still does not become "valid". Google has pointed out some excluded pages; I have rectified some of the issues, but it doesn't seem to be helping. However, when I submitted the sitemap, it says the URLs were discoverable, hence I am not sure why they can be discovered but are not deemed "valid". I would sincerely appreciate any suggestions or insights as to how I can go about solving this issue. Thanks!
Attachments: Screenshot (341).png, Screenshot (342).png, Screenshot (343).png
Algorithm Updates | Chowsey
-
Will a page be indexed if it's published without being linked from anywhere?
Hi all, I have noticed one page on a competitor's website that is hardly linked at all - it's linked from just one internal page. I would just like to know: will a page that isn't linked from anywhere get indexed by Google or not? Will it be found by Google? What about a page that isn't linked internally but has some backlinks from other websites? Thanks
Algorithm Updates | vtmoz
-
Does a sitemap really matter today?
Hi. Our website has multiple subdomains and the platform is WordPress, but we haven't submitted a sitemap to Google. Is this okay, or is it mandatory to submit a sitemap? Will submitting a sitemap help improve our rankings?
Algorithm Updates | vtmoz
-
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" just won't change even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still be reporting a high index count that doesn't reflect what is currently indexed? Thanks!
Algorithm Updates | CassisGroup
-
Are XML sitemaps a thing of the past?
We had an internal debate about the importance of having a sitemap.xml on your website. Basically, there is Google documentation that indicates a sitemap.xml is due diligence: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156184 And there are other authoritative forums, blog posts, etc. which indicate that sitemap creation and maintenance is a waste of your time, e.g. http://webmasters.stackexchange.com/questions/4803/the-sitemap-paradox/ A bigger question is: are there cases in which not having a sitemap.xml actually became detrimental or risky? Thanks in advance!
Algorithm Updates | HZseo
-
Removing secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content), e.g. https://secure.domain.com/etc. Our webmaster recently implemented a robots.txt file specific to the secure subdomain that disallows everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages being indexed, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you,
Algorithm Updates | marketing_zoovy.com
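A side note on that last question: one quick way to sanity-check a setup like this is to confirm the secure subdomain really serves its own blanket-disallow robots.txt and to see whether the duplicated URLs still return 200 (a robots.txt block only stops crawling; it doesn't drop URLs that are already indexed). Here's a rough sketch - secure.domain.com and the sample paths are placeholders taken from the question, not real URLs.

```python
import urllib.request
import urllib.robotparser

# Placeholders lifted from the question - replace with the real subdomain and a few real paths.
SUBDOMAIN = "https://secure.domain.com"
SAMPLE_PATHS = ["/", "/login.cgis", "/customer/"]

# Confirm the subdomain-specific robots.txt really disallows everything for Googlebot.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(f"{SUBDOMAIN}/robots.txt")
robots.read()
for path in SAMPLE_PATHS:
    allowed = robots.can_fetch("Googlebot", f"{SUBDOMAIN}{path}")
    print(f"robots.txt allows {path}: {allowed}")

# Spot-check whether the duplicated pages still respond with 200 - a robots.txt block
# stops crawling, but URLs that already made it into the index can stay there.
for path in SAMPLE_PATHS:
    req = urllib.request.Request(f"{SUBDOMAIN}{path}", method="HEAD")
    try:
        with urllib.request.urlopen(req) as r:
            print(f"{path} -> HTTP {r.status}")
    except Exception as e:
        print(f"{path} -> {e}")
```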