XML Sitemap Index Percentage (Large Sites)
-
Hi all,
I'd like to hear from those who have experience dealing with large sites (tens or hundreds of millions of pages).
What's a typical (or highest) percentage of indexed vs. submitted pages you've seen? You can find this in Google Webmaster Tools, which shows the pages submitted and indexed for each of your sitemaps.
I'm trying to figure out:
- What the average index percentage is out there
- Whether there's a ceiling (i.e. you'll never reach 100%)
- Whether it's possible to improve the indexing percentage further
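To make the numbers concrete, the figure in question is just indexed ÷ submitted per sitemap. A quick sketch with made-up counts (Webmaster Tools only exposes these numbers in its UI, so the data here is hypothetical):

```python
# Hypothetical submitted/indexed counts copied manually from Webmaster Tools;
# the sitemap filenames and numbers are made up for illustration.
sitemap_stats = {
    "sitemap-products-1.xml": {"submitted": 50000, "indexed": 41200},
    "sitemap-products-2.xml": {"submitted": 50000, "indexed": 12750},
    "sitemap-categories.xml": {"submitted": 1200, "indexed": 1188},
}

def index_percentage(stats):
    """Return {sitemap name: indexed/submitted as a percentage}."""
    return {
        name: round(100.0 * s["indexed"] / s["submitted"], 1)
        for name, s in stats.items()
    }

for name, pct in sorted(index_percentage(sitemap_stats).items()):
    print(f"{name}: {pct}%")
```

Breaking the ratio out per child sitemap like this is often more useful than one site-wide number, since it shows which page types Google is declining to index.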
Just to give you some background: sitemap index files (per the sitemaps.org protocol) have been implemented to improve crawl efficiency, and I want to find other ways to improve this further.
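For anyone unfamiliar, a sitemap index is just an XML file listing child sitemaps (the sitemaps.org protocol caps each child at 50,000 URLs / 50MB). A minimal generation sketch, with hypothetical filenames:

```python
from xml.sax.saxutils import escape

def build_sitemap_index(sitemap_urls):
    """Build a sitemaps.org-style <sitemapindex> document listing child sitemaps."""
    entries = "\n".join(
        f"  <sitemap>\n    <loc>{escape(u)}</loc>\n  </sitemap>"
        for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )

# Hypothetical child sitemaps, each kept under the 50,000-URL limit.
index_xml = build_sitemap_index([
    "https://www.example.com/sitemaps/products-1.xml",
    "https://www.example.com/sitemaps/products-2.xml",
])
print(index_xml)
```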
I've been thinking about excluding URL parameters (there are hundreds, as this is an e-commerce site) to help Google improve crawl efficiency and use the daily crawl quota more effectively, so it can discover pages that haven't been discovered yet.
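One related step is making sure only canonical, parameter-stripped URLs go into the sitemaps in the first place. A sketch of that idea, where the junk-parameter list is entirely hypothetical (the real list would come from auditing your own URLs):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical junk parameters for an e-commerce site: sort orders,
# session IDs, tracking tags. Replace with your own audited list.
JUNK_PARAMS = {"sessionid", "sort", "utm_source", "utm_medium", "utm_campaign", "ref"}

def canonicalize(url):
    """Strip known junk query parameters so only canonical URLs go in sitemaps."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in JUNK_PARAMS]
    # Rebuild the URL with the surviving parameters and no fragment.
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonicalize("https://shop.example.com/widgets?color=blue&sort=price&sessionid=abc123"))
```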
However, I'm not sure whether this is the best path to take, or whether I'm flogging a dead horse if there is such a ceiling, or if I'm already in the average ballpark for large sites.
Any suggestions/insights would be appreciated. Thanks.
-
I've worked on a site that was ~100 million pages, and I've seen indexation percentages ranging from 8% to 95%. When dealing with sites this size, there are so, so many issues at play, and there are so few sites of this size that finding an average probably won't do you much good.
Rather than focusing on whether you have enough pages indexed based on averages, focus on two key questions: "Do my sitemaps only include pages that would make great search engine entry pages?" and "Have I done everything possible to eliminate junk pages that are wasting crawl bandwidth?"
Of course, making sure you don't have any duplicate content, thin content, or poor on-site optimization issues should also be a focus.
I guess what I'm trying to say is: I believe any site can have 100% of its search-entry-worthy pages indexed, but sites of that size rarely have ALL of their pages indexed, since sites that large often have a ton of pages that don't make great search results.
Related Questions
-
Google Indexing our site
We have 700 city pages on our site. We submitted them to Google via https://www.samhillbands.com/sitemaps/locations.xml, but they have only indexed 15 so far. Yes, the content is similar on all of the pages... Thoughts on getting them to index the remaining pages?
Intermediate & Advanced SEO | | brianvest0 -
Is my site being penalized?
I've gone through all the points on https://mza.bundledseo.com/blog/technical-site-audit-for-2015, but after months the site only ranks for its brand name. The website is not ranking in the top 100 for any main keywords (2-, 3-, or 4-word phrases), only for a handful of very long phrases (4+ words). All of the content is unique, all pages are indexed, the website is fast, it doesn't have any crawl errors, and there are a couple of links pointing to it. There is a sitewide followed link in the footer pointing to another domain, its parent company, and vice versa. This is not done for any SEO reason; the companies are related and the products complement each other. Could this be an issue? Or is my site being penalized by something else?
Intermediate & Advanced SEO | | Robbern0 -
Is it Worthwhile to have an HTML Site Map for a Large Site
We are a large, enterprise site with many pages (some on our CMS and some old pages that exist outside our CMS). Every month we submit various XML site maps. Some pages on our site can no longer be found by following links from one page to another (orphan pages). Some of those pages are important and some are not. Is it worth our while to create an HTML site map? Does anyone have any recent stats or blog posts to share showing how an HTML site map may have benefited a large site? Many thanks
Intermediate & Advanced SEO | | CeeC-Blogger0 -
Incorrect cached page indexing in Google while correct page indexes intermittently
Hi, we are a South African insurance company. We have a page http://www.miway.co.za/midrivestyle which has a 301 redirect to http://www.miway.co.za/car-insurance. The problem is that the former page is ranking in the index rather than the latter. The latter page does occasionally index in the same position, but rarely. This is primarily for search phrases like "car insurance" and "car insurance quotes". The ranking was knocked down the index by Penguin 2.0. It was not ranking at all, but we have managed to recover to 12/13. This anomaly has only been occurring since the recovery. The correct page does index for other search terms like "insurance for car". Your help would be appreciated, thanks!
Intermediate & Advanced SEO | | miway0 -
Sitemaps
I am working with a site that has sitemaps broken down very specifically: by page type (article, page, etc.) and also by category. Unfortunately, this is not done hierarchically. Category and page type are separate maps; they are not nested. My question here is: Is it detrimental to have two separate sitemaps that point to the same pages? Should we eliminate one of these taxonomies, or maybe just make them hierarchical? i.e. item type -> category -> page title. Is there an issue with having a sitemap index that points to a nested sitemap index? (I don't think so, but might as well be sure.) Thanks Moz Community! Can't delete my question, but it turns out that isn't how they are structured. Food for thought anyway, I suppose.
Intermediate & Advanced SEO | | MarloSchneider0 -
Indexing techniques
Hi, I just want confirmation about my indexing technique: whether it is good or can be improved. The technique is totally white hat and can be done by one person. Any suggestions or improvements are welcome. I create the backlinks first, of course 🙂 I make a list in a public Google Doc; each doc has only ten links on it. Then I Digg it and add 5-6 more bookmarks. I tweet the Digg link and each doc (my 2 Twitter accounts have page authority 98). I like them on Facebook. I ping them through ping services. That's it. It works OK for the moment. Is there anything I can do to improve my technique? Thanks a lot
Intermediate & Advanced SEO | | nyanainc0 -
Where to link to HTML Sitemap?
After searching this morning and finding unclear answers, I decided to ask my SEOmoz friends a few questions. Should you have an HTML sitemap? If so, where should you link to the HTML sitemap from? Should you use a noindex, follow tag? Thank you
Intermediate & Advanced SEO | | cprodigy290 -
Load balanced Site
Our client's ecommerce site loads from 3 different servers using load balancing. abc.com: IP: 222.222.222 Abc.com: IP: 111.111.111 For testing purposes, 111.111.111 also points to beta.abc.com. Now Google is crawling beta.abc.com. If we block beta.abc.com using robots.txt, it will block Googlebot from abc.com as well, since beta.abc.com is really abc.com. I know it's confusing, but I've been trying to figure it out. Of course, I can ask my dev to remove beta.abc.com, make a separate codebase, and block it using .htaccess
Intermediate & Advanced SEO | | tpt.com0