Google Sitemap only indexing 50%. Is that a problem?
-
We have about 18,000 pages submitted in our Google Sitemap and only about 9,000 of them are indexed. Is this a problem?
We have a script that regenerates the sitemap daily, and it is resubmitted daily. Am I better off only doing it once a week? Is this why I never get to the full 18,000 indexed?
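For context, the daily job in question amounts to something like this rough Python sketch (get_all_page_urls is a hypothetical stand-in for however the catalog is enumerated):

    import datetime

    def get_all_page_urls():
        # Hypothetical: in practice this would query the product database.
        return ["https://www.example.com/product-1",
                "https://www.example.com/product-2"]

    def build_sitemap(urls, path="sitemap.xml"):
        # One sitemap file may hold up to 50,000 URLs, so 18,000 fits in one.
        today = datetime.date.today().isoformat()
        entries = "\n".join(
            "  <url><loc>{}</loc><lastmod>{}</lastmod></url>".format(u, today)
            for u in urls)
        with open(path, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                    + entries + '\n</urlset>\n')

    build_sitemap(get_all_page_urls())

Resubmitting daily is generally harmless; a sitemap is a hint to Google, not a command, so submission frequency by itself does not cap how many of the 18,000 get indexed.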
-
My robots, tags and redirects are all good now. Any other things to look at?
-
Have you done some troubleshooting? If there's that big of a percentage change, did you check your robots, tags, redirects, etc. to see if anything on the technical side may be hindering indexing?
-
It is a large e-commerce site with pretty much the exact situation described. We redid the site about six weeks ago, and the site before was always close to 100% indexed: about 17,900 out of 18,000.
-
Great answer, Donford. We have a large site with many items that are basically the same but usually have one different attribute value. So Google will typically index a parent page and list the rest as:
Results 1 - 15 of 15 – Medium Duty - Swivel Top Plate - Capacity to 400 lbs ...
So even though a page may not be in the primary index, it will still help the visitor get to what they are looking for. I would advise grabbing a snippet of text from a page that isn't indexed and using it as a query to see if this is the case.
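For example, with a made-up domain (the snippet is the one from the listing above):

    site:example.com "Medium Duty - Swivel Top Plate - Capacity to 400 lbs"

If the parent page comes back for that query, Google is consolidating the near-duplicate pages rather than losing them.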
-
Google will index more pages as it finds value in more links. The last e-commerce site I worked on had 12,000 pages; as of the end of the year, it was about 85% indexed.
It is quite common, in my experience, for larger sites to take a while to be fully indexed, if they ever are. Google's own documentation covers ensuring proper setup, but beyond what they say, it's all about content and uniqueness. That is a particular challenge for e-commerce sites that sell items which are similar in nature, like a 1/2" x 1" screw vs. a 5/8" x 1" screw: it's very hard to develop unique content for items that similar. (One rough approach is sketched below.)
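One rough way to chip away at that (a sketch of a common approach, not anything Google prescribes) is to generate page copy from the structured attributes you already have, so near-identical SKUs at least differ wherever their specs differ:

    # Hypothetical sketch: derive per-variant copy from structured attributes
    # so near-identical product pages don't share identical boilerplate.
    def describe(item):
        return ("This {diameter} x {length} {head} screw is rated for "
                "{material} and ships in boxes of {box_qty}.").format(**item)

    print(describe({"diameter": '1/2"', "length": '1"', "head": "hex-head",
                    "material": "hardwood", "box_qty": 100}))
    print(describe({"diameter": '5/8"', "length": '1"', "head": "hex-head",
                    "material": "hardwood", "box_qty": 50}))

Templating only goes so far, of course; the more real attributes you can surface (specs, use cases, compatibility), the more the pages genuinely diverge.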
Related Questions
-
Does redirecting a duplicate page NOT in Google's index pass link juice? (External links not showing in search console)
Hello! We have a powerful page that has been selected by Google as a duplicate of another page on the site. The duplicate is not indexed by Google, and the referring domains pointing to that page aren't recognized by Google in Search Console (when looking at the links report). My question is: if we 301 redirect the duplicate page to the one that Google has selected as canonical, will the link juice be passed to the new page? (A sketch of the redirect itself follows this question.) Thanks!
Intermediate & Advanced SEO | | Lewald10 -
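For what it's worth, the redirect itself is a one-liner on most stacks; for example, on an Apache host (paths and domain hypothetical):

    # .htaccess: send the un-indexed duplicate to Google's chosen canonical
    Redirect 301 /duplicate-page https://www.example.com/canonical-page

A 301 makes the consolidation explicit rather than leaving it to Google's own duplicate clustering.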
Google Search Console indexes website for www but images for non-www
In Google Search Console, the website data is all showing for www.promierproducts.com. The images, however, are indexed on the non-www version. I'm not sure why.
Intermediate & Advanced SEO | | MikeSab1 -
How do we decide which pages to index/de-index? Help for a 250k page site
At Siftery (siftery.com) we have about 250k pages, most of them reflected in our sitemap. Though after submitting a sitemap we started seeing an increase in the number of pages Google indexed, in the past few weeks progress has slowed to a crawl at about 80k pages, and in fact has been coming down very marginally. Due to the nature of the site, a lot of the pages likely look very similar to search engines. We've also broken our sitemap down into an index, so we know that most of the indexation problems are coming from a particular type of page (company profiles). Given these facts, what do you recommend we do? Should we de-index all of the pages that are not being picked up by the Google index (and are therefore likely seen as low quality)? There seems to be a school of thought that de-indexing "thin" pages improves the ranking potential of the indexed pages (the usual tag for that is shown after this question). We have plans for enriching and differentiating the pages that are being picked up as thin (Moz itself picks them up as 'duplicate' pages even though they're not). Thanks for sharing your thoughts and experiences!
Intermediate & Advanced SEO | | ggiaco-siftery0 -
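If the de-indexing route is taken, the usual mechanism is a robots meta tag on each thin page, for example:

    <meta name="robots" content="noindex, follow">

The noindex, follow combination drops the page from the index while still letting crawlers pass through its links; the pages must remain crawlable (not blocked in robots.txt) for Google to see the tag at all.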
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company; they have a main site on the top-level domain (TLD) and 400+ agency subdomains: company.com, agency1.company.com, agency2.company.com... I recently found that the web development team keep a demo domain per site, found on a subdomain of the original domain and mirroring it. The problem is that they have all been found and indexed by Google: demo.company.com, demo.agency1.company.com, demo.agency2.company.com... Obviously this is a problem, duplicate content and so on, so my question is: what is the best way to remove the demo domains/subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution; within this file I had intended to disallow all (see the note after this question on how noindex and robots.txt interact). The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain/subdomain into Google Webmaster and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | | iam-sold0 -
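One caveat on the plan described above: a blanket robots.txt disallow would stop Google from recrawling the demo pages, which means it would never see the new noindex tags, so the two measures work against each other. An alternative that covers every response on a demo subdomain at once is an X-Robots-Tag HTTP header, e.g. in the Apache config for the demo vhosts (setup hypothetical):

    # In each demo subdomain's vhost or .htaccess: noindex every response
    Header set X-Robots-Tag "noindex, nofollow"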
Sitemaps on the fly
Has anyone submitted pages that generate sitemaps on the fly, as opposed to only submitting static XML files, to Bing? For instance, sitemap.php vs. sitemap.xml, or videositemap.php vs. videositemap.xml? (A minimal dynamic endpoint is sketched after this question.)
Intermediate & Advanced SEO | | alhallinan0 -
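Search engines generally don't care whether the sitemap URL ends in .xml or .php, as long as the response is valid sitemap XML served with an XML content type. A minimal dynamic endpoint, sketched here in Python/Flask as a stand-in for sitemap.php (URLs hypothetical):

    # Minimal dynamic sitemap endpoint; Flask stands in for sitemap.php.
    from flask import Flask, Response

    app = Flask(__name__)

    @app.route("/sitemap.xml")
    def sitemap():
        urls = ["https://www.example.com/", "https://www.example.com/videos/"]
        body = "".join("<url><loc>{}</loc></url>".format(u) for u in urls)
        xml = ('<?xml version="1.0" encoding="UTF-8"?>'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
               + body + "</urlset>")
        return Response(xml, mimetype="application/xml")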
Google Places Listing Active In Two Separate Google Places Accounts?
Hi, is there any issue with having a Google Places listing in two separate Google Places accounts? For example, we have a client who cannot access their old Google Places account (an ex-employee had the login details, which they can't get back) and wants us to take control of the listing. If we click the "Is this your listing?" manage-this-page button and claim the listing, will this transfer the listing to our control? Or will it create a duplicate? Are there any problems with having the listing in separate accounts? Is it a situation in which the last person who manages the listing takes control, and the listing automatically deactivates from the old account? Do all the images remain as well? Thanks,
Tom
Intermediate & Advanced SEO | | MBASydney0 -
Indexing issue?
Hey guys, when I do a site:thetechblock.com query in Google, I don't seem to see any recent posts (nothing for August). In Google Webmaster I see that the site is being crawled (I think), but I'm not sure. I also see that the sitemaps are being indexed, but again it just seems really odd that I'm not seeing these posts in Google results. SEO seems all good in SEOmoz too. Is there something I'm not getting?
Intermediate & Advanced SEO | | ttb0 -
Googlebot vs. Googlebot-Mobile
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kind of stuck 😕 Situation: A client of mine has a webshop located on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has two "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Googlebot-Mobile. In the Default.asp (ASP classic) I test the user agent and redirect the visitor to either the main domain or the mobile subdomain (the logic is sketched after this question). All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees the Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kind of running out of options quickly, so if anyone has an idea as to how the BEEP I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
Intermediate & Advanced SEO | | ReneReinholdt0
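For reference, the Default.asp split described in the question boils down to logic like this (sketched in Python for readability; the token list and URLs are hypothetical):

    # Sketch of the user-agent split done in Default.asp.
    MOBILE_TOKENS = ("mobile", "android", "iphone", "ipod")

    def store_url(user_agent):
        ua = user_agent.lower()
        if any(token in ua for token in MOBILE_TOKENS):
            return "http://m.example.com/shop/"    # mobile store view
        return "http://www.example.com/shop/"      # desktop store view

Since rel=canonical and rel=alternate are plain head tags, the separate-mobile-URLs annotations that Google documents may be workable even in a locked-down CMS, provided the head allows per-page tags.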