Is site: a reliable method for getting full list of indexed pages?
-
The site:domain.com search seems to show fewer pages than it used to (in both Google and Bing).
This isn't specific to one site; it happens on all sites. For example, I will get "page 1 of about 3,000 results", but by the time I've paged through the results it will end and change to "page 24 of 201 results". In that same example, if I look in GSC it shows 1,932 pages indexed.
Should I now accept that the page count reported by site: is an unreliable metric?
-
Keep in mind that for a site:domain.com search, Google now includes pages from OTHER SITES that are using the canonical tag to point to your site. So, even though it says there are 300 pages indexed, 30 of those pages might be on other sites that use a canonical tag pointing to your site. The number of pages indexed that you're looking at may not be entirely accurate because of this.
-
I just haven't noticed the page counts dropping, but I only use that operator for a general check. I have never gone through all the results pages, etc. For that I would use one of the crawler tools. It would be interesting to compare a download of the search results, GSC, and something like Screaming Frog to see what we see.
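A rough sketch of that three-way comparison in Python. The URL lists below are stand-ins for whatever you export from each tool (one URL per line); the normalization step is just an example of collapsing trivial variants:

```python
# Sketch: compare a crawl export (e.g. Screaming Frog) against a list of
# indexed URLs (e.g. from GSC). The example URLs are placeholders.

def normalize(url):
    """Collapse trivial variants so the same page matches across exports."""
    return url.strip().rstrip("/").lower()

crawled = {normalize(u) for u in [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",
]}
indexed_gsc = {normalize(u) for u in [
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/old-page",  # indexed but no longer linked internally
]}

not_indexed = crawled - indexed_gsc   # crawler found it, not in the index
orphaned = indexed_gsc - crawled      # indexed, but the crawl never reached it

print("crawled but not indexed:", sorted(not_indexed))
print("indexed but not crawled:", sorted(orphaned))
```

The two set differences are usually the interesting part of the audit: the first points at pages that may have quality or canonicalization issues, the second at orphaned or stale URLs.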
As soon as I wrote that, I checked our site and realized what you are saying. For Google we get "About 281 results," but as I go to the last page of results it changes to "page 13 of 126 results."
Then out of curiosity I tried Bing and now I am scratching my head: "763 results." When I go to the last possible page I get "247-256 of 256 results." I think that means my 281 results from Google are mostly on Bing!!!! (In case someone does not realize my humor, that last statement can be defined as either jest or sarcasm.)
So, when doing the site: search I get 126 with Google, but Search Console has 428...
Certainly interesting. I will keep playing with it.
Best
-
Hi Robert,
Thanks for your input.
The reason for doing it is that it's part of an SEO site review process: comparing the pages indexed in Google against a site crawl in a tool like Screaming Frog and the indexed pages reported in GSC.
Regarding the "page 24 of 201 results" example: when you first run the site:domain.com search, Google gives you an estimated number of results, e.g. 3,000, but as you click through the pages you find that the actual number of results is lower, sometimes significantly so.
-
I am not sure I understand where you say "...it will end and change to 'page 24 of 201 results.'" I have used the site: operator a long time and I think it is reasonably accurate. One thing I notice is the occasional "some pages have been ... duplicate" message asking if you want to see those results too. So, if you include all of those, what's the magic number?
Is there a reason you want data that demands an exact result? I am not sure of anything that would give you that. The question is what counts as "indexed" within the given search engine. If you crawl with Screaming Frog, etc., you may see pages that are not indexed, so the comparison is not apples to apples. Just curious why you need the exact number of indexed pages?
Interesting question.
-
Typically, the site: command in Google is unreliable. There are lots of reasons why, one being that there may be pages indexed that aren't "good enough", for whatever reason, to show up in the search results. When we look at the number of site pages indexed, we typically use the site: command, then click a few pages deep and look at the number it shows there (not the initial estimate on the first page of results).
For SEO auditing purposes, we're looking to see if there is a significant difference between the number of pages indexed and the number of pages that we find when we crawl the website ourselves.
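As a rough sketch of that check, here's one way to flag a "significant difference" between the two counts. The 20% threshold is an arbitrary example for illustration, not a standard, and the sample numbers are borrowed loosely from the counts mentioned earlier in this thread:

```python
# Quick audit sanity check: flag when the indexed count and the crawl
# count diverge by more than some threshold. The 20% cutoff is an
# arbitrary example value, not an industry standard.

def index_gap(crawled_count, indexed_count, threshold=0.2):
    """Return (ratio, flagged): flagged is True when the gap exceeds threshold."""
    if crawled_count == 0:
        raise ValueError("crawl found no pages")
    ratio = indexed_count / crawled_count
    return ratio, abs(1 - ratio) > threshold

ratio, flagged = index_gap(crawled_count=428, indexed_count=126)
print(f"indexed/crawled = {ratio:.2f}, significant gap: {flagged}")
```

A ratio well below 1 suggests indexation or quality problems; well above 1 can point to indexed URLs the crawl never reached (orphans, parameters, or other sites canonicalizing to you, as noted above).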