Index dropped 20 pages at once since yesterday
-
Hi community,
I just realized that my indexed pages dropped from 95 to 75 and I don't know why. I made some changes to my title tags because we are launching our first product (before that it was just a blog). I made these changes a week ago and used Fetch as Google on the homepage and some subdomains.
Thanks for your help.
Kind regards
Marco
-
Excellent news - glad it has all returned
-Andy
-
Everything is back to normal.
Thanks again, Andy.
-
No problem at all
-
Andy, thanks so much for that piece of quality content!
-
I think this might explain your issue...
Gary Illyes @methode
Bad news: we might have a problem with reporting the number of indexed URLs in the SC Sitemaps feature. Good news: we're looking into it
Soooo, it looks like the number of indexed pages in Webmaster Tools (Search Console) is being reported incorrectly.
That would explain what you are seeing
-Andy
-
Thanks a lot Andy,
Maybe it has something to do with "Fetch as Google", which I did around a week ago on the main homepage, as I changed some major keywords (title tags) and moved them to other pages (blog categories etc.) when I added our first product/collection.
Never mind - time will tell.
Thanks again!
-
You aren't disturbing me, Marco.
It really isn't uncommon to see discrepancies like this. I see them every day! A 25% drop like this suggests to me that Google is perhaps doing a little bit of a reshuffling.
I would wait just a bit to see if the number of pages starts to increase as there isn't an awful lot else you can do - it sounds like it has all been done.
-Andy
-
That is what I did around 7 days ago, so before the drop in indexed pages.
-
Try to identify the pages that have the most links, use "Fetch as Google", and choose to re-crawl the URL.
-
Thanks for the tip - I will do that. But I don't understand what this has to do with the issue. At the moment all search engines show my pages ranking (so they are probably indexed). It's just Search Console that is giving me some wrong(?) information. I don't think Search Console would show de-indexing ahead of the Google SERPs.
Cheers
-
Did you also verify your site in Bing/Yahoo Webmaster Tools? If not, I suggest you do; you will be surprised how fast and effective Bing/Yahoo indexation is. Wait a couple of days and then perform a site:www.domain.com search, and you will see how their index differs from Google's.
-
Hi,
Thanks for your help. A "site:domain" check in Bing and Yahoo shows exactly the same results as Google, so all the pages are currently in the SERPs.
-
Do a similar check on Bing and Yahoo. They are much faster and more effective than Google; the problem is the market share that Google has.
-
Hi Andy,
sorry for disturbing, but I just did a check with "site:paleotogo.de" in google search and it found all pages.
In Search Console it tells me my blog has 54 out of 75 pages indexed (this is the sudden drop I mentioned earlier). But when I look at the blog sitemap itself and count the pages, all 75 are shown.
What is happening here? I really don't understand
Cheers, Marco
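(A quick way to sanity-check the discrepancy Marco describes is to parse the sitemap and count its <url> entries directly, then compare that number against the Search Console figure. This is only a minimal sketch; the sitemap location in the usage comment is an assumption, and real sitemaps may be split into index files.)

```python
# Cross-check Search Console's "indexed" count against the sitemap itself
# by counting <url> entries in a standard sitemaps.org-format document.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(xml_text: str) -> int:
    """Return the number of <url> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url", NS))

def fetch_sitemap(url: str) -> str:
    """Download the sitemap XML as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

# Usage (network call; the sitemap URL is an assumed location):
# print(count_sitemap_urls(fetch_sitemap("https://www.paleotogo.de/sitemap.xml")))
```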
-
No problem at all. Just update here if it hasn't rectified itself soon and we can take another look
-Andy
-
Thanks for your help, Andy - really appreciate it. Hope everything will turn out well.
Just realized I forgot to mention the page it is about: www.paleotogo.de
-
Ah sorry, I missed that bit Marco.
When Google drops pages from the index, it can be for a whole host of reasons. However, Google almost never indexes 100% of a site's pages. If you were at 95 indexed pages and are now at 75, that would suggest to me that Google has either lost some level of trust in those pages, or that you simply have to wait until the pages are re-crawled and Google has decided what to do with them again.
I would be tempted to wait a little, as over time you should see pages being re-indexed. If you have already re-submitted the sitemap, just make sure there is no problem with it: rebuild it and then re-submit, just to be sure.
-Andy
-
Hi Andy,
it's not about the drop of a specific keyword, it's about the drop of indexed pages going from 100% indexed to around 75% indexed.
Cheers
Marco
-
Hi Moosa,
Thanks for your help! I resubmitted the sitemap 2 hours ago and just checked again, but it still hasn't indexed all of the pages (20 missing again).
Strange.
-
Hi Marco,
It all depends on what you were tracking as phrases before, and now after you have made the changes. If you were tracking for the phrase "Red Sneakers" and changed the title to "Blue Sneakers" then you would expect to see a drop with your original phrase. Have you updated your tracking to compensate for the changes?
-Andy
-
Ideally, changing title tags alone should not cause pages to be dropped from the index. But if that is what you are seeing, you can always update your sitemap.xml and resubmit it to Google; the pages should be indexed again in the next crawl.
Hope this helps!