Why is this page de-indexed?
-
I have dropped out of the rankings for all my first-page keywords for this page:
https://www.key.co.uk/en/key/dollies-load-movers-door-skates
Can anyone see an issue?
I am trying to find one....
We did just migrate to HTTPS, but other areas have no problem.
-
Hi
Yes, there was an issue with my rank tracking software - phew
Thank you
-
OK great thanks for your help
I'll keep an eye on everything
-
You are 5th for 'Heavy Duty Dolly'.
I don't see what the problem is - the page is doing really well?
Regards
Nigel
-
This is not an issue.
It's totally normal to have some HTTP pages left in the index, and even more common if the migration is recent. Don't be afraid of this, Becky.
-
Yep, give Google a little time to re-crawl the whole new site. I'd allow nearly a month before considering that Google has fully seen the new version of the site, always checking the number of indexed pages in GSC and the results appearing for a site: search.
Being out of the top 100 is a clue that you are in the middle of the transition. And for the keyword 'Heavy Duty Dolly' I do see your page. Check the attached image.
Best of luck.
GR.
-
Hi Becky
I just searched in a normal browser so it could be Google skewing the results for you.
For indexed pages
site:key.co.uk inurl:http:
Regards
Nigel
-
How did you find these http pages?
I did a search in Incognito, but I couldn't see anything myself.
I'll try again, thanks!
-
Hi
Thanks for this. Yes, I've checked in Google Search Console; I can find the page in the indexed pages, but the indexed page counts are a lot lower since the migration:
HTTP - indexed: 13,013 / blocked: 12,891
HTTPS - indexed: 2,814 / blocked by robots.txt: 5,713
Do I just wait?
One keyword example for that page would be 'Heavy Duty Dolly' & 'load moving dolly'.
We were position 1; now we're out of the top 100.
We're working on page speed/load time for the whole site, but why would it affect that one page so badly?
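Given that 5,713 HTTPS URLs are reported as blocked by robots.txt, one quick check is whether the post-migration robots.txt accidentally disallows the affected page. A sketch using Python's standard urllib.robotparser; the robots.txt content below is a hypothetical illustration, not Key's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for illustration -- substitute the live file
# fetched from https://www.key.co.uk/robots.txt
robots_txt = """User-agent: *
Disallow: /en/search
Disallow: /checkout
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

page = "https://www.key.co.uk/en/key/dollies-load-movers-door-skates"
blocked_page = "https://www.key.co.uk/en/search?q=dolly"

# The dropped category page is not matched by any Disallow prefix,
# while the internal search URL is
print(rp.can_fetch("Googlebot", page))          # True
print(rp.can_fetch("Googlebot", blocked_page))  # False
```

Running the same check against the live file for a sample of the URLs GSC reports as blocked would show whether the blocking is intentional.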
-
Hi Becky,
Without knowing the relevant search terms, there's almost no analysis to be done.
I've noticed that the page took a very long time to load; here's a GTmetrix report. Remember that migrating to HTTPS makes Google re-crawl all your website's pages and re-evaluate all ranking factors.
My advice is to wait a little longer; it might take a few weeks. Also, keep monitoring the Google Search Console profile, as there may be a message there. Take a look at the indexed pages too; it may be that fewer pages are indexed now than before the migration.
Hope I've helped.
Best of luck.
GR.
-
Hi Becky
Load Movers - Pos 3
Wooden dollies - Pos 1
Maybe open an incognito browser with history cleared.
I don't see a problem
Regards
Nigel
PS You still have 748 HTTP pages indexed, but it's only 10% of the total.
Related Questions
-
Indexed Pages Different when I perform a "site:Google.com" site search - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameters blocked from the spiders through the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:website" search in Google, it says Google is indexing 2.2 million pages for the URL, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 results? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
Dynamic pages
Hello Team, how can we create dynamic pages, or add more pages to the website, while maintaining SEO standards?
Intermediate & Advanced SEO | Obbserv
-
Date of page first indexed or age of a page?
Hi, does anyone know any ways or tools to find out when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date. Maybe this has changed since; I don't remember the plugin. Any recommendations on finding the age of a page (not the domain) for a website? This is for competitor research, not my own website. Cheers, Paul
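One way to approximate a page's age (note: this gives the first-archived date, not Google's first-indexed date, but it is often a usable lower bound) is the Internet Archive's Wayback Machine availability API. A hedged sketch, assuming the API's documented JSON shape; the sample response below is made up for illustration:

```python
import json
import urllib.request
from datetime import datetime

WAYBACK_API = "https://archive.org/wayback/available"

def earliest_snapshot_date(api_response: dict):
    """Extract the date of the closest archived snapshot from a
    Wayback availability API response, or None if never archived."""
    closest = api_response.get("archived_snapshots", {}).get("closest")
    if not closest or not closest.get("available"):
        return None
    # Wayback timestamps are formatted YYYYMMDDhhmmss
    return datetime.strptime(closest["timestamp"], "%Y%m%d%H%M%S")

def page_first_archived(url: str):
    # Asking for a very old timestamp makes "closest" the earliest capture
    query = f"{WAYBACK_API}?url={url}&timestamp=19960101"
    with urllib.request.urlopen(query) as resp:
        return earliest_snapshot_date(json.load(resp))

# Made-up response (abridged) showing the expected shape:
sample = {"archived_snapshots": {"closest": {
    "available": True, "status": "200", "timestamp": "20090314120000"}}}
print(earliest_snapshot_date(sample))  # 2009-03-14 12:00:00
```

Calling `page_first_archived("example.com/some-page")` performs the live lookup for a competitor URL.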
Intermediate & Advanced SEO | MBASydney
-
I've seen and heard a lot about city-specific landing pages for businesses with multiple locations, but what about city-specific landing pages for nearby cities that you aren't actually located in? Is it OK to create landing pages for nearby cities?
I asked here https://www.google.com/moderator/#7/e=adbf4 but figured I'd ask the Moz Community also! Is it actually best practice to create landing pages for nearby cities if you don't have an actual address there? Even if your target customers are there? For example, if I am in Miami but have a lot of customers who come from nearby cities like Fort Lauderdale, is it okay to create those LPs? I've heard this described as best practice, but I'm beginning to question whether Google sees it that way.
Intermediate & Advanced SEO | RickyShockley
-
Google Indexed my Site then De-indexed a Week After
Hi there, I'm working on getting a large e-commerce website indexed and I am having a lot of trouble.
The site is www.consumerbase.com. We have about 130,000 pages and only 25,000 are getting indexed. I use multiple sitemaps so I can tell which product pages are indexed, and we need our "Mailing List" pages the most - http://www.consumerbase.com/mailing-lists/cigar-smoking-enthusiasts-mailing-list.html
I submitted a sitemap a few weeks ago of a particular type of product page, and about 40k of 43k of the pages were indexed - GREAT! A week ago Google de-indexed almost all of those new pages. Check out this image; it kind of boggles my mind and makes me sad: http://screencast.com/t/GivYGYRrOV
While these pages were indexed, we immediately received a ton of traffic to them, making me think Google liked them. I think our breadcrumbs, site structure, and "customers who viewed this product also viewed" links would make the site extremely crawlable. What gives?
Does it come down to our site not having enough Domain Authority?
My client really needs an answer about how we are going to get these pages indexed.
Intermediate & Advanced SEO | Travis-W
How to make Google include our recipe pages in its main index?
We have developed a recipe search engine, www.edamam.com, and serve the content of 500+ food bloggers and major recipe websites. Our legal obligations do not allow us to show the actual recipe preparation info (i.e. the most valuable part of the content); we can only show a few images, the ingredients, and nutrition information. Most of the unique content goes to the source/blog. By submitting XML sitemaps in GWT we now have around 500K pages indexed; however, only a few hundred appear in Google's main index, and we are looking for a solution to include all of them. Also good to know: it appears that all our top competitors are in exactly the same situation, so it is a challenging question. Any ideas will be highly appreciated! Thanks, Lily
Intermediate & Advanced SEO | edamam
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because: (1) Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769 (2) bad user experience; (3) the search pages are (probably) stealing rankings from our real landing pages; and (4) the Webmaster Notification "Googlebot found an extremely high number of URLs on your site" with links to our internal search results. I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
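For reference, a minimal sketch of the tag under discussion, as it would sit in the head of each internal search page (the page must remain crawlable, i.e. not blocked in robots.txt, or Googlebot never re-fetches it and never sees the tag):

```html
<!-- On every internal search result page: drop it from the index,
     but keep passing link equity through its links -->
<meta name="robots" content="noindex, follow">
```

By contrast, a robots.txt `Disallow:` line would stop crawling going forward but cannot remove the several million already-indexed URLs, which is the core argument for the meta tag in this situation.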
Intermediate & Advanced SEO | HrThomsen
-
Should the sitemap include just menu pages or all pages site wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. Redoing a few things and we'll need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and that's it, but post-Panda, should I do anything different?
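A common pattern here is to include every canonical, indexable URL but split them into segmented sitemaps under one sitemap index, so that index coverage can be tracked per section. A minimal sketch using Python's standard library; the file names and example.com URLs are hypothetical:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(urls, filename):
    """Write one <urlset> sitemap for a segment of the site."""
    root = ET.Element("urlset", xmlns=NS)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(root, "url"), "loc")
        loc.text = url
    ET.ElementTree(root).write(filename, encoding="utf-8", xml_declaration=True)

def write_sitemap_index(sitemap_urls, filename):
    """Write the sitemap index that points at each segment file."""
    root = ET.Element("sitemapindex", xmlns=NS)
    for sm in sitemap_urls:
        loc = ET.SubElement(ET.SubElement(root, "sitemap"), "loc")
        loc.text = sm
    ET.ElementTree(root).write(filename, encoding="utf-8", xml_declaration=True)

# Hypothetical segments: the 10 menu pages and 4,000 content pages,
# tracked separately in Search Console
write_sitemap([f"https://example.com/menu/{i}" for i in range(10)],
              "sitemap-menu.xml")
write_sitemap([f"https://example.com/content/{i}" for i in range(4000)],
              "sitemap-content.xml")
write_sitemap_index(["https://example.com/sitemap-menu.xml",
                     "https://example.com/sitemap-content.xml"],
                    "sitemap-index.xml")
```

Submitting the index file to Search Console then reports indexed counts per segment, which makes a post-Panda coverage problem visible per section rather than site-wide.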
Intermediate & Advanced SEO | EricPacifico