Page drops from index completely
-
We have a page that is ranking organically at #1, but over the past couple of months it has twice dropped out of the results for a search term entirely. There don't appear to be any issues with the page in Search Console, and submitting the page at https://www.google.com/webmasters/tools/submit-url seems to fix the issue.
The search term we're tracking appears in the page's URL and is also the h1 of the page.
Here is a screenshot of the ranking over the past few months: https://jmp.sh/akvaKGF
What could cause this to happen? Nothing in Search Console indicates a problem with the page. The last time this happened, the page dropped completely for all search terms and reappeared after we submitted the URL to Google manually. This time it dropped for just one search term and reappeared the next day, again after manually submitting the page.
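One sanity check worth automating in a case like this, since Search Console won't always surface an intermittently served noindex: inspect both the page HTML and the HTTP response headers for noindex directives. A minimal sketch (the helper name and inputs are my own; it assumes you've already fetched the page and its headers):

```python
import re

# Hypothetical helper: given a page's HTML and its HTTP response headers,
# report any noindex directives that could explain a sudden de-indexing.
def find_noindex_signals(html: str, headers: dict) -> list:
    signals = []
    # The X-Robots-Tag header can carry noindex even when the HTML looks fine
    robots_header = headers.get("X-Robots-Tag", "")
    if "noindex" in robots_header.lower():
        signals.append("X-Robots-Tag header")
    # <meta name="robots" content="noindex"> in the document head
    for match in re.finditer(
        r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE
    ):
        if "noindex" in match.group(0).lower():
            signals.append("meta robots tag")
    return signals
```

Running this against the live page on a schedule would at least rule out an on-page or header-level cause for the drops.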
-
I had a similar issue on a couple of free WordPress subdomains I set up while doing reputation management for clients. The sites would index immediately and then, 24 hours later, be ghosted completely.
It turned out I was submitting the news sitemap that WordPress automatically generated. Since I wasn't on Google's list of approved news publishers, I assume it just ripped everything out; the news sitemap and the regular one presumably listed the same pages, just with more detail in the news one.
I doubt it's exactly the same situation, but if you recently submitted a sitemap, I'd check it closely, as it has been known to trigger a similar problem, at least for me!
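For anyone wanting to verify this before submitting, a quick way to spot the news extension in a generated sitemap is to look for the Google News XML namespace (`http://www.google.com/schemas/sitemap-news/0.9`). A rough sketch using Python's standard library (the function name is mine, and it assumes the sitemap XML is already fetched as a string):

```python
import xml.etree.ElementTree as ET

NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"

def is_news_sitemap(sitemap_xml: str) -> bool:
    root = ET.fromstring(sitemap_xml)
    # ElementTree expands namespaced tags to {namespace}tag, so any
    # news:* element anywhere in the tree reveals the news extension.
    return any(elem.tag.startswith("{" + NEWS_NS + "}") for elem in root.iter())
```

If this returns True and you're not an approved news publisher, that's a sitemap you probably don't want to submit.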
-
Thanks, Nigel. Your responses are actually quite helpful pointers. There's a possibility Google is flagging it as duplicate content, as the content on this page is a bit sparse. We have two posts: the first is a "What is this type of document you need" post, and the second links to a template for that document. The template post is the one that has dropped twice. Here is the search we're dropping from occasionally. Interestingly enough, Google is indexing the public Google Doc our page points to and including that in search results.
Excuse the bitly links, just trying to avoid the search terms showing up for others to find.
To answer your questions directly:
- Google seems to be respecting canonicals
- Page is in sitemap
- Perhaps too much repetition? Maybe we should expand the content a bit
- This may well have happened as we have seen a few sites "republish" some of our content.
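The first two items in that list are easy to verify in code once you've fetched the page HTML and the sitemap. A hypothetical sketch (the function name and regex are illustrative, not a robust HTML parser, and it assumes the `rel` attribute comes before `href`):

```python
import re
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical diagnostic: does the page declare the canonical URL we
# expect, and is that URL actually listed in the sitemap?
def check_canonical_and_sitemap(html: str, sitemap_xml: str, expected_url: str) -> dict:
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    canonical = match.group(1) if match else None
    root = ET.fromstring(sitemap_xml)
    sitemap_urls = {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}
    return {
        "canonical_matches": canonical == expected_url,
        "in_sitemap": expected_url in sitemap_urls,
    }
```

Note this only confirms what the page and sitemap declare; whether Google *respects* the canonical still has to be checked in Search Console's URL Inspection tool.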
-
Hi Russell
I would need to see the URL, but it looks like a duplicate content problem. Have you recently written a blog post with a very similar title?
Is Google respecting your canonicals?
Is the page in your sitemap?
Is it over-optimised? Too much repetition of the main keyword?
Has someone stolen all of the content, creating cross-site duplication? There isn't a lot to go on, but I agree it's very unusual!
Regards Nigel
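On that last question, a crude way to gauge whether another site has lifted your copy is word-shingle overlap between the two pages' extracted text. A sketch with made-up parameters (this is a triage heuristic of my own, not how Google measures duplication, and it assumes the plain text has already been extracted from both pages):

```python
# Rough sketch: estimate cross-site duplication via word-shingle overlap.
# The shingle size and any threshold you apply to the score are illustrative.
def shingle_overlap(text_a: str, text_b: str, size: int = 5) -> float:
    def shingles(text: str) -> set:
        words = text.lower().split()
        return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    # Normalise by the smaller set so a short page copied wholesale
    # into a longer one still scores near 1.0
    return len(a & b) / min(len(a), len(b))
```

A score near 1.0 against a third-party page would be a strong hint that the "republished" copies are worth chasing down.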