Sudden dramatic drops in SERPs along with no snippet and no cached page?
-
We are a very stable, time-tested domain (over 15 years old) with thousands of stable, time-tested inbound links. We are a large catalog/e-commerce business, and our web team has over a decade's experience with coding, SEO, etc. We do not engage in link exchanges, buying links, etc., and we adhere strictly to white-hat SEO best practices.
Our SERPs have generally been very stable for years. We continually update content, leverage user-generated content, and stay abreast of important algorithm and policy changes on Google's end.
On Wednesday Jan 18th, we noticed dramatic, disturbing changes to our SERPs. Our formerly very stable positions for thousands of core keywords dropped. In addition, there is no snippet in the SERPs and no cached page for these results.
Webmaster Tools shows our sitemap was most recently successfully downloaded by Google on Jan 14th. Over the weekend and on Monday the 16th, our cloud-hosted site experienced intermittent downtime. I suspect the sudden issues we are seeing are caused by one of three possibilities:
1. Google came to crawl while the site was unavailable. However, there are no messages in the account or crawl issues otherwise noted to indicate this.
2. There is malicious link spam or another attack on our site.
3. In the last week of December 2011, we went live with Schema.org rich tagging on product-level pages. The testing tool validates everything but the breadcrumb, which it says is not supported by Schema. Could Google dislike our Schema.org microtagging and be penalizing us? I sort of doubt it, because category/subcategory pages that have no such tags are among those suffering.
What's odd is that ever since we went live with Schema.org, Google has started preferring very thin content pages, like video pages and articles, over our product pages. This never happened in the past.
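For context, our product-level tagging follows the usual schema.org microdata pattern, roughly like the sketch below. The product name, image, and price here are invented for illustration; this is not our actual markup.

```html
<!-- Illustrative Product/Offer microdata; all values are made up. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Marine Epoxy</span>
  <img itemprop="image" src="/images/example-epoxy.jpg" alt="Example Marine Epoxy">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">$24.99</span>
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```

The breadcrumb trail is the one piece the testing tool flags, since it doesn't map to a supported schema.org type.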
the site is: www.jamestowndistributors.com
Any help or ideas are greatly, greatly appreciated.
Thank You
DMG
-
Point taken on the multiple H1s on the home page. Good call.
Digging deeper into the matter, it appears a backup version of the site was up over the weekend and on Monday. It is possible that a different robots.txt file was live.
Looking at crawl results for 1/16/2012, Google is showing some really strange things. Thousands of allowed, regularly crawled, indexed pages are listed as 'URL restricted by robots.txt' for 1/16/2012.
We immediately verified that the latest robots.txt file (the one that should be up) allows these pages, and it does, so we made sure that is the one in use. It may or may not have been over the weekend.
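For anyone debugging something similar: Python's standard-library `urllib.robotparser` can tell you whether a given robots.txt would block a path for Googlebot. The rules and paths below are illustrative placeholders, not our actual files.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical backup robots.txt that blocks everything.
backup_rules = """User-agent: *
Disallow: /
"""

# Hypothetical intended robots.txt that only blocks checkout.
intended_rules = """User-agent: *
Disallow: /checkout/
"""

def can_crawl(rules: str, path: str, agent: str = "Googlebot") -> bool:
    """Parse a robots.txt body and ask whether `agent` may fetch `path`."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, path)

print(can_crawl(backup_rules, "/products/widget"))    # False: backup blocks everything
print(can_crawl(intended_rules, "/products/widget"))  # True: intended file allows it
```

Running the live robots.txt through a check like this for a handful of affected URLs is a quick way to confirm the current file really does allow them.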
I resubmitted both regular and mobile sitemaps last night and both are showing with a checkbox today.
Hopefully this all sorts itself out in a few days. We certainly can't afford any meaningful long-term issues with our natural search visibility.
Thank you.
Of course, I welcome additional feedback.
DMG
-
From looking at the site, there are SEO best practices not being followed, but I don't think that is the reason for the drop (you have multiple H1 tags on the home page; there should be only one).
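A quick way to check the H1 issue yourself is a minimal sketch with Python's standard `html.parser`; the sample markup below is invented, not the actual home page.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> tags encountered in an HTML document."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

# Illustrative markup with the multiple-H1 problem described above.
sample = """
<html><body>
  <h1>Welcome</h1>
  <h1>Featured Products</h1>
  <h2>News</h2>
</body></html>
"""

counter = H1Counter()
counter.feed(sample)
print(counter.h1_count)  # 2 -> more than one H1 on the page
```

Anything above 1 for a given page is worth consolidating into a single H1.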
The Schema code shouldn't impact the site in any negative way either.
Even if someone was link spamming your site, Google wouldn't count that against you; otherwise, SEO would just be about link bombing your competition!
If Google came to crawl the site and nothing was there, it might also have been the deep crawl, and that might account for the missing snippets.
The key here is not to overreact. It sounds like you have everything in order and might have just been a victim of bad timing.
I know it's hard to be patient, but give it a week to see if the snippets return once the fresh bot has hit your site. You might also want to send a message to Google through your Webmaster Tools account; they are pretty responsive.
Good luck!