Organic search traffic dropped 40% - what am I missing?
-
Have a client (an ecommerce site with 1,000+ pages) who recently switched to OpenCart from another cart. Their organic search traffic (from Google, Yahoo, and Bing) dropped roughly 40%. Unfortunately, we weren't involved with the site before the switch, so we can only rely on the Wayback Machine to compare the previous site to the present one.
I've checked all the common causes of traffic drops and so far I mostly know what's probably not causing the issue. Any suggestions?
- Some URLs stayed the same and the rest 301 redirect (note that many of the pages returned 404s until a couple of weeks after the switch, when the client implemented more 301 redirects)
- They've got an XML sitemap and are well-indexed.
- The traffic drops hit pretty much across the site; they are not specific to a few pages.
- The traffic drops are not specific to any one country or language.
- Traffic drops hit mobile, tablet, and desktop
- I've done a full site crawl; it found only one 404 page and no other significant issues.
- The crawl didn't find any pages blocked by nofollow, noindex, or robots.txt
- Canonical URLs are good
- Site has about 20K pages indexed
- They have some bad backlinks, but I don't think it's backlink-related because Google, Yahoo, and Bing have all dropped.
- I'm comparing on-page optimization for select pages before and after, and not finding a lot of differences.
- It does appear that they implemented Schema.org when they launched the new site.
- Page load speed is good
I feel there must be a pretty basic issue here for Google, Yahoo, and Bing to all drop off, but so far I haven't found it. What am I missing?
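Since the 301s were only completed a couple of weeks after launch, it may be worth systematically bucketing the old URLs by status rather than spot-checking. A minimal sketch, assuming your crawler can export rows of (old URL, status code, redirect target) — the format here is hypothetical, so adapt it to whatever your crawl tool actually produces:

```python
# Minimal sketch: bucket old-site URLs from a crawl export by HTTP status.
# Assumes rows of (old_url, status_code, redirect_target) -- adapt to
# whatever your crawler actually exports.

def audit_redirects(rows):
    buckets = {"301": [], "302": [], "404": [], "other": []}
    for old_url, status, target in rows:
        if status == 301:
            buckets["301"].append((old_url, target))
        elif status == 302:
            # Temporary redirects pass less signal; these should be 301s.
            buckets["302"].append((old_url, target))
        elif status == 404:
            buckets["404"].append(old_url)
        else:
            buckets["other"].append((old_url, status))
    return buckets

rows = [
    ("/old-product-1", 301, "/new-product-1"),
    ("/old-product-2", 404, None),
    ("/old-category", 302, "/new-category"),
]
report = audit_redirects(rows)
print(len(report["404"]))  # pages still dead after the migration
```

Anything landing in the 404 or 302 buckets is a candidate explanation for pages that lost rankings during the gap.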
-
Hi Adam,
Not to point out something that's likely well taken care of, but did the GA / analytics code populate across the site?
Also, is there any heavy JavaScript on the site, especially above the analytics code, that might prevent it from loading properly? We had this happen with a client a few years ago. We had built custom analytics for them (they didn't want to run GA), and they placed our code in the footer and a slow-loading CRO script in the header. Since JavaScript generally executes in the order it appears on the page, the CRO script took so long to load that visitors had often clicked away from their landing page before our code had had a chance to record the visit. We had them move our little snippet up to the top of the page and the problem was solved (in the meantime, we had been recording a 20,000-visit loss each week!).
I'm just wondering if this is a tracking issue, since all search traffic has been affected, not just Google. It would be quite rare for one issue to have the same effect at the same time on both Bing's and Google's algorithms. They're similar, but they're not identical, and Bing generally takes longer to respond to changes than Google does.
Any chance you have raw server logs to compare analytics stats to?
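If you do have raw logs, a quick tally of daily non-bot pageviews straight from the access log gives you a number to hold up against the analytics graph. A rough sketch, assuming the common Apache/nginx "combined" log format — the regex and bot list below are illustrative, not exhaustive:

```python
import re
from collections import Counter

# Minimal sketch: tally daily human-ish pageviews straight from access logs
# so they can be compared against what analytics reports. Assumes the
# "combined" log format; regex and bot list are illustrative only.

LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "[A-Z]+ \S+ [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)
BOTS = ("Googlebot", "bingbot", "Slurp", "AhrefsBot")

def daily_counts(lines):
    counts = Counter()
    for line in lines:
        m = LINE.match(line)
        if not m or m.group("status") != "200":
            continue
        if any(bot in m.group("ua") for bot in BOTS):
            continue  # skip crawler hits; analytics wouldn't count them either
        counts[m.group("day")] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/Oct/2014:13:55:36 -0700] "GET /product-1 HTTP/1.1" 200 2326 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [10/Oct/2014:14:01:02 -0700] "GET /product-1 HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]
print(daily_counts(sample))
```

If the log-derived counts don't show the 40% drop that analytics does, you're looking at a tracking problem, not a rankings problem.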
-
I don't see anything that I would think would trigger that. Let me PM you the URL.
-
Did the layout of the header area change significantly? If, for instance, the header went from a tenth of the above-the-fold area to a third, that might run the entire site afoul of the "top-heavy" part of Panda.
-
Thanks for the suggestions!
-
The homepage, category, and product pages have all lost traffic.
-
So far, I haven't found any noteworthy changes in content.
-
I've been wondering if this might be part of the issue.
-
I've reviewed Majestic link data, and only see a few deleted backlinks, so I'm thinking it's not a backlink issue.
-
Thanks for the suggestion. So far the only significant difference in optimization I've found has been that they added Schema.org markup.
-
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before. Perhaps images aren't being loaded into the pages in a way that Google can see them, whereas they were before, and Panda now thinks the entire site is less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, have new URLs, etc. I would expect the link juice calculation is iterative, which would be why it takes a number of iterations of the PageRank calculation for entirely new URLs to "get" all the link juice they should have.
- Their backlinks were moderately dependent upon a set of link networks, and those link networks have shut down all their sites (so that neither Google nor Bing still see the links from them).
Those are the ideas that come to mind so far.
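The "settling out" idea in the third bullet can be seen in a toy power-iteration PageRank: a brand-new URL only accumulates its full score over repeated passes over the link graph. A rough sketch with a hypothetical three-page mini-site (this is the textbook algorithm, not Google's actual implementation):

```python
# Toy power-iteration PageRank over a tiny link graph, just to illustrate
# why scores on brand-new URLs take several iterations to settle.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start everyone equal
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # split this page's rank
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

# Hypothetical mini-site: home links to two products, products link home.
links = {
    "home": ["product-a", "product-b"],
    "product-a": ["home"],
    "product-b": ["home"],
}
ranks = pagerank(links)
print(ranks["home"])  # stabilizes only after enough iterations
```

With fewer iterations the scores are visibly off their final values, which is the intuition behind new URLs taking weeks to "get" their full link juice.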
-
Did the new cart generate product pages that were differently optimized than the old cart? (if cart-generated product pages were used)
Related Questions
-
Load Balancer issues on Search Console
The top linked domains in Search Console are coming from our load balancer setup. Does anyone know how to remove these as unique sites pointing back to our primary domain? I was told Google is smart enough to ignore these as duplicate domains, but if that were the case, why would they be listed as the top linked domains in Search Console? Most concerned....
-
404 vs 410 Across Search Engines
We are removing a large number of URLs permanently. We care about rankings in search engines other than Google, such as Yahoo/Bing, whose documentation doesn't even list the HTTP 410 status code as an option: https://docs.microsoft.com/en-us/bingmaps/spatial-data-services/status-codes-and-error-handling Does anyone know how search engines other than Google handle 410 vs 404? For pages being permanently removed, John Mueller at Google has stated: "From our point of view, in the mid term/long term, a 404 is the same as a 410 for us. So in both of these cases, we drop those URLs from our index. We generally reduce crawling a little bit of those URLs so that we don't spend too much time crawling things that we know don't exist. The subtle difference here is that a 410 will sometimes fall out a little bit faster than a 404. But usually, we're talking on the order of a couple days or so. So if you're just removing content naturally, then that's perfectly fine to use either one." Any information or thoughts? Thanks
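Whichever code the engines process faster, the mechanics of serving it are simple. For stacks where you can't set it in the web server config, a minimal application-level sketch — `REMOVED` is a hypothetical set loaded from wherever you track deleted products:

```python
# Minimal sketch: pick the status for permanently removed URLs.
# REMOVED is a hypothetical set loaded from wherever deletions are tracked.
REMOVED = {"/discontinued-widget", "/old-sale-page"}

def status_for(path):
    if path in REMOVED:
        return 410  # Gone: removed on purpose, not coming back
    return 200      # (real code would fall through to normal routing)

print(status_for("/discontinued-widget"))
```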
-
Site Migration and Traffic Help!
Hi Moz, I recently migrated my website with the help of an SEO company, using 301 redirects. The reason for the move was to change our CMS from .aspx to Drupal/WordPress. The homepage (www.shiftins.com) and the blog (www.shiftins.com/blog) were the only two pages that kept the same URL; everything else was redirected. It's been about two months since the redirects were completed, and traffic has dropped off about 90%. I'm starting to worry that something was not done properly and my traffic may never return. The redirect process seems correct when I check the SEO company's work: all pages were duplicated, redirected to individual pages, then the old pages were de-indexed. Are there any insights the community can provide? Please help!
-
How to measure traffic for a keyword
Sitting in Country A, I want to see how much traffic a particular keyword receives in Country B. What's the best way to do it? Also, will the search results differ if I run the analysis from Country A vis-à-vis Country B? In other words, will the IP of the country I am searching from play a role in the results?
-
Maintaining SEO with Ecommerce Search Refinement
Hey everyone, I have an interesting scenario I'd appreciate some feedback on. I'm working on restructuring a client site for a store design, and he had previously built a bunch of landing pages mostly for SEO value; some of them aren't even accessible from the main nav and contain a lot of long-tail type targets. These pages are generating organic traffic, but the whole setup isn't very user-friendly, because it's cumbersome to drill down into specific categories (which many of the landing pages cover) without going through three or four pages to get there. For example, if I want to buy orange shoes, I can see specific kinds of orange shoes but not ALL the orange shoes, even though there is an SEO page for orange shoes that is otherwise inaccessible from the main navigation. If that wasn't too confusing: the usability solution is to implement search refinement so that the specific subcategories can be reached in fewer steps. My issue is that I'm hesitant to implement this, even though I know it would benefit the site overall, because of these existing SEO pages and the organic traffic they're already receiving. My plan was to make sure the specific category pages are built with the necessary keywords and content to attract those organic visits, but I'm still nervous it might not be enough. Does anyone have suggestions for this circumstance, or more generally for maximizing SEO on a site with search refinement and minimizing loss? From a usability standpoint search refinement is great, but how do you counter the significant SEO risks that come with it? Thanks for your help!
-
Sitemap Folders on Search Results
Hello! We are managing the SEO campaign of a video website, and we have an issue with sitemap folders. Our sitemaps live at /xml/sitemap-name.xml, but Google is indexing the /xml/ folder and the sitemaps themselves, and they appear in search results. If I add Disallow: /xml/ to my robots.txt and remove the /xml/ folder from Webmaster Tools, will Google still be able to see my sitemaps, or will it ignore them? Will my site be affected negatively after removing the /xml/ folder completely from search results? What should I do?
-
Block search bots on staging server
I want to block bots from all of our client sites on our staging server. Since robots.txt files can easily be copied over when moving a site to production, how can I block bots/crawlers from our staging server (at the server level), but still allow our clients to see/preview their site before launch?
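One common server-level approach is HTTP Basic Auth on the staging vhost: crawlers can't get past the password prompt, clients get a shared username/password, and nothing depends on a robots.txt file that might get copied to production. A rough nginx sketch — the hostname and paths are placeholders, and the Apache equivalent (`AuthType Basic` plus `Require valid-user`) works the same way:

```nginx
server {
    listen 80;
    server_name staging.example.com;   # placeholder hostname

    # Crawlers can't authenticate; clients use a shared login.
    auth_basic           "Staging preview";
    auth_basic_user_file /etc/nginx/staging.htpasswd;

    # Belt and braces: tell engines not to index anything that slips through.
    add_header X-Robots-Tag "noindex, nofollow" always;

    root /var/www/staging;
}
```

Since this lives in the staging server's config rather than in the site files, it can't accidentally travel to production during a launch.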
-
High search volume keywords
The problem is that our index page no longer appears in the SERPs for the high-volume keywords (Pfizer, Roche, Johnson & Johnson).
We still keep these keywords in the title, but that isn't bringing many results. We made a page at www.domain.com/pfizer and added Pfizer products with unique descriptions. The product pages have started to drive visitors, but not the www.domain.com/pfizer page itself. If we add a blog to the top of this page with unique posts about Pfizer company news, would that help? In that case the page would be unique, refreshed with new info, and have rotating Pfizer products. Maybe some other suggestions?