Leaking organic traffic - how to debug?
-
Hi all,
We've been running an eCommerce marketplace for more than 2.5 years now. Most of our traffic and revenue has come from organic search, which had been growing steadily along with our inventory and brand, peaking in March of this year. Since then, we've been losing organic traffic (and revenue) each month at a rate of about 15-20% - for no reason we can understand. In addition, some of our older pages no longer appear in search results (unless we add the name of the site to the search query).
We launched a redesign at the end of May, which initially seemed to improve engagement but didn't affect this downward trend in organic traffic.
Webmaster Tools doesn't show anything special - if anything, we've made an effort to clean up every 404 that appears there, along with other small issues.
We did make the following changes very recently, but they do not seem to have had a positive effect (so far):
- We have deep pagination for some categories of the site, and we just added rel="prev"/rel="next" tags in the head of every paginated series on the site.
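For reference, the markup we added looks something like this on a middle page of a series (the URL pattern below is a hypothetical placeholder, not our actual category path):

```html
<!-- Example <head> markup for page 2 of a hypothetical paginated category -->
<link rel="prev" href="https://www.example.com/category/widgets?page=1">
<link rel="next" href="https://www.example.com/category/widgets?page=3">
```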
- We started generating a dynamic sitemap and submitted it to Google. For some reason, only about a fourth of the pages in the sitemap are indexed.
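For anyone unfamiliar with the format, a minimal entry in an XML sitemap looks roughly like this (the URL and values are placeholders, not our real data):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per canonical page you want crawled -->
    <loc>https://www.example.com/category/widgets</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```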
In addition, the "Index Status" report in Webmaster Tools shows some weird numbers. First, the number there is much bigger than the number of pages we actually have - possibly all the combinations of our listing categories and pagination. That number was constant for a while, before taking a dip earlier this year, rising back up, and then declining again over the last couple of months. (Screenshot of the graph attached.)
What would be the first steps you'd take to understand the core of the problem? We're really at a loss here.
-
Awesome, Dan - many of the points you raise are things we are looking at, and the video was spot on. Many thanks for the effort and wisdom!
-
Hey Adam,
The best place to start is really to diagnose where and when the exact organic traffic loss is happening.
Is there an exact date in Analytics where you dropped off in organic traffic? Or is it gradual? If it's an exact date, you can match it up against the algorithm update history and determine whether it's Panda- or Penguin-related.
Second, I would look at your average position report in Webmaster Tools (under Queries) to see if your rankings have dropped. Sometimes traffic loss can occur without a ranking drop, so you'd want to confirm that's in fact what's happening. Or maybe you're already tracking rankings.
If it is lost rankings, then you may want to start segmenting organic traffic in Analytics. Is the loss across all keywords? Can you pinpoint a few high-volume keywords that were hit? Or were certain pages hit?
Let's walk through the site a little on video:
(the first few seconds is a little jumpy, sorry!)
First time trying this in the Moz Q&A. Let me know what you think - helpful?
Lastly - your question about the crazy indexation numbers in WMT. I see this a lot, too. It tells me Google is not sure which pages on your site are important. There are probably a lot of unimportant extra pages being crawled. I would mitigate this with a great XML sitemap, as well as by noindexing things like subpages and pages that present the same content merely filtered and/or sorted differently.
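As a sketch of what that noindexing might look like, you could add something like this to the head of each filtered/sorted variant of a listing page (whether to also noindex deep pagination subpages is a judgment call for your site):

```html
<!-- Example: keep a filtered/sorted variant of a category page out of the
     index, while still letting crawlers follow the links on the page -->
<meta name="robots" content="noindex, follow">
```

The "follow" directive matters here: the duplicate-ish page drops out of the index, but link equity can still flow through it to the product pages it lists.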
Hope that helps!
-Dan