Which Algorithm Change Hurt the Site? A Causation/Correlation Issue
-
The attached graph is from Google Analytics: about 14 months of organic Google visits correlated with algorithm changes (the update dates come from Moz, naturally).
Is there any way to tell from this which update affected the site? For example, #1 or #2 seems to be responsible for the first dip, but #4 seems to fix it, and it breaks again around #6. Or is the rise between #4 and #7 an anomaly, and did #1 or #2 actually cause a slide from its release all the way up to the release of #7?
Sorry if the graph is a little cloak-and-dagger. That's partly because we don't have permission to reveal much about the site's identity, and partly because we were attempting a kind of double blind, separating the data from our biases.
We can say, though, that the difference between the levels at the start and end of the graph is at least 10,000 visits per day.
-
It's really tough (and even inadvisable) to try to pin a traffic change to an algorithm update based solely on spikes in a graph. On rare occasions it's pretty clear (Penguin is a good example, I've found), but in most cases there are a lot of gray areas, and the graph leaves out a mountain of data.
The big issues I see here are seasonality and not knowing what happened to the site and the business. For example, you can look at #6 and #7 and call them dips, but that ignores the spike: is the dip the anomaly, or is the spike the anomaly? What drove traffic up between #4 and #6? Maybe that driver simply stopped, was a one-time event, or was seasonal.
Why was there volatility between #7 and #14, and then relative stability after #14? You could call #14 a "drop," but without knowing the timeline it's hard to see how the curve might smooth over different windows. What it looks like to me is a period of highly volatile events followed by an evening out.
Without knowing the industry, the business, and the history, and without segmenting this data, making claims based solely on dips and spikes in the graph is pretty dangerous, IMO. In theory, this could have virtually nothing to do with the algorithm.
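A crude way to take some of the eyeballing out of the "is the dip or the spike the anomaly?" question is to flag days that deviate sharply from a trailing baseline, then check whether the flagged days cluster around update dates. A minimal sketch in plain Python; the data, window size, and threshold below are all made up for illustration:

```python
from statistics import mean, stdev

def flag_anomalies(visits, window=14, threshold=2.0):
    """Return indices of days whose visit count deviates from the
    trailing `window`-day mean by more than `threshold` std devs."""
    flagged = []
    for i in range(window, len(visits)):
        baseline = visits[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(visits[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Hypothetical series: stable traffic with one sharp dip on day 20.
visits = [10000 + (i % 3) * 50 for i in range(30)]
visits[20] = 6500
print(flag_anomalies(visits))  # → [20]
```

Days flagged this way that do not sit near a red line are evidence that something other than the algorithm (seasonality, a campaign ending, a site change) moved the needle.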
-
I don't understand how dates would help. Was it not clear that the red lines mark the dates of the algorithm updates?
By abstracting the data, the hope was to gain insight into how to read graphs like this in relation to updates generally, not just to get help with these specific updates, which wouldn't help much the next time we have to deal with a traffic drop. More a question of how to think than what to think.
Reading between the lines, are you saying that different algorithm changes take different amounts of time to kick in, and that's why a more detailed graph would be more useful? For example, if #1 were the first Penguin update, would your response be different than if it were the first Panda update?
-
You can use the Google Penalty Checker tool from Fruition: http://fruition.net/google-penalty-checker-tool/
I wouldn't trust the tool's results 100%, but it at least gives you an initial analysis; you'll then need to go deeper to double-check whether that initial analysis is actually relevant.
- Felipe
-
This doesn't tell me anything. If you at least had dates in there, you could compare the traffic dips to Google algorithm updates/refreshes.
I understand you can't reveal the domain, but I'll be shocked if somebody here can tell you anything without further information. This place is full of brilliant minds, but it would take some sort of mind reader to tackle this...
Related Questions
-
Stuck on SEO duplication issues in Shopify
Hey there, we have been working on some of our webshops and recently started with Analytics/Moz, but we have basically hit a brick wall with www.krawattenwelt.de: we have 5k high-priority issues (duplicate content) and 20k medium-priority issues. I have tried a large number of solutions for the duplicate content issues, but none worked, so we basically reverted for now, and I have the feeling I am really running out of options. Is there anyone who has an idea on how to do this? The duplicate content issues are as follows, for example: http://krawattenwelt.de/collections/budget-9-15 has issues with http://krawattenwelt.de/collections/budget-9-15/modell_normal and with http://krawattenwelt.de/collections/budget-9-15/modell_normal?page=1
Reporting & Analytics | WebMaster205
Automated XML Sitemap for a BIG site
Hi, I would like to set up an automated sitemap for my site, but it has more than a million pages. It would need to be a sitemap index with separate sitemaps for different parts of the site (i.e. news, video), and I'll want a news sitemap and a video sitemap as well (of course). Does anyone have a recommended way of building this, and how often would you recommend updating it? For news and video, I would like updates to be pretty immediate if possible, but the static pages don't need to be updated as often. Thanks!
Reporting & Analytics | mattdinbrooklyn
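For what it's worth, the sitemap protocol caps each file at 50,000 URLs, so a million-plus-page site needs a sitemap index pointing at many chunk files. A minimal sketch of the splitting logic; the domain and file-naming scheme here are hypothetical:

```python
from xml.sax.saxutils import escape

MAX_URLS = 50000  # per-file limit in the sitemap protocol

def build_sitemaps(urls, base="https://example.com/sitemaps"):
    """Split `urls` into <=50k chunks; return (index_xml, [chunk_xml, ...])."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    files = []
    for chunk in chunks:
        entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
        files.append('<?xml version="1.0" encoding="UTF-8"?>'
                     '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
                     f"{entries}</urlset>")
    index_entries = "".join(
        f"<sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(len(chunks)))
    index = ('<?xml version="1.0" encoding="UTF-8"?>'
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
             f"{index_entries}</sitemapindex>")
    return index, files
```

In practice you would regenerate the news/video chunks on a fast schedule and the static-page chunks far less often, since the index file lets you update each chunk independently.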
.com site referral traffic to ccTLDs
We have 7 international domains set up alongside our main .com site. All of the ccTLDs show their main referral traffic as coming from the .com site in GA, with most of it from mobile. Each site is set up correctly with geo-targeting and hreflang tags. Has anyone experienced this before?
Reporting & Analytics | ggpaul562
Large event site - how should I structure my URLs?
Hi guys, I'm working on a new website which consolidates a number of existing event sites into one. The existing sites use a variety of URL structures:
www.eventsite1.com/events/event-name
www.eventsite2.com/festival-program/event-name
www.eventsite3.com/event-name
This inconsistency has led to issues with tracking category usage properly in analytics. For instance, on eventsite3.com, events fall within categories (www.eventsite3.com/category-name), but as soon as you drill into an event detail page (www.eventsite3.com/event-name) from the category page, the category is lost to analytics. This is compounded when one event lives within multiple categories, as I can't figure out which category is the most effective for a particular event. I've seen other event sites establish a canonical URL under a primary category, display it in the URL (i.e. www.eventsite4.com/primary-category/event-name), yet still let that event be reached via its secondary categories (www.eventsite4.com/secondary-category/event-name). This way, the categories get passed to analytics without any duplicate content issues (i.e. via the setting of canonicals). Basically, I want to make sure that whatever instruction I give to the devs for the new site re: URL structure is correct from both an SEO and an analytics perspective. Do I even need to worry about having the category in the URL? Can someone please help me with this? Hope this makes sense. Cheers
Reporting & Analytics | cos2030
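The primary-category canonical pattern described above can be sketched in a few lines. The domain, slugs, and dict shape here are hypothetical, not taken from any particular CMS:

```python
def event_url(domain, category, event_slug):
    """Build an event URL under a given category path."""
    return f"https://{domain}/{category}/{event_slug}"

def canonical_tag(domain, event):
    # Whichever category path the visitor arrived through, the canonical
    # always points at the primary-category version of the URL, so
    # secondary-category URLs stay trackable without duplicate content.
    url = event_url(domain, event["primary_category"], event["slug"])
    return f'<link rel="canonical" href="{url}" />'

event = {"slug": "jazz-night", "primary_category": "music",
         "categories": ["music", "nightlife"]}
print(canonical_tag("www.eventsite4.com", event))
# → <link rel="canonical" href="https://www.eventsite4.com/music/jazz-night" />
```

Analytics then sees whichever category path was actually clicked, while search engines consolidate everything onto the primary-category URL.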
What type of links/redirect is Yahoo! using?
So I'm trying to figure out exactly what type of redirect or hyperlinking Yahoo! is using on their article pages. For example:
Reporting & Analytics | William.Lau
https://shopping.yahoo.com/blogs/fashionate/spring-clean-your-beauty-routine--10-tips-on-looking-fresh-this-season-000058218.html Hover over an external link and it shows you the final URL. Right-click or left-click it and it gives you a 302 redirect. When you actually left-click it, it adds an "id" attribute, I assume for tracking. However, once you left-click the hyperlink, it no longer shows as a 302. I have limited working knowledge of web development techniques, so anyone with advanced knowledge, or who has actually done this, it'd be helpful to understand this more.
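One guess at the mechanism being described, a 302 through a tracker that tacks an id parameter onto the destination URL, looks roughly like this. This is an illustration of the general pattern, not Yahoo!'s actual implementation:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tracking_redirect(target_url, click_id):
    """Return (status, Location) for a 302 that appends a tracking id
    to the destination URL's query string."""
    parts = urlparse(target_url)
    query = dict(parse_qsl(parts.query))
    query["id"] = click_id  # hypothetical tracking parameter
    location = urlunparse(parts._replace(query=urlencode(query)))
    return 302, location

print(tracking_redirect("https://example.com/page", "abc123"))
# → (302, 'https://example.com/page?id=abc123')
```

The click handler can swap the plain href for the tracker URL at click time, which is why inspecting the link before and after clicking shows different behavior.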
Why does SEOmoz show me "missing meta description" on this plugin: http://villasdiani.com/wp-content/plugins/dopbsp/frontend-ajax.php? Should I edit it? How? Is it even possible?
Good day to all! I am very confused about the results in the SEOmoz Crawl Diagnostics Summary, especially the 6 crawl warnings found. It says the title is too short for http://villasdiani.com/category/mombasa/, http://villasdiani.com/category/watamu/, and http://villasdiani.com/sitemap/. Why would Google punish me for this? Why should I make a longer title for the sitemap, or for Watamu? It is the name of the place: Watamu, or Mombasa. It is very confusing for me. I have a very big mess with the website and it is not ranking. What have I done? And is it possible to add a meta description for this plugin: http://villasdiani.com/wp-content/plugins/dopbsp/frontend-ajax.php? How? I do not even know where it is.
Reporting & Analytics | VillasDiani
Conversion rates by browser & OS - any feedback/experts/experience?
Hi, I've been evaluating conversion rates by operating system and by browser for a client, and I've picked up significant and somewhat disturbing trends. As you'd expect, the bulk of traffic comes from a Windows/Internet Explorer combination. This is unfortunately one of the worst-performing combinations (Windows/Firefox and Windows/Safari did worse; Chrome was significantly the best combination with Windows). Windows also performs much worse than Mac; e.g., Windows/Firefox performs worse than Mac/Firefox. The overall conversion rate for Mac is 7.07% compared to 5.69% for Windows. This is based on hundreds of thousands of visits and equates to tens of thousands of dollars' difference in revenue. Generally, later browser versions perform better on both main operating systems: e.g., IE 9.0 converts at 6.33% compared to 5.80% for IE 8.0 on Windows, and Firefox 4.01 on the Mac converts at 7.57% compared to 6.54% for 3.6.16 (although this dataset is smaller than Windows/IE). Page load speeds (recorded in the client's analytics) are significantly faster on Mac than Windows (as expected, really). Given that Windows/IE, and specifically Windows/IE8, represents the bulk of traffic, should we be addressing this? Will any optimisation negatively affect the better-performing Mac/browser combinations? Understanding that Mac users equate to 'better'-converting visitors, what else could be done there? Anyone have thoughts on or experience with optimising pages for improved conversion rates via IE and Windows? Thanks in advance, Andy
Reporting & Analytics | AndyMacLean
Googlebot encountered extremely large numbers of links on your site??? How do I resolve this?
I am working on a site with over 30 million pages. Every time I get about one million pages indexed, I get a message in Google Webmaster Tools saying "Googlebot encountered extremely large numbers of links on your site." Indexing then starts dropping like a rock. I need to get the site indexed. Please help!
Reporting & Analytics | GlobalFlex