Penalty or Algorithm hit?
-
After the Google algorithm was updated, my site took a week-long hit in traffic. The traffic came back a week later and was doing well, and a week AFTER the algorithm change I decided to set up a 301 redirect to make sure I didn't have duplicate content (www vs. non-www). I called my hosting company (I won't name names, but it rhymes with Low Fatty) and they guided me through the supposedly simple process. Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately two weeks before I discovered it, and it came along with a subsequent massive hit in traffic. I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to redirect everything to www. It is a couple of weeks later and my traffic is still in the dumps. In WMT, instead of getting traffic from 10,000 keywords, I'm getting it from only 2,000. Is my site the victim of some penalty (I have heard of the sandbox), or is my site simply lower in traffic due to the new algorithm? (I checked my analytics data and found that only US traffic is cut by 50%; it is the same as before outside the US.) Could someone please tell me what is going on?
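For reference, the .htaccess redirect described above usually looks something like this (a sketch, assuming Apache with mod_rewrite enabled; example.com is a placeholder for your own domain):

```apache
RewriteEngine On
# Send any non-www request to the www hostname with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With a proper 301 in place, both versions resolve to one hostname and the www vs. non-www duplicate-content concern goes away; you can verify the redirect actually returns a 301 status with `curl -I` or a header checker.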
-
Michael,
If you got hit on the 24th of February, this was the Panda algorithm update.
First, are you sure that your content is 100% unique and that you have a high-quality site? I would go to
http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en&start=800
This thread is dedicated to people who have a high-quality site that has been negatively affected by this change. A Google employee will take a closer look.
On the other hand, the things you can do to help your site are (this is my opinion; webmasters and SEOs are still trying to figure out how to recover, and which criteria triggered the Panda update on their sites):
- Trustworthy UI (user interface). Your website looks dated; see if there is a possibility of rebuilding it on a robust CMS.
- Site speed
-
Can I post analytics data, or do I have to edit it first?
CLIFFS:
Site Traffic Drops 50% on Feb 24th and continues to ...
Traffic rises back to 100% March 3rd - 8th
Traffic drops back down to 50% on March 9th - Day after host advised me poorly and changed IP...
Traffic has stayed at 50% of its former level from then to this day, and it is killing me financially
-
Okay, if this is the case, what are webmasters recommended to do? Increase site speed? Any links appreciated.
Thanks
-
I think you're talking about askthetrainer.com. After a short analysis, I am sure that this is not a penalty; your site might have been harmed by the Google Panda update.
-
Michael, answering your question fully will require analyzing your site, and probably your traffic and GWT data. If you can post your site URL, I'll take a brief look at it. If you can put together a traffic data graph showing your drops in traffic and how they coincided with the changes you were making to your site, that would be helpful, too.
Related Questions
-
What are the top 3 directives to prepare for a Google algorithm update?
Our company's site fluctuated in keyword rankings last Friday due to an unnamed algorithm update. Our directives are on-page optimization and continual content generation. What other directives should we take?
Algorithm Updates | ejcruz
-
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz community because we're completely stumped. I work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:

We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword-targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started.

Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, across a good range and at different levels (mostly sitting around pages 3-4) depending on competition. Local keywords in particular were doing well, with a good number sitting on pages 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good.

Then, as of the 28th of September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They lost literally all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they had been ranking for.

So, there's the problem. The keywords have not shown any sign of recovery yet and we're, understandably, panicking. The worst thing is that we can't identify what caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it. There are no messages or warnings in GWT. The link profile is small but high quality. When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and the drop seems far too severe. The site is technically sound. There are no duplicate content issues or plagiarised content. The site is being indexed fine. Moz gives the site a spam score of 1 (out of 11, I think that's right). The site is on an OK server, which hasn't been blacklisted or anything.

We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Has anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
Do you think this page has been algorithmically penalised or is it just old?
Here is the page: http://www.designquotes.com.au/business-blog/top-10-australian-business-directories-in-2012/ It's fairly old, but when it was first written it hit #1 for "business directories". After a while it dropped, but it was still receiving lots of traffic for long-tail variations of "business directories Australia". As of the 4th of October (Penguin 2.1), it lost traffic and rankings entirely. I checked its link profile and there isn't anything fishy. From Google Webmaster Tools: https://docs.google.com/spreadsheet/ccc?key=0AtwbT3wshHRsdEc1OWl4SFN0SDdiTkwzSmdGTFpZOFE&usp=sharing In fact, two links are entirely natural: http://blog.businesszoom.com.au/2013/09/use-customer-reviews-to-improve-your-website-ranking/ http://dianajones.com.au/google-plus-local-equals-more-business-blog/ Yet when I search for a close match to the title in Google AU, the article doesn't appear within even the first four pages: https://www.google.com.au/#q=top+10+Australian+Business+Directories&start=10 Is this simply because it's an old article? Should I rewrite it, update the analysis, and use a rel=canonical on the old article pointing to the new one?
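For reference, the rel=canonical approach mentioned above is a single link tag in the head of the old article; a minimal sketch (the target URL below is a hypothetical placeholder for the rewritten article):

```html
<!-- In the <head> of the old 2012 article; the href is a hypothetical new URL -->
<link rel="canonical" href="http://www.designquotes.com.au/business-blog/top-australian-business-directories-updated/" />
```

rel=canonical is a hint that the new URL is the preferred version; if the old article no longer needs to stay accessible at its own URL, a 301 redirect consolidates signals more decisively.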
Algorithm Updates | designquotes
-
Do Panda/Penguin algorithm updates hit websites or just webpages?
If I have a website that has been affected by the Panda/Penguin updates, do bad links affect the entire site or just the pages the bad links point to? If it is the latter, and Penguin/Panda actually affect webpages, not websites (as is the common conception), then wouldn't simply creating a new URL, targeting that new URL, shifting the meta tags, and restarting link-building efforts (this time using the right quality strategies) be a really common-sense approach, instead of the tedious disavow approach that so many go down?
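For reference, the disavow approach mentioned above uses a plain text file submitted through Google's disavow links tool, one domain or URL per line; a minimal sketch (the domains below are placeholders):

```text
# Paid links; site owner did not respond to removal requests
domain:spammy-directory.example
# A single page linking to us that we want discounted
http://low-quality-blog.example/guest-post-with-link.html
```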
Algorithm Updates | Gavo
-
Considering the Panda algorithm updates, would you recommend reducing a high number of inbound links from a single website?
My website has a significant number of inbound links (1,000+) from a single website, due to a sponsorship-level contribution. Both my website and the other are authorities in the industry and in search results (PR of 5). Since even ethical websites can suffer a penalty from each iteration of Panda, I'm considering significantly reducing the number of links from this website. Do you think that measurable change would be seen favorably by Google, or would the drop in links be detrimental?
Algorithm Updates | steelintheair
-
When was the last algorithm update? One of my pages has dropped significantly this week
One of my pages dropped 22 places last week and I'm not sure why. Can anybody give me some suggestions as to why this might have happened?
Algorithm Updates | lindsayjhopkins
-
Penalty for Mixing Microdata with Metadata
The folks who built our website have insisted on including both microdata and metadata on our pages. What we end up with is something that looks like this in the header: <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes."> It seems to me that this would be a bad thing; however, I can't find any info leaning one way or the other. Can anyone provide insight on this?
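For comparison, a common way to keep the two separate (a sketch; per the HTML microdata spec, a meta tag carrying itemprop belongs inside an itemscope element rather than doubling as the page's standard description):

```html
<head>
  <!-- Standard metadata: used by search engines for result snippets -->
  <meta name="description" content="Come buy your shoes from us, we've got great shoes.">
</head>
<body itemscope itemtype="http://schema.org/Store">
  <!-- Microdata: the same text, scoped to the Store item for structured data -->
  <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
</body>
```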
Algorithm Updates | | markcely0 -
When did Google start factoring results-per-page display settings into their ranking algorithm?
It looks like the change took place approx. 1-2 weeks ago. Example: A search for "business credit cards" with search settings at "never show instant results" and "50 results per page", the SERP has a total of 5 different domains in the top 10 (4 domains have multiple results). With the slider set at "10 results per page", there are 9 different domains with only 1 having multiple results. I haven't seen any mention of this change, did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
Algorithm Updates | BrianCC