Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and Payday Loan 2.0/3.0?
-
Hello, I'd really appreciate your feedback here.
So my site is http://www.1stwebdesigner.com
I just used http://barracuda.digital/panguin-tool/ - and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and in the attachment there is a Google Analytics screenshot for that time, especially May 2014.
Can you please help me confirm that we have indeed been hit with the penalty? It has been 1.5 years already, and since then the traffic has never recovered.
Before May 2014 our site received 1.5 million pageviews/mo; since then, traffic has never been more than 600,000 pageviews/mo. If I look back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update on Thu, Jan 19, 2012. Even back then, before all these updates, the site had grown to 2.5 million pageviews/mo.
The painful part is that I always focused on time-consuming link building: submitting links to design news sites, leaving quality comments under related articles, and writing unique, quality content on the site for years.
Can you tell, based on the screenshots, and confirm that we indeed have a penalty? And maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. I would love to hear feedback on this; it has been painful throughout the years.
Update: here is one more screenshot from Barracuda, showing traffic from 2012 onward - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw
-
You're very welcome.
OK, so if we can see that the drop happened May 20, 2014, then there is a high possibility that Panda has affected your site. We can't always say that with certainty, though. I remember the week following that date, I had three separate sites come to me saying that they had been horribly hit by Panda. One of the sites had done a redesign and launched May 19. The redesign changed all of the title tags and the URLs, so it was not Panda's fault that they dropped in rankings. Another site had also redesigned, and the developer forgot to put the Google Analytics tracking code on the site. So, rankings didn't change, but it looked like traffic plummeted. And the final site had accidentally noindexed 90% of their site. I'm not kidding!
Still, a big drop on May 20 (give or take one day) means that there are on-site quality issues to address. While it's never wrong to audit your backlinks, I'd spend more time reviewing on-page factors. When Panda first came out, I spent a lot of time looking at thin and duplicate content. While that stuff, especially thin content, is still important, there are a LOT more factors that go into Panda. I firmly believe now that Panda is Google's way of figuring out what users prefer seeing. As such, I focus on three things:
-
Improving technical SEO.
-
Removing or vastly improving thin content. (And yes, redirecting to an appropriate page or category page is fine.)
-
Figuring out how the site can be the best of its kind for users. This involves looking at competitors' sites with a non-biased eye and also looking at analytics data to see if there are pages in the index that users are not happy with (i.e. they consistently bounce or spend very little time on the page compared to others.)
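To make that last check concrete, here is a rough sketch of the analytics comparison I mean, assuming you've loaded a Google Analytics landing-page export into pandas. The column names, sample data, and thresholds below are illustrative only, not from any real export:

```python
import pandas as pd
from io import StringIO

# Hypothetical Google Analytics export: one row per landing page.
# Column names and values are assumptions; adjust to match your export.
csv_data = StringIO("""page,sessions,bounce_rate,avg_time_on_page
/panda-guide/,1200,0.42,210
/thin-post-1/,900,0.91,12
/thin-post-2/,450,0.88,9
/great-tutorial/,3000,0.35,340
""")

df = pd.read_csv(csv_data)

# Flag pages whose engagement is far below the site's own norms:
# bounce rate well above the median and time on page well below it.
bounce_cutoff = df["bounce_rate"].median() + 0.2
time_cutoff = df["avg_time_on_page"].median() * 0.25

weak = df[(df["bounce_rate"] > bounce_cutoff) &
          (df["avg_time_on_page"] < time_cutoff)]
print(weak["page"].tolist())
```

The point is to compare pages against the rest of your own site, not against an absolute benchmark; the flagged pages are candidates for improvement or removal, not automatic deletions.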
Here is more on my approach to Panda:
https://mza.bundledseo.com/blog/have-we-been-wrong-about-panda-all-along
Best of luck!
Marie
-
-
Wow, that definitely helps, Marie! I didn't expect to receive a helping hand from such an SEO expert as you.
Thank you so much!
First of all, in Webmaster Tools there is no manual action.
Thank you for bringing attention to on-site issues; that's actually what I am doing now. Over 8 years we published more than 2,000 articles, and right now I am cleaning them out after seeing that 20% of the articles receive 80% of the traffic.
By any chance, can you suggest the best way to deal with deleted posts? Right now I am redirecting them all with a 301 redirect to a related, similar post, or if none is suitable I redirect to the home page, trying not to leave 404s. What would you recommend doing?
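To show what I'm doing, here is a sketch of how I generate the redirect rules in bulk. The URL mapping below is entirely made up for illustration; in practice it comes from my list of removed articles and their closest replacements:

```python
# Sketch: generate Apache "Redirect 301" rules for deleted posts.
# The URL mapping is hypothetical; real entries would come from your
# own list of removed articles and their closest replacements.
redirect_map = {
    "/old-thin-roundup/": "/design-roundups/",    # closest category page
    "/outdated-tutorial/": "/updated-tutorial/",  # direct replacement
    "/one-off-news-post/": "/",                   # no good match: home page
}

rules = [f"Redirect 301 {old} {new}" for old, new in redirect_map.items()]
htaccess_block = "\n".join(rules)
print(htaccess_block)
```

The generated lines can then be pasted into the site's .htaccess (or translated into the equivalent rules for whatever server is in use).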
Really good points. Well, I never syndicated posts; a lot of my link building involved submitting them to related web design news websites (Pligg/Digg-style voting sites, or sites where news appeared in a sidebar).
I did exchange blogroll links with maybe 10 friends' sites at most, but nothing massive.
And regarding May 20: the drop indeed happened then, comparing Tuesday, May 20 to Monday, May 19.
Actually, after research, and after following your disavow article on how to work with massive amounts of links in spreadsheets and trying out the Detox tool, I found there are certainly links I would want to disavow, but it's not as significant as I thought. I really think what you are saying is right - that Google simply stopped counting some links, and that's why rankings dropped.
The action plan is clear at least: first focus on fixing everything possible on-site before touching the Disavow tool. I really, really appreciate your helpful reply and you taking the time to give such insightful feedback to a simple guy like me.
Thank you again, Marie!
-
First off, thank you Peter for the recommendations of my articles.
It looks like there are a number of issues with your site. I wouldn't focus so much on just one algorithm, but rather, spend time looking at the entire site and its issues.
What phrases are you trying to rank for? Your title tag on your home page says, "The Community of Web Design Professionals" which I am guessing is not your money keyword. Keywords in the title tag (without stuffing), especially at the front of the tag are very important.
Next, I would get rid of the huge full page popup. It could possibly cause the page layout algo to affect your rankings. It can also cause people to quickly bounce away and look for another site which can be a signal to Google that users don't enjoy your site.
The Google PageSpeed Insights tool gives you a very low score. Improving page speed could make a significant difference.
You've got 1490 pages in Google's index. Do you really have 1490 high quality pages? Is it possible that there is some content indexed that doesn't need to be? For example, when I take text from this article: http://www.1stwebdesigner.com/find-qualified-designer/ and search for it in quotes on Google, I see 27 results. The only pages from your site that you want in Google's index are high quality, unique, and compelling articles and resources. With that said, if you wrote this article and other people copied it, I wouldn't be worried. But if you purposely syndicated it, then it could be an issue.
It's hard to say whether links are the issue. It doesn't seem like any of your drops coincided with a Penguin date, so I am guessing that Penguin is not suppressing you. (Can't say for sure without having a deep look.) But it is possible that you were previously getting some benefit from self-made links that Google can now determine are self-made as opposed to truly earned, and they may be discounting those links. You mentioned that you had created a lot of links on your own. As Google gets better and better at figuring out which links are truly earned as opposed to self-made, it may be that you've lost significant PageRank that used to come through these links.
With that said, have you checked Google Search Console --> Search Traffic --> Manual actions to make sure there is no manual action there? I see a lot of blogroll links and sometimes if there has been excessive link exchanging going on a site can get a manual action.
Now, back to the algo stuff. Can you tell via Google analytics if the big drop in May happened on May 20, 2014? If not, it's not Panda. If it is exactly on this date though, then I'd be spending more time looking at on-site issues than links.
Hope that helps!
Marie
-
@Peter - I read everything and was in the process of doing the audit, but Google Sheets keeps crashing. The site has around 1 million backlinks, and the formulas keep crashing even when working with just the 100,000 links from Search Console alone. Any tips?
-
Thanks Peter, I'm on my way then. Excellent help, really appreciate it. I'll let you know how it goes, if you're interested.
-
So you need to get all links from:
- SearchConsole
- Ahrefs
- Moz OSE
- Majestic - you need to verify your domain, and then you can download the fresh list for free. For historical data you need a paid account.
- WebMeUp - free with some limitations. A paid version is also available.
Once you get everything, you need to compile a mega sheet in Excel (or Google Sheets), and then you can do a "link profile audit".
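As a rough sketch of that merge step (and a way around Sheets crashing on huge lists), pandas can dedupe everything in memory. The file names and column names below are made up; each tool labels its URL column differently in its real exports:

```python
import pandas as pd

# Hypothetical backlink exports from several tools. In practice each
# would be loaded with pd.read_csv() from the tool's export file.
exports = {
    "search_console": pd.DataFrame({"url": ["http://a.com/1", "http://b.com/2"]}),
    "ahrefs":         pd.DataFrame({"url": ["http://b.com/2", "http://c.com/3"]}),
    "majestic":       pd.DataFrame({"url": ["http://c.com/3", "http://d.com/4"]}),
}

frames = []
for source, df in exports.items():
    df = df.copy()
    df["source"] = source  # remember which tool reported each link
    frames.append(df)

# One deduplicated "mega sheet": first occurrence of each URL wins.
merged = (pd.concat(frames, ignore_index=True)
            .drop_duplicates(subset="url", keep="first")
            .reset_index(drop=True))
print(len(merged))  # 4 unique linking URLs
```

The same approach scales to a million rows without the formula recalculation that makes Sheets choke; export the result back to CSV for the audit itself.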
But you also need to do a "Panda audit". This is a little tricky because you need to read almost everything from several authorities. Of them, I can recommend Glenn Gabe and Josh Bachynski; the first can be found at http://www.hmtweb.com and the second at http://themoralconcept.net/pandalist.html. They have also written for other sites like Moz, SEL, SEJ, etc. Of course, this doesn't mean there aren't other authors, like:
https://mza.bundledseo.com/blog/have-we-been-wrong-about-panda-all-along
Now let's go back to the Panguin tool. You need a very deep inspection in Analytics, comparing your own dates with the dates the algorithms were released:
https://mza.bundledseo.com/google-algorithm-change
and make a detailed analysis of your situation. IMHO there is also a big chance that you were hit by both algos.
-
Thank you so much, Peter, for your helpful reply. PSD2HTML is new to me; I'd already read about WPMU's crazy moves and the disavow guide, and I started working on it yesterday. It's just that a friend of mine who works at Ahrefs is still convinced that I shouldn't go down the disavow path.
Alright, working on the audit then. Thank you for the links already!
What other tools would you recommend, if you say Ahrefs alone isn't enough?
-
-
First, there is definitely some kind of "algo filter" on your website.
Second, you need to audit the website because there can be crossing updates:
https://mza.bundledseo.com/blog/the-danger-of-crossing-algorithms-panda-update-during-penguin-3
And that's why, with just an interactive chart, you can't be sure which filter is applied to your site. Panda and Penguin now apply their filters much more quickly, and you need a refresh to escape filtering.
About disavowing: I strongly recommend reading Marie Haynes' articles:
https://mza.bundledseo.com/blog/guide-to-googles-disavow-tool
https://mza.bundledseo.com/blog/5-spreadsheet-tips-for-manual-link-audits
(also check her other articles about this)
https://mza.bundledseo.com/blog/my-story-how-psd2html-worked-to-have-a-manual-penalty-revoked
https://mza.bundledseo.com/blog/how-wpmuorg-recovered-from-the-penguin-update
https://mza.bundledseo.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing <- this is a continuation of the previous article.
https://mza.bundledseo.com/community/q/disavow-straightaway-urgent <- just check the question and my second answer; it's too long to paraphrase here. But with Ahrefs alone you can't get the full picture; you also need the other tools.