Penguin 2.1 Penalty - Can't Understand Why I Was Hit by It?
-
Hi,
I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong. I want to know the root cause of the penalty so that I can overcome it.
Any help would be appreciated.
Website: http://tiny.cc/hfom4w
-
Where do you find the link profile?
-
Whilst over-optimisation is an issue, it is more about the quality of the websites you are getting links from than the anchor text used. You could have a brand link, but if it's on a poor-quality website you are still going to get penalised. If you haven't disavowed any links yet, I'd make sure that around 90% of them get disavowed.
You've also got duplicate content on your website so I'd say you've been hit by a Panda refresh as well as Penguin. Type in "To assist clients in getting value from their businesses, we provide Business Critical Software and IT services within various industry verticals." to Google and you will see 4 different versions of your content.
You need to do a full backlink analysis ASAP and be ruthless about which links you get rid of: if you wouldn't show a link to Google, then it's bad. Use the disavow tool in Webmaster Tools (the file format is sketched below), then file a reconsideration request with Google, detailing what you have done and why there were so many bad links.
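For reference, the disavow file is just a plain UTF-8 text file: lines starting with # are comments, "domain:" entries disavow every link from that domain, and bare URLs disavow a single page. The entries below are placeholders, not real domains from this profile:

```
# Low-quality directories from the link building spike; removal requests ignored
domain:spammy-directory-example.com
domain:another-low-quality-directory.net
# A single bad page on an otherwise fine domain
http://blog.example.com/paid-links-page.html
```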
It's a long road to take, but if you want the website to come back into the SERPs then you are going to have to do this.
Good luck!
-
Hi Yiannis and Remus,
Thank you for the replies. There are many competitor websites, like http://tiny[dot]cc/9tpm4w and http://tiny[dot]cc/dvpm4w, that are just targeting "Software Development". Why aren't they getting penalized?
-
Hello Chanpreet,
Like Yiannis says, it's probably related to over-optimized anchor text. To get more info, you could compare your anchor text link profile with that of a competitor that still ranks well in the SERPs.
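To make that comparison concrete, here is a minimal Python sketch, assuming you've exported each profile to a CSV with an anchor_text column (the filename and column name are placeholders for whatever your backlink tool exports):

```python
import csv
from collections import Counter

def anchor_distribution(csv_path):
    """Return each anchor text's share of the total link profile, largest first."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor_text"].strip().lower() for row in csv.DictReader(f)]
    counts = Counter(anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.most_common()}

# Run this on your own export and on a well-ranking competitor's export.
for anchor, share in list(anchor_distribution("my_backlinks.csv").items())[:10]:
    print(f"{share:6.1%}  {anchor}")
```

A heavy skew towards one commercial phrase in your file but not in the competitor's is the kind of pattern Penguin targets.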
-
Hello,
I had a quick look at your link profile and it seems that you had a big spike in link building activity from 17 July to 24 August (mostly from directory submissions). Then it goes quiet again, which looks unnatural. You have used contextual anchor text around the "software development" keyword, which covers pretty much 80% of your profile, making your website vulnerable to Penguin; my guess is that you got hit by the latest refresh/update.
I would suggest you make your anchor text look more natural and remove all those directory links you gained between July and August by using the disavow tool. Then monitor whether you move up the SERPs and report back.
Remember that you might not see immediate results; sometimes the algorithm needs a refresh before you see any noticeable changes.
Related Questions
-
Can affiliate links affect DA?
Hey guys, over the past two months my DA has gone down from 17 to 12, and I have no idea what could have caused it. I started putting some Amazon affiliate links into my posts - could that be the reason why? Also, I have about 30 backlinks from a blog with a spam score of 11% - could this also be affecting it in any way?
White Hat / Black Hat SEO | | AmyAed0 -
How many links can you have on sitemap.html?
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on one page of sitemap.html?
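Google has never published a hard per-page limit (their guidance has been to keep links to a "reasonable number"), so the practical step is to audit what a crawler would actually find. A minimal sketch using only the Python standard library, with a placeholder URL:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Count the <a href> links a crawler would encounter on a page."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# The URL is a placeholder -- point it at your own sitemap.html.
with urlopen("https://www.example.com/sitemap.html") as response:
    parser = LinkCounter()
    parser.feed(response.read().decode("utf-8", errors="replace"))
print(f"{parser.count} links found")
```

If the count runs into the thousands, splitting sitemap.html into several smaller hub pages is a common workaround.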
White Hat / Black Hat SEO | | imjonny0 -
What is the difference between rel canonical and 301s?
Hi guys, I have been told a few times to add the rel canonical tag to my category pages. However, every category page is actually different from the others, apart from the listings I have for my staff on each page. Some of my staff specialise in areas that cross over into other categories. But really, if I'm redirecting, e.g., Psychic Readings over to Love and Relationships just because 5 of my staff members are listed under both categories, the actual content and depth of each category (where skills are provided at different levels) don't justify creating a rel canonical from Psychic Readings over to Love and Relationships. Tell me, have I got this right or completely wrong? Here is an example: the Psychic Readings category https://www.zenory.com/psychic-readings and the Love and Relationships category https://www.zenory.com/love-relationships. Hope this makes sense - I really look forward to your feedback! Cheers
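The mechanical difference is easy to show: a 301 physically moves visitors and crawlers to the other URL, while rel=canonical leaves the page in place and only tells search engines which URL should get the credit. A minimal Flask sketch for illustration (the /psychic-readings-old route is made up, and rel=canonical more commonly goes in a <link> tag in the page head; the HTTP header form is shown here for brevity):

```python
from flask import Flask, make_response, redirect

app = Flask(__name__)

# 301: visitors and crawlers are both sent to the new URL,
# and the old URL eventually drops out of the index.
@app.route("/psychic-readings-old")
def moved():
    return redirect("https://www.zenory.com/psychic-readings", code=301)

# rel=canonical: the page still renders normally, but search engines
# are told which URL should receive the ranking credit.
@app.route("/love-relationships")
def canonical():
    resp = make_response("<html>...category page markup...</html>")
    resp.headers["Link"] = '<https://www.zenory.com/love-relationships>; rel="canonical"'
    return resp
```

Since the two category pages genuinely differ, neither a redirect nor a cross-category canonical seems justified here; a self-referencing canonical on each page is the safer default.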
White Hat / Black Hat SEO | | edward-may0 -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about the experts' opinions...
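As a sketch of the idea (leaving the SEO question itself aside), the throttle can live in a small WSGI middleware so one piece of code sits in front of every site on the server. The bot list and load threshold below are placeholder assumptions to tune; os.getloadavg() is Unix-only:

```python
import os
import random

BOT_MARKERS = ("bingbot", "ahrefsbot", "googlebot")  # assumption: bots to throttle
LOAD_THRESHOLD = 4.0  # assumption: tune to the server's core count

class BotThrottle:
    """WSGI middleware: answer a growing share of bot requests with 503
    as the 1-minute load average climbs past the threshold."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "").lower()
        load = os.getloadavg()[0]  # total server load, not per-site traffic
        if any(marker in user_agent for marker in BOT_MARKERS) and load > LOAD_THRESHOLD:
            # Reject a fraction of bot requests that grows with the overload.
            reject_share = min(1.0, (load - LOAD_THRESHOLD) / LOAD_THRESHOLD)
            if random.random() < reject_share:
                start_response("503 Service Unavailable",
                               [("Retry-After", "120"), ("Content-Type", "text/plain")])
                return [b"Server busy, please retry later."]
        return self.app(environ, start_response)
```

On the SEO side, 503 with a Retry-After header is the status Google documents for temporary unavailability, so intermittent 503s under load should be safe; serving them persistently is what risks slowing crawling or dropping URLs.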
White Hat / Black Hat SEO | | internetwerkNU1 -
Google is giving one of my competitors a quasi page 1 monopoly, how can I complain?
Hi,
When you search for "business plan software" on google.co.uk, 7 of the first 11 results come from 1 company selling 2 products, see below:
#1. Government site (related to "business plan" but not to "business plan software")
#2. Product 1 from Palo Alto Software (LivePlan)
#3. bplan.co.uk: content site of Palo Alto Software (relevant to "business plan" but only relevant to "business plan software" because it features and links to their Product 1 and Product 2 sites)
#4. Same site as #3 but a different URL
#5. Palo Alto Software Product 2 (Business Plan Pro) page on the Palo Alto Software .co.uk corporate site
#6. Same result as #5 but a different URL (the features page)
#7. Palo Alto Software Product 2 (Business Plan Pro) local site
#8, #9 and #10 are OK
#11. Same as #3 but the .com version instead of the .co.uk
This seems wrong to me as it creates an illusion of choice for the customer (especially because they use different sites), whereas in reality the results are showcasing only 2 products. Only 1 of Palo Alto Software's competitors is present on page 1 of the search results (the rest are on pages 2 and 3). Have some of you experienced a similar issue in a different sector? What would be the best way to point it out to Google? Thanks in advance, Guillaume
White Hat / Black Hat SEO | | tbps -
How many times should one submit the same article to various websites? 1 time? 10 times? What is okay to do with the most recent Panda update?
For link-building purposes, it was seemingly okay in the past to post the same article to multiple sites for links. However, after the most recent Panda update, our thought is that this may not be a good practice. So the question is: how many times is it okay to submit an article for link-building purposes? Should you always submit to only one site? Is it okay to do more than one? What is the right way to submit for link building in Google's eyes? Thanks
White Hat / Black Hat SEO | | Robertnweil10 -
Drop in non-branded organic search April 1
I saw an intense drop in non-branded organic search for major pages on my site on April 1st this year. The homepage wasn't affected, and it's not an annual thing. I've attached a screenshot showing the drop (1ry2a.png). I'm new to the company and recently learned that they had hired a pretty black-hat company last year, and I'm worried that this is Panda... although the timing seems wrong. Has anyone experienced Panda effects between the two updates? I'd love to get some feedback!
White Hat / Black Hat SEO | | CIEEwebTeam0 -
When to give up on a website with a Google penalty?
I recently had a Google -60 penalty hit my website. The main two issues were that I had a person helping me with SEO who bought some links, and that I own about 90 URLs in my vertical: I created about 60 one-page sites for these keyword-targeted domains and then linked those sites to my main site. Big mistake! I kept these URLs all on the same server as my main site. In October 2010 I noticed my site's hits dropped dramatically. I started looking for the cause, not knowing which issue had triggered the penalty. I fixed both issues in November 2010 and asked Google for reconsideration in early December 2010. I kept link building for my site by finding quality links. I was extremely honest with Google: I gave them all of the domains I own, and I told them the name of the person who bought links for me and the websites where those links were placed. As of late February 2011, a Google search for my domain still showed it in approximately the 64th position. I recently asked Google again to lift the penalty. I basically told them that I had fixed all of the issues that led to the penalty and let them know I had been waiting for almost 3 months. I told them I have put the past 2 years of my life into this website and begged them to forgive me. I also asked them to let me know if my site was never going to be forgiven. I got the typical canned response from the Google team, and as of today the penalty is still in effect. I just want to know when you should give up on a site. I have spent about $20,000 and about 2 years of hard work on this site. I don't want to give up, but I don't want to keep putting my hard work and time into the site if it will never escape the dreaded Google penalty. Do you think I should continue to wait, and if so, how long? Is there anything else I can do to persuade Google to release me from this penalty hell? If I do abandon the site and start from scratch, what steps should I take? Do I need a new server? What, if any, content can I take from my current site and transfer to the new site? If I can, how do I do this without getting another penalty or losing credit for the original content? I created about 2,000 pages of original content for this site, and I'd love to be able to transfer it if I have to start from scratch. Any ideas or detailed help plans would be greatly appreciated.
White Hat / Black Hat SEO | | tadden0