Penguin Rescue! A lead's site has been hit and I need to save them!
-
I had a meeting today with a prospective client whose site has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics shows a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they still rank OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done?

First of all, I will check the on-site optimisation and ensure the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones? Also, do you think Google is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things. Any advice is appreciated. Thanks
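For the on-site check, the first thing I'll script up is a rough keyword-density pass over the page copy. Just a sketch under my own assumptions: the tokenising and any "over-optimised" cutoff are mine, not anything Google publishes.

```python
import re

def keyword_density(text, phrase):
    """Rough over-optimisation check: the share of words in the visible
    page text that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count non-overlapping occurrences of the phrase as a word sequence.
    hits = 0
    i = 0
    while i <= len(words) - len(phrase_words):
        if words[i:i + len(phrase_words)] == phrase_words:
            hits += 1
            i += len(phrase_words)
        else:
            i += 1
    return hits * len(phrase_words) / len(words)
```

Anything well above a few percent for a single money phrase is the sort of copy I'd look at rewriting first.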
-
This sounds like a plan. Give it a shot and test the results
-
Does anyone care to share their view on my last post?
-
Thanks for all the responses, guys. I have taken them on board. One thing I have noticed...
I have run backlink checks, and they have sitewide footer links from 2 of their other businesses. This has created thousands of backlinks with the exact same anchor text. Do you think this could cause a problem?
I'm thinking of reducing it to just 2 links each from the 2 sites.
Other than that, the backlink profile looks pretty normal except for the repeated anchor text.
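For reference, this is roughly how I'm spotting the sitewide pattern in the backlink export. A rough sketch only; the (source URL, anchor text) pair format and the 50-link cutoff are my own assumptions about the export, not any known threshold:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def sitewide_suspects(backlinks, min_links=50):
    """Flag referring domains that look like sitewide (footer/sidebar) links:
    many linking pages from one domain, all sharing one anchor text.
    `backlinks` is an iterable of (source_url, anchor_text) pairs, e.g.
    parsed out of an OSE or Majestic export."""
    by_domain = defaultdict(list)
    for url, anchor in backlinks:
        domain = urlsplit(url).netloc.lower().removeprefix("www.")
        by_domain[domain].append(anchor.strip().lower())
    suspects = {}
    for domain, anchors in by_domain.items():
        if len(anchors) >= min_links and len(set(anchors)) == 1:
            suspects[domain] = (len(anchors), anchors[0])
    return suspects
```

On this client it immediately surfaces the 2 sister-business domains, since every one of their pages carries the same footer anchor.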
Thanks
-
I second the time frame concern. One month won't be enough time, and your work will just benefit whoever this client hires next, while you'll be left with an upset client because of poorly set expectations.
-
"you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again."
I agree with this 100%.
These types of problems can be fixed, but then you must wait until Google re-evaluates the site and republishes the results back into the SERPs. Sites hit with these types of problems escape in batches, not the moment things are fixed.
So you could do great work, get it fixed on the 25th day, and then Google might not reprocess and republish for 60 more days, and some other SEO gets credit for your hard work.
I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo.
Exactly... What are good links? Your "added" links will not be natural.
-
Well, from what everyone is writing about Penguin, it's an algorithmic update. Meaning you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again.
I think the timeline you have set is highly unrealistic, and you should set expectations with the client that this process can very well take much longer. If this previous SEO company built problematic links, I think you'll have to deal with them. I don't think pointing good links at the site will get rid of the issue with the problematic links and clear you of the algo; you're going to have to go through the tedious work of cleaning things up. The good news is that a bunch of people have written about what to look for. Check Webmaster Tools for sitewide links, and check the anchor text pointing into the site. Export your external links from OSE and upload them to Link Detective (http://linkdetective.com/); let it do the hard work of classifying a lot of the links. Then go through the process of cleaning things up, doing as much as you can, and submit a reconsideration request (it may help, it may not), hoping Google will discard the remaining links.
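If you want to eyeball the anchor-text spread yourself before uploading the export anywhere, a quick sketch along these lines does it on any list of anchors pulled from the export. The 30% cutoff is an arbitrary illustration for flagging, not a known Penguin threshold:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Summarise anchor-text spread from a backlink export.
    Returns (anchor, count, share-of-profile) tuples, most common first.
    A natural profile is dominated by brand/URL anchors; a heavy share of
    one exact-match commercial phrase is the pattern to worry about."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return [(anchor, n, n / total) for anchor, n in counts.most_common()]

def risky_anchors(anchors, threshold=0.3):
    """Anchors whose share of the whole profile meets `threshold`.
    It deliberately doesn't exclude anything: judge brand terms vs.
    money terms yourself."""
    return [(a, share) for a, n, share in anchor_distribution(anchors)
            if share >= threshold]
```

Run that over the export, and the sitewide footer anchors this thread mentions will sit right at the top of the list.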
Good luck. Really try to demonstrate to your client the complexity of the process and extend the timeframe of the project; that's my ultimate recommendation.