Penguin Rescue! A lead has been hit and I need to save them!
-
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics shows a drop after the 25th, and a backlink check shows a lot of low quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done?

First of all I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try and remove the bad links? Or just hit the site with good content and good links to outweigh the bad ones?

Also, do you think G is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things.

Any advice is appreciated. Thanks
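To make the on-site check concrete, here is a rough sketch of the kind of quick over-optimisation audit I have in mind: parse a page and see whether the money keyword is stuffed into the title, headings, and body copy. This is just my own illustration with made-up names, and any thresholds you apply to the numbers are guesses, not anything Google has published:

```python
from html.parser import HTMLParser
from collections import Counter
import re

class OnPageAudit(HTMLParser):
    """Collect the title, headings, and visible words from one HTML page."""
    def __init__(self):
        super().__init__()
        self._stack = []     # open-tag stack, so we know where text lives
        self.title = ""
        self.headings = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        current = self._stack[-1] if self._stack else ""
        text = data.strip()
        if not text:
            return
        if current == "title":
            self.title += text
        elif current in ("h1", "h2", "h3"):
            self.headings.append(text)
        if current not in ("script", "style"):
            self.words += re.findall(r"[a-z']+", text.lower())

def keyword_density(html, keyword):
    """Summarise how hard one keyword is being pushed on a page."""
    parser = OnPageAudit()
    parser.feed(html)
    counts = Counter(parser.words)
    total = sum(counts.values()) or 1
    kw = keyword.lower()
    return {
        "keyword_in_title": kw in parser.title.lower(),
        "h1_count": sum(1 for h in parser.headings if kw in h.lower()),
        "density_pct": round(100 * counts[kw] / total, 2),
    }
```

If the exact-match keyword shows up in the title, every heading, and a double-digit share of the body text, that page is a candidate for toning down.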
-
This sounds like a plan. Give it a shot and test the results.
-
Does anyone care to share their view on my last post?
I have run backlink checks and found site-wide footer links from 2 of their other businesses. This has created thousands of backlinks with the exact same anchor text. Do you think this could cause a problem?
I'm thinking of reducing it to just 2 links each from the 2 sites.
Other than that the backlink make-up looks pretty normal, except for the repeated anchor texts.
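For what it's worth, this is roughly how I'm sizing up the anchor-text repetition from a backlink export. The CSV column names here are assumptions (adjust them to whatever your backlink tool actually exports), and the "site-wide suspect" threshold is just my own rule of thumb for "many links, very few domains":

```python
import csv
from collections import Counter

def anchor_profile(csv_path, anchor_col="Anchor Text", domain_col="Source Domain"):
    """Summarise anchor-text repetition in a backlink export CSV.

    Flags anchors that look like site-wide placements: lots of links
    concentrated on a handful of referring domains.
    """
    anchors = Counter()
    domains_per_anchor = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row[anchor_col].strip().lower()
            anchors[anchor] += 1
            domains_per_anchor.setdefault(anchor, set()).add(row[domain_col])
    report = []
    for anchor, n in anchors.most_common():
        domains = len(domains_per_anchor[anchor])
        report.append({
            "anchor": anchor,
            "links": n,
            "domains": domains,
            # Heuristic only: >100 links from <=3 domains smells like a footer link
            "sitewide_suspect": n > 100 and domains <= 3,
        })
    return report
```

A few thousand links with identical anchor text from 2 domains will dominate the top of that report, which is exactly the pattern I'm worried about.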
-
Thanks for all the responses guys, I have taken them on board.
-
I second the time frame issue. 1 month won't be enough time and your work will just benefit the next person this client gets to work on it, while you'll be left with an upset client because of bad expectations.
-
"you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again."
I agree with this 100%.
These types of problems can be fixed and then must wait until google reevaluates and then republishes back into the SERPs. Sites that are hit with these types of problems escape in batches - not when things are fixed.
So, you could do great work, get it fixed on the 25th day, and then Google does not reprocess and republish for 60 more days, and some other SEO gets credit for your hard work.
I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo.
Exactly... What are good links? Your "added" links will not be natural.
-
Well, from what everyone is writing about Penguin, it's an algorithmic update. Meaning you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again.
I think the timeline you have set is highly unrealistic, and you should aim to set expectations with the client that this process can very well take much longer. If this previous SEO company built problematic links, I think you'll have to deal with them. I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo. I think you're going to have to go through the tedious work of cleaning things up. The good news is that a bunch of people have written about what to look for. Check in Webmaster Tools for sitewide links, and check the anchor text pointing into the site. Export your external links from OSE and upload them to Link Detective (http://linkdetective.com/) - let it do the hard work of classifying a lot of the links for you. Then go through the process of cleaning things up, doing as much as you can, and submit a reinclusion request (may help, may not), hoping Google will discard the other links.
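Once you've classified the links and done what manual removal outreach you can, the leftovers can go in a disavow file. A minimal sketch of generating one (the helper name is my own; the file format itself - '#' comment lines, bare URLs, and `domain:` lines - is the one Google's disavow tool accepts):

```python
def build_disavow_file(bad_urls, bad_domains, out_path="disavow.txt"):
    """Write a disavow file: '#' comments, individual URLs, and
    'domain:example.com' lines for whole referring domains."""
    lines = ["# Links we could not get removed manually"]
    lines += sorted(set(bad_urls))                      # one URL per line
    lines += [f"domain:{d}" for d in sorted(set(bad_domains))]
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")
    return lines
```

Keep a record of your removal-request emails too - showing the cleanup effort matters more for the reinclusion request than the file itself.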
Good luck! Really try to demonstrate to your client the complexity of the process and extend the timeframe of the project - that's my ultimate recommendation.