Help, really struggling with fixing mistakes post-Penguin
-
We had previously implemented a strategy of paying for lots of links and focusing on 3 or 4 keywords as our anchors, which used to REALLY work (I know, I know, bad black hat strategy - I have since learned my lesson). These keywords and others have since plummeted as much as 100 spots since Panda 3.3 and Penguin. So I'm trying to go in and fix all our mistakes, because our domain is too valuable to us to just start over from scratch.
Yesterday I literally printed a 75-page document of all of our links according to Open Site Explorer. I have been going in and manually changing anchor text wherever I can, and taking down the most egregious links where possible. This has involved calling and emailing webmasters, digging up old accounts and passwords, and otherwise just trying to diversify our anchor text and remove bad links. I've also gone into our site and edited some internal links (also too heavily weighted toward certain keywords) and removed other links entirely.
My rankings have gone DOWN more today. A lot. WTF does Google want? Is there something I'm doing wrong? Should we be deleting links from all private networks entirely, or just trying to vary the anchor text? Any advice greatly appreciated. Thanks!
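For anyone doing a similar audit, the anchor-text check can be scripted rather than worked through from a printout. Here's a minimal sketch in Python, assuming you've pulled your links into (source URL, anchor text) pairs from an Open Site Explorer export; the 50% flagging threshold is my own arbitrary illustration, not any published Google limit:

```python
from collections import Counter

def anchor_report(links, threshold=0.5):
    """Flag anchor texts that make up a suspiciously large share of the profile.

    links: list of (source_url, anchor_text) tuples, e.g. from an
    Open Site Explorer CSV export. The threshold is an arbitrary
    illustration, not a published Google limit.
    """
    counts = Counter(anchor.lower().strip() for _, anchor in links)
    total = sum(counts.values())
    report = []
    for anchor, n in counts.most_common():
        share = n / total
        report.append((anchor, n, share, share >= threshold))
    return report

# Example: a profile heavily weighted toward one money keyword.
links = ([("a.com", "cheap widgets")] * 6
         + [("b.com", "Acme Co")] * 3
         + [("c.com", "click here")])
for anchor, n, share, flagged in anchor_report(links):
    print(f"{anchor}: {n} links ({share:.0%})"
          + (" <-- over-optimized" if flagged else ""))
```

Running something like this over the full export makes it obvious at a glance which two or three anchors are carrying an unnatural share of the profile.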
-
I would go through your list and remove the links rather than trying to vary anchor text at this point. I've been hit as well and moved to a domain I have held for years, but I am slowly removing bad links that are on networks or painfully outside my niche. I would suggest naturally building links slowly, with partial-match anchor text and with the majority of links using your brand as the anchor text.
-
Hi LilyRay,
Regarding your Penguin penalization, I would treat it like any other pre-Penguin link-based penalty. I have worked with many sites that have been penalized for manipulative linking, and the process to get the penalty lifted is always the same:
- REMOVE as many of the manipulative links as you can. It's the link itself that Google has classified as manipulative; the anchor text was just the identifier that helped them find it. Changing the anchor text of a manipulative link and leaving it up will keep the penalties associated with those links in place.
- DOCUMENT all of the steps that you're taking to eliminate manipulative links. Make a neat, bulleted list with the link(s), network(s), actions taken by you, and the results. In some cases you won't be able to remove a link. That's understandable, as they're not in your control. While you're at it, clean up ANYTHING else on your site that could be perceived as on-page spam. You're trying to prove to Google that you are a good citizen of the web, so make your site as sparkly as you can.
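That documentation can be kept as structured records and rendered into the bulleted list when you're ready to file. A small sketch, with the fields taken from the advice above (the example entries, dates, and network names are invented for illustration):

```python
# One record per link: what it is, where it lives, what you did, what happened.
# Field names and example values are my own, not a Google-required format.
log = [
    {"link": "http://spammy-blog.example/post1",
     "network": "BuildMyRank-style ring",
     "action": "Emailed webmaster 2012-05-02, followed up 2012-05-09",
     "result": "Link removed"},
    {"link": "http://old-directory.example/widgets",
     "network": "Paid directory",
     "action": "Requested removal via contact form",
     "result": "No response; link still live"},
]

def render_report(records):
    """Render the removal log as the bulleted list for a reconsideration request."""
    lines = []
    for r in records:
        lines.append(f"- {r['link']} ({r['network']})")
        lines.append(f"  Action: {r['action']}")
        lines.append(f"  Result: {r['result']}")
    return "\n".join(lines)

print(render_report(log))
```

Keeping the log machine-readable also means you can regenerate the report every time you make another round of removal requests.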
Once you've completed these steps, submit all of your documented work as part of your reconsideration request, to show Google that you're operating in good faith. Under normal circumstances, wait times for reconsideration requests can be anywhere from a week to a month. With the volume of reconsideration requests Google is receiving right now, I'd expect a longer wait.
I'm sure this process sounds painful, and it is, but it's the only way I've seen to recover effectively from a penalty.
-
It was partially out of my control - pressure from higher-ups for instantaneous results. I've always supported and wanted to stick to white hat SEO.
-
And promise yourself never to go for the quick and easy again.
-
Google has since released a 52-pack of updates following the rollout of Penguin, as well as Panda 3.6, which you may have been stung by almost immediately after the first hit.
SEOmoz provides an up-to-date change history of algorithm updates as soon as they are released.
Any backlinks you have that are associated with blog rings or networks - I would delete as many as you can. If a network has been identified and blacklisted by Google, then they'll be rolling out penalties for any domains that have used it. In parallel, build some natural links to balance out your link profile as soon as you can, too.
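Once you've identified the networks, splitting the backlink list into "on a known network" and "everything else" is easy to script. A minimal sketch, where the blacklist of network domains is something you'd maintain by hand (the domains and URLs below are invented for illustration):

```python
from urllib.parse import urlparse

# Hand-maintained set of root domains you've identified as blog networks.
# These entries are made up for illustration.
NETWORK_DOMAINS = {"blogring-one.example", "linkwheel.example"}

def split_backlinks(backlink_urls):
    """Partition backlink URLs into (on_network, other) by hostname."""
    on_network, other = [], []
    for url in backlink_urls:
        host = urlparse(url).hostname or ""
        # Match the domain itself or any subdomain of it.
        if any(host == d or host.endswith("." + d) for d in NETWORK_DOMAINS):
            on_network.append(url)
        else:
            other.append(url)
    return on_network, other

bad, ok = split_backlinks([
    "http://blogring-one.example/post-about-widgets",
    "http://sub.linkwheel.example/another-post",
    "http://legit-niche-site.example/review",
])
print(len(bad), "links to request removal for;", len(ok), "to keep")
```

The "on network" bucket becomes your removal-request worklist; the rest gets reviewed by hand for anything painfully outside the niche.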
Related Questions
-
Job Posting Page and Structured Data Issue
We have a website where we do job postings. We manually add the data to our website. The job postings are covered by various other websites, including the original recruiting organisations, and the details of a posting remain the same - for instance, the eligibility criteria, the exam pattern, the syllabus, etc. We create pages where we list the jobs, and keep the detailed pages, which contain the duplicate data, disallowed in robots.txt. Lately we have been thinking of indexing these pages as well, as the quantum of these non-indexed pages is very high. Some of our competitors have these pages indexed, but we are not sure whether doing this is going to be the right move, or whether there is a safe way to deal with it. Additionally, some job posts have very little data (fees, age limit, salary, etc.), which is thin content, so that might contribute to a poor-quality issue. Secondly, we wanted to use enriched result snippets for our job postings. Google doesn't want snippets to be used on the listing page: "Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details." Now, how do we handle this situation? Is it safe to allow in robots.txt the detailed pages that contain duplicate and sometimes not-so-high-quality job data?
Intermediate & Advanced SEO | dailynaukri
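On the structured-data half of that question, the Google guideline quoted above means the JobPosting markup belongs on each job detail page, not on the listing page. A minimal sketch of the JSON-LD for one detail page, built in Python; all of the values are invented for illustration, and the properties used (title, description, datePosted, hiringOrganization, jobLocation) are standard schema.org JobPosting fields:

```python
import json

# Illustrative JobPosting JSON-LD for a single job *detail* page (not the
# listing page). All values are placeholders.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Staff Nurse",
    "description": "<p>Eligibility criteria, exam pattern and syllabus...</p>",
    "datePosted": "2024-01-15",
    "validThrough": "2024-02-15",
    "hiringOrganization": {"@type": "Organization",
                           "name": "Example Recruiting Board"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress",
                    "addressLocality": "New Delhi",
                    "addressCountry": "IN"},
    },
}

# Wrap as the script tag that would go in the detail page's HTML.
snippet = ('<script type="application/ld+json">\n%s\n</script>'
           % json.dumps(job_posting, indent=2))
print(snippet)
```

Note that pages blocked in robots.txt can't have their markup read at all, so the indexing decision and the structured-data decision have to be made together.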
How to Implement AMP for Single Blog Post?
Hello Moz Team, I would like to implement AMP for a single blog post, not the whole blog. Is it possible? If yes, then how? Note - I am already using GTM for my website abcd.com, but I would like to use AMP for my blog posts only, and my blog lives at abcd.com/blog. To clarify, by blog post I mean something like abcd.com/blog/my-favorite-dress. Thanks!
Intermediate & Advanced SEO | Johny12345
Help With Href Lang Implementation
Hi, We've received the following hreflang errors in Google Search Console. After much research and playing around with the tags, we're still no closer to fixing the issues. Errors: http://prntscr.com/du8ei8 and http://prntscr.com/du8evi As you'll see, Google is telling us that some URLs do not have return tags, but they do seem to! Does anyone know of a pro who can provide a service to fix this for us? Many thanks in advance, Lee.
Intermediate & Advanced SEO | Webpresence
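The "no return tags" error above usually means page A lists page B as an hreflang alternate, but B doesn't list A back. That's easy to check yourself once you've crawled the annotations into a map of page → declared alternates. A minimal sketch, with invented URLs:

```python
# Map of page URL -> set of alternate URLs it declares via hreflang tags.
# Every alternate must list the origin page back, or Search Console
# reports a "no return tags" error. URLs are invented for illustration.
hreflang_map = {
    "http://example.com/en/page": {"http://example.com/fr/page"},
    "http://example.com/fr/page": set(),  # missing the return tag to /en/page
}

def missing_return_tags(hmap):
    """Return (alternate, origin) pairs where the alternate lacks a return tag."""
    errors = []
    for page, alternates in hmap.items():
        for alt in alternates:
            if page not in hmap.get(alt, set()):
                errors.append((alt, page))  # alt fails to link back to page
    return errors

for alt, page in missing_return_tags(hreflang_map):
    print(f"{alt} has no return tag for {page}")
```

Running this against your own crawl data should surface exactly the pairs Search Console is complaining about, which is much faster than eyeballing the tags.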
Robots.txt Help
I need help creating a robots.txt file. Please let me know what to add to the file. Any real or working example?
Intermediate & Advanced SEO | Michael.Leonard
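A working example is easy to verify locally with Python's built-in parser before you upload anything. The sketch below defines a minimal robots.txt (the /private/ path and sitemap URL are placeholders, not recommendations for any particular site) and checks that it behaves as intended:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: allow everything except /private/, and point
# crawlers at the sitemap. Paths and sitemap URL are placeholders.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: http://example.com/sitemap.xml
"""

# Verify the rules with the standard library's parser.
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "http://example.com/public-page"))   # allowed
print(rp.can_fetch("*", "http://example.com/private/page"))  # blocked
```

Checking the file this way catches typos (a stray space, a missing colon) before a crawler ever sees them.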
Help with homepage SEO please
Hi I have been looking after this site www.kids-academy.co.uk for 3 weeks now. I spotted that links were a major problem with the site and started to strip out the black hat inbound links straight away. I have also been doing some onsite optimisation for the main areas I have been asked to focus on, and this produced results within a week for the subpages, which is great. However, the design of the site meant there were no landing pages for the categories, and the menu is a permanent "fixture" called a megamenu. I have advised that landing pages are needed, not only for the ease of the end user in finding what they need but also for SEO. Now, the issue I have is that due to the menu style, Google sees every one of those links within the homepage - over 200 links! I am wondering if I should nofollow some of them, or get the client to change the style of the menu, as surely this is having a direct effect on the homepage, with the landing pages just not being seen at all within Google (as usual, Bing and Yahoo love the site). There is a lot of work to do on this site, but I would have thought I'd have seen some movement on the homepage at least by now. Any help is much appreciated. (Please note, there are some duplicate pages on there at the moment, as I amalgamated some pages together last night and need to redirect these, but I am having issues with redirect loops, so those are not a contributing factor as this is a recent change.) Thanks
Intermediate & Advanced SEO | LeanneSEO
Alternative Markup Challenge. Can anyone help?
I have a challenge around alternative markup. We currently operate a single domain with geo-targeted folders and alternative markup implemented. We are now looking to expand this out to non-English content.

Current implementation:
- All generic English-language content hosted on the main domain, with 5 English-language content variations (locales) available under a folder structure (.com/en-us/ etc.).
- Alternative markup is in place for all locales within the HTML, implemented automatically by developers via the CMS.
- Locale folders geo-targeted via GWT and Bing WT.

Planned launch:
- Introduction of 5 new non-English locale folders (e.g. /de-de/ etc.), targeted to their respective country and language.
- Content language will be mixed, with around 1/10 of pages translated and the other 9/10 of pages (business listings) keeping their body content in English, with headers/footers translated.
- Locale folders will be geo-targeted via GWT and Bing WT. Folder and markup usage TBC.

Options for folders:
- Implement folder structure /de/, attempting to indicate country but not language (issue: usually a single identifier indicates language, not country).
- Implement a /de-de/ folder structure to match the English locales and maintain correct country targeting (issue: some content is not in the target language).

Options for alternative markup:
- Do not make use of markup at all.
- Implement CMS-based automated markup on all English and non-English content throughout the new locales (e.g. /de-de/), but exclude the English-language versions (e.g. /en-gb/).
- Attempt manually implementing markup to bridge the English and non-English locales, potentially creating future issues as new content goes live and content is removed. A heavy risk.

The current approach is webmaster tools targeting, a /de-de/ folder structure, and automated implementation of markup. This means English-language URLs will have markup and non-English-language URLs will have markup, but they will not match up (e.g. English pages will never have markup for non-English-language content). If your minds haven't melted, what are your thoughts? Any help is much appreciated.
Intermediate & Advanced SEO | HelloAlba
Does a 301 redirect to a new domain remove the Penguin penalty?
Hi, One of my clients has a shady link profile and has been hit by the Penguin update. I have confirmed the penalty using the Google hack. Now, looking at his link profile, most of his links come from blog comments on unmoderated blogs, and there is no way we can remove those comments. But without removing them, we can't get rid of Google's Penguin penalty. So, I am planning on 301 redirecting to a new domain. But my question is, will the penalty transfer if I 301 to a new domain? And what if someone buys an old domain hit by a Penguin update? Please clarify, or if there are any alternatives for getting rid of the Penguin penalty, please help me.
Intermediate & Advanced SEO | Indexxess
Are External Links Really Bad for SEO?
I have a quality site with good PR and we have very few outbound links. We are always looking for good content since it is a blog, and I get frequent requests from "guest bloggers", some of whom actually provide really high-quality and unique content. But, of course, to get their content I have to give them at least one link. I have always been taught that external links really hurt your site, and that basically one outbound link cancels out one inbound link because they balance each other out. Is this true? I always wondered why we should get penalized for linking out to good sites. It makes me very "stingy" with my outbound links, whereas if my site would not be affected, I would want to be more generous. Any suggestions and info would greatly help! Thank you, Afshin
Intermediate & Advanced SEO | applesofgold