Panda, Negative SEO and now Penguin - help needed
-
Hi,
We are small business owners who've been running a website for five years, and it provides our income. We've done very little backlinking ourselves, and never used paid directories or anything like that - usually just occasional forum or blog responses, plus a few articles here and there using some of our keyword phrases for internal pages. Of course, I admit we've done some keyword-phrase backlinks on some blogs, but our anchor text profile is largely brand names, our domain name, and non-keywords (except for some "bad" backlinks). Our DA is 34 and our home page PA is 45.
We were doing great until last Sept 27, when we got hit by Panda. Since then we've been de-optimizing our site for keywords, we built a new site in WordPress for good architecture and ease of use for our customers, and we're deleting/repurposing low-quality pages and making our content more robust. We haven't yet recovered from this, and now it appears we got hit on May 22 by Penguin... ARGH!
I recently discovered (it's hard to find time to devote to everything with just two of us) that others can "negative SEO" a site now, and I feel this has happened to us based on the results below. I signed up for linkdetox.com yesterday and it paints a grim picture of our backlinks (it says we are in "deadly risk" territory).
We have 83 "toxic" links and 600-some "suspicious" links (many are on sites flagged for malware/malicious content, many are .pl domains from Poland, others are, I believe, foreign domains, domains that are a bunch of letters that make no sense, or spammy-sounding exact-match domains) - this makes up 80% of our links.
As this is our only business, our income is now a third of what it was, even with PPC ads running, as we've been hit hard by all of this, and we are wondering if we can survive fixing it.
We do have an SEO firm helping us minimally with guidance on recovering, but with income so low we are doing the work ourselves and can't afford much. Needless to say, we are quite distressed, and from reading around we're not sure we'll be able to recover - that is deeply saddening, especially if it's from negative SEO. We want to make sure we are on the right path for recovery, if possible, hence my questions.
We haven't been in contact with Google for reconsideration; again, we've had no penalty messages from them.
-
First of all, if we don't have a manual penalty, would you still contact all the toxic/malicious/porn-looking sites and ask for a link removal, wait, ask again, wait, and then disavow? Or just go straight to Google's disavow tool?
-
For backlinks coming from sites that are "gone" (e.g. a message saying the account has been suspended, or there is no website there anymore), do I try to contact them too? Or go straight to disavow? Or do nothing?
-
For the sites flagged as malicious (by linkdetox, my browser, or by Google), I don't want to open them in my browser to see whether the site is legitimate. If linkdetox doesn't have contact info for these, what are we supposed to do?
-
For "suspicious" foreign sites that I can't read the webpage -would you still disavow them (I've seen many here say links from foreign sites should be disavowed).
-
How do you keep up with all of this if someone is negative-SEOing you? We're really frustrated that Google's change has made it possible for competitors to tank your business (arguably, though, if we had a stronger backlink profile this may not have hurt us, or not as much - not sure). When you are small business owners and can't hire a team to constantly monitor backlinks, build quality backlinks, create content, optimize the site, etc., it seems an almost impossible task.
-
Is WordPress left-nav and footer link anchor text an issue for Penguin? I would think Google realizes these internal links will repeat the same anchor text on a WordPress site (I know Matt Cutts said not to use the same anchor text more than once for internal linking - but obviously nav and footer menus will do this).
-
What would you do if this were you? Try to fix it all? Start over with a new domain and 301 to it (some say this has been working)? Or just start over with a new domain and don't redirect?
Thanks for your input and advice. We appreciate it.
-
-
Hi Chris,
Thanks - BTW - I don't have 200 links in my footer - I have 200 pages on my site, with about 20 navigational links in the footer.
#3. Yes, that's what I would do.
#5. I mean a link that is/was helping you algorithmically to a certain extent, due to the authority that external page was able to accumulate.
#6. 200 links in your footer? I wouldn't put that in the non-crazy ridiculous category. You really only need footer links to help visitor navigation. A lot of people put them there to help with SEO but in most cases they're not really effective at that. I'd suggest getting rid of most of those.
-
Hi Chris,
Thanks for your response. So for #3, you wouldn't mess with malicious sites - just go straight to a disavow?
For #5 - I'm glad it's not "that easy" then for people to tank one another. Can you clarify what you meant by "It's the links that cross a certain threshold of strength AND fit the mold of inorganic that trip the filter"? I don't quite follow. Is this an example of: if you haven't done any link cleaning for 5 years (as in our case), we may have exceeded a certain threshold and now it will "bite us"?
For #6 - you say "Don't worry about anchor text within your own site, as long as it's not crazy ridiculous." Right now we have 200+ pages on our site, which makes 200+ footer anchor text links - I can't imagine this could be an issue, as we don't have a big site - do you agree? We did use a keyword phrase in our footer because it's descriptive of what the page is about and helps people understand it, but I have since changed it to be safe.
Our biggest traffic drop was right on Sept 27/28 - we dropped to half of the traffic we had prior to that - and I understand that was a Panda update. We do have some definite quality issues. I thought lining up your traffic drop with a Panda update was how you could tell?
Thanks
-
MLM,
1. Per Google, just disavowing links isn't enough to make your reconsideration request successful; G also wants to see a good-faith effort to get your toxic links removed whenever possible (see the sample disavow file sketched at the end of this answer).
2. If a site no longer exists and the pages that contained links to your site no longer exist, it may take some time for them to disappear from the crawl indexes, but you won't have to do anything about them--the links are non-existent.
3. For purposes of your reinclusion request, you should make note of the fact that the sites were marked as malicious and include those links in your disavow link file.
4. If you have reason to believe that the links on those foreign sites are inorganic, you can still attempt to contact the webmaster with your removal request and, if there is no response, add them to your disavow link file.
5. It's my conjecture that effective negative SEO is likely to be as expensive as link building for your own site, which means that someone going out and buying a bunch of low-quality links and pointing them at your site is not going to do anything. It's the links that cross a certain threshold of strength AND fit the mold of inorganic that trip the filter. Once tripped, however, the owner is stuck with cleaning up the whole mess.
6. Don't worry about anchor text within your own site, as long as it's not crazy ridiculous.
If it were my site, I'd be focusing on whether it was Panda or not. A site with a Penguin penalty may be salvageable, but one with a Panda penalty may not be. As far as your toxic links go, don't forget you can move your linked-to content to new URLs and leave the old ones to 404--it's an option if you don't have bad links pointing to your home page. 301ing to a new domain is generally considered to eventually carry your penalty over to the new domain as well, so I wouldn't go that route.
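In case it helps with #1 and #3, here is a minimal sketch of what a disavow file can look like (every domain below is made up). It's a plain text file, one entry per line, uploaded through the Disavow Links tool in Webmaster Tools: lines starting with # are just notes to yourself, domain: entries disavow every link from that domain, and bare URLs disavow a single page.
    # Asked for removal twice, no response
    domain:spammy-directory-example.pl
    domain:malware-example.com
    # Individual pages we could not get taken down
    http://link-farm-example.com/widgets-resources.html
    http://article-spinner-example.net/another-page.html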
Related Questions
-
Does Coverage impact SEO?
Do the coverage issues in Google Search Console (Google Webmaster) have an impact on SEO (CTR or impressions)? How much of a difference or impact will fixing these have on search results and average
Algorithm Updates | Rishardg
-
Are Meta Keywords still relevant in SEO?
Hello, are meta keywords still relevant to use in SEO? If yes, how many words are recommended? Regards, Leandro
Algorithm Updates | lmoraes
-
Case Sensitive URL Redirects for SEO
We want to use a 301 redirect rule to redirect all pages to a lowercase URL format. A 301 passes along most of the link juice... most. Will we even see a negative impact in PageRank/SERPs when we redirect every single page on our site?
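The rule we have in mind is roughly the following sketch (assuming Apache with mod_rewrite; as I understand it, the RewriteMap line has to live in the server or virtual host config rather than .htaccess):
    # Expose Apache's built-in lowercasing function (server/vhost config only)
    RewriteMap lc int:tolower

    RewriteEngine On
    # Only rewrite requests whose path contains an uppercase letter
    RewriteCond %{REQUEST_URI} [A-Z]
    # 301 the request to its lowercased equivalent
    RewriteRule (.*) ${lc:$1} [R=301,L]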
Algorithm Updates | tcanders
-
Do practitioner listings for the same business need to have different categories?
I'm trying to figure out an issue with practitioner listings, and I wanted to ask more about this because in this Q&A thread ( http://www.seomoz.org/q/multiple-businesses-at-the-same-address-avoiding-google-places-trouble ) it was stated that each listing needs to have a different category. Sorry if this question is redundant, I just wanted to make sure I have a clear understanding before proceeding.... I'm managing local listings for a senior healthcare center that has 4 locations, and multiple practitioners at each location. I'm trying to figure out how to best handle the multiple practitioner listings, most of which appear to be scraped. I was going to claim, verify and begin managing them. However, they all fall under the same category, "practitioner." What I've gathered from the response in the above Q&A thread is that I need to have a different category for each practitioner to please Google Places, despite the fact that "practitioner" is the best/most accurate category available in Google's pre-set categories. Is my interpretation correct? I'm confused because Google's guidelines say you can create a separate listing for each practitioner, which to me implies that it would be ok to use the practitioner category for each one. But then again, I want to make sure before proceeding. Thanks, Kim
Algorithm Updates | TECHSEO35
#TECHSEO Account Manager
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer, I have been following a number of discussions, articles, posts, etc., trying to find a solution to this problem, but have yet to get anything conclusive, so I am reaching out to the community for help. Before I get into the questions I would like to provide some background: I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ (per day) depending on the site. Back in March one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites (in total); about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google. Now for some questions: Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content? Is it a coincidence that it was an exact 30-day "filter"? Why has only one site recovered?
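For context, the rel="canonical" implementation we settled on is just a link element in the head of every duplicate category/product page, pointing all of the copies at one preferred URL (the URL below is made up):
    <head>
      <!-- Tell Google which copy of this page is the preferred version -->
      <link rel="canonical" href="http://www.example-niche-store.com/widgets/blue-widget/" />
    </head>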
Algorithm Updates | WEB-IRS
-
Did we get hit by Panda? What do we do?
Hello, here's our site: nlpca(dot)com. We had a big drop in rankings, going from about 19th to about 43rd for our main keyword, and significant drops in other keywords. This happened roughly 6 weeks ago. We thought it was being caused by either (1) placing keywords in titles before we had them in the content, or (2) trying to rank for Utah keywords - we're the NLP Institute of California and we are in both places now, but the site talks mainly about California. We changed both of these things, and we're still at the low rankings. Will we move back up? What do we do? Will a backlink campaign be effective at this point?
Algorithm Updates | BobGW
-
Can an AJAX implementation affect rankings in Google Panda?
Hi there, I have the following situation with one of our job sites. We migrated the site to a new application, which is better from a design point of view and also for usability. For this we use a lot of AJAX, especially in searches, so every time a user filters down their search, new results are shown on the page, at the same URL and with no page load. But this implementation affected Bounce Rate, which increased from 38% to nearly 60%; PI/visit, which is now half, at 3; and also Avg. Time on Site, which is half what it used to be, coming down to 2.5 min from nearly 6 min. From Rand's post, it is clear that content is very important in Google Panda and that we should consider all of these parameters, as they indicate the quality of the content. So my question is: can this site be hit by Panda updates (maybe later on) because Bounce Rate, PI/visit and Avg. Time on Site decreased in such a way? At the moment we don't measure the AJAX impressions, but as I understand we can do that through virtual pages in GA - does anyone have experience with how to handle this? Won't this be an artificial increase? Thanks, Irina
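What we're considering is firing a virtual pageview each time the AJAX search refines the results - roughly this sketch, assuming the classic asynchronous ga.js tracker is already on the page (the virtual path below is made up):
    // After the new search results have been rendered via AJAX,
    // record a virtual pageview so the refinement shows up in GA
    _gaq.push(['_trackPageview', '/virtual/jobs/search/london/marketing']);
My understanding is that each of these hits counts as a pageview, so PI/visit would go up and Bounce Rate would go down compared to tracking real page loads only - which is exactly the "artificial increase" I'm worried about.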
Algorithm Updates | InformMedia
-
Yet another Panda question
Hi guys, I'm just looking for confirmation on something... In the wake of Panda 2.2, one of my pages has plummeted in the rankings whilst other similar pages have seen healthy improvements. Am I correct in thinking that Panda affects individual pages and doesn't tar an entire site with the same brush? Really I'm trying to see if Panda is the reason for the drop on one page or whether it could be something else. The page in question has dropped 130 positions - not just a general fluctuation. Thanks in advance for your responses!
Algorithm Updates | A_Q