My Website Has a Google Penalty, But I Can't Disavow Links
-
I have a client who has definitely been penalized: rankings dropped for all keywords, and hundreds of malicious backlinks showed up when I checked with WebMeUp. However, when I run the backlink profile through Moz or any other tool, those links don't appear anywhere, and they're all dead when I click the actual URLs. That being said, I can't disavow links that don't exist, and they don't show up in Webmaster Tools either, but I KNOW this site has been penalized. Also, I noticed this today (attached). Any suggestions? I've never come across this issue before.
-
At this point you can only wait to see the results of the reconsideration request and hope they provide you examples if more clean up is needed.
-
I used software called WebMeUp. No other tool shows the bad backlinks because they are all 404s now and no longer link to the website. I have already submitted a reconsideration request.
-
Have you tried Ahrefs, Majestic SEO, or CognitiveSEO to see your backlinks, or just Moz?
If you think you have been hit, try using https://www.google.com/webmasters/tools/reconsideration and see if it shows up.
-
Hi Karen,
Breaking away from the hypothesis about the penalty here for a second, but I am wondering if the HTTPS / HTTP issue is part of the problem here.
Would you be able to PM me the website in question if you are not comfortable sharing it publicly here? I am not a Moz staff member (I used to be, 2006 - 09) but I help out with Q&A as an associate.
Cheers,
Jane
-
I have all the bad links; they are all 404s now.
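If you've exported that link list from WebMeUp, here's a quick sketch (standard library only) to double-check that each linking page really does return a 404. The file name and export format are hypothetical; adjust to whatever your tool gives you:

```python
import urllib.request
import urllib.error

DEAD_STATUSES = {404, 410}  # not found, or permanently gone

def is_dead(status: int) -> bool:
    """Treat 404/410 responses as dead backlinks."""
    return status in DEAD_STATUSES

def check_url(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL via a HEAD request."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the code is still on the exception
        return e.code

def report(urls):
    """Print the status of each backlink URL from the exported list."""
    for url in urls:
        status = check_url(url)
        print(f"{status}  {'DEAD' if is_dead(status) else 'alive'}  {url}")

# Usage (hypothetical export file, one URL per line):
# report(line.strip() for line in open("backlinks.txt") if line.strip())
```

If everything prints DEAD, that backs up your case in the reconsideration request.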
-
Have you looked into the Link Detox tool? It supposedly helps you find the bad links that no one else monitors.
-
Thanks, Travis, but there are no crawl issues.
-
Have you attempted to crawl the site? If not, I would give it a little Screaming Frog. If I knew the URL, I would have done it already. : )
Sometimes crawl problems look and feel like penalties. That upward trend in pages blocked by robots.txt before the fall is making me a little suspicious. Plus, if it is a problem with crawlability - that's easier to fix than a penalty.
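On the robots.txt angle: a small sketch that checks which of your important URLs are currently blocked, using Python's standard library parser. The sample rules and paths are placeholders; paste in your live robots.txt and your key landing pages:

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt: str, paths, agent: str = "Googlebot"):
    """Return the subset of paths that robots.txt disallows for the agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch(agent, p)]

# Placeholder robots.txt -- replace with the contents of yours
robots = """\
User-agent: *
Disallow: /private/
"""

print(blocked_paths(robots, ["/private/page", "/products", "/"]))
# -> ['/private/page']
```

If pages you expect to rank show up in that list, you've found a crawlability problem rather than a penalty.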
-
Can you pinpoint the approximate date that the traffic dropped significantly? Look for a Penguin update that's near that time frame using this page: http://moz.com/google-algorithm-change
If you don't see a Penguin update near your drop in traffic, AND you don't have a manual penalty, you might have another issue. Possibly a Panda penalty or just some other site health issue that caused the drop in traffic. Considering the large amount of spammy backlinks you say existed, it does sound like a Penguin penalty is likely, but it can't hurt to check that Google Algorithm Change history.
If the spammy backlinks are ALL gone now, then you're right... there's nothing you can do in regards to disavowing or manual removal. If you do have a manual penalty that was given due to unnatural inbound links, you can submit a reconsideration request and let them know that you didn't build the backlinks and they all disappeared. Let Google know you plan to continue to monitor your backlink profile and take immediate action against future negative backlinks that are found.
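For reference, if any of those spammy links ever resurface, the disavow file is just a plain-text list you upload through the Disavow Links tool. A minimal sketch (the domains and URL are placeholders):

```text
# Links we did not build and could not get removed
# Disavow an entire spammy domain:
domain:spammy-directory.example

# Disavow a single URL:
http://bad-forum.example/thread/123
```

Lines starting with # are comments, `domain:` drops every link from that domain, and a bare URL disavows just that page.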
-
Alright...weird...the search results show some https but when I click on them I get the "SSL Connection Error"
-
There's a chance I'm wrong about this. Maybe EVERY site shows that "update" line, regardless of whether or not an https version was found by Google....
Try a "site:" search on Google for both variations.
- site:https://example.com
- site:http://example.com
edit: "if you have a website on HTTPS, or if some content is indexed under different subdomains." You'll see the "update" line if you've got multiple sub-domains as well. So that's likely what's happening in your scenario.
-
I don't have that version of the site...If I try to go to https://website I get an "SSL connection error"
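To take some of the guesswork out of that SSL error, here's a quick sketch that tests whether a TLS handshake succeeds for a hostname at all (the hostname below is a placeholder for your domain):

```python
import socket
import ssl

def https_reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection plus TLS handshake succeeds for host."""
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                # Handshake succeeded and the certificate validated
                return tls.version() is not None
    except OSError:
        # Includes ssl.SSLError, DNS failures, refused connections, timeouts
        return False

print(https_reachable("example.com"))  # your domain here
```

If this returns False while Google is indexing https URLs, that would explain the "SSL Connection Error" you're seeing when you click those results.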
-
That "update" line you posted a screen shot of means you have an https version of your website. Are you sure you've got the right version verified in Webmaster Tools? If you've verified http but not https, or vice versa, verify the other one. You might be able to see the backlinks in Webmaster Tools on the other version.
You can read more about this recent Google Webmaster Tools update here: http://searchenginewatch.com/article/2337524/Google-Webmaster-Tools-Gives-More-Precise-Index-Status-Data
Update us with what you find once you look into that a bit!