My Website Has a Google Penalty, But I Can't Disavow Links
-
I have a client whose site has definitely been penalized: rankings dropped for all keywords, and WebMeUp shows hundreds of malicious backlinks. However, when I run the backlink profile through Moz or any other tool, those links don't appear anywhere, and they are all dead when I click the actual URLs. As a result, I can't disavow links that don't exist, and they don't show up in Webmaster Tools, but I KNOW this site has been penalized. Also, I noticed this today (attached). Any suggestions? I've never come across this issue before.
-
At this point you can only wait to see the results of the reconsideration request and hope they provide you examples if more clean up is needed.
-
I used a tool called WebMeUp. No other software shows the bad backlinks because they are all 404s now and no longer link to the website. I have already submitted a reconsideration request.
-
Have you tried Ahrefs, Majestic SEO, or CognitiveSEO to see your backlinks, or just Moz?
If you think you have been hit, try using this https://www.google.com/webmasters/tools/reconsideration and see if it shows up.
-
Hi Karen,
Breaking away from the penalty hypothesis for a second: I am wondering if the HTTPS / HTTP issue is part of the problem here.
Would you be able to PM me the website in question if you are not comfortable sharing it publicly here? I am not a Moz staff member (I used to be, 2006 - 09) but I help out with Q&A as an associate.
Cheers,
Jane
-
I have all the bad links; they are all 404s now.
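If it helps, rather than clicking each URL by hand, you could re-check the whole list in bulk. Here's a rough Python sketch; the injectable `status_fn` is just there so the sorting logic can be tested without network access, and the HEAD-request helper is a generic approach, not any particular tool's method:

```python
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        return urllib.request.urlopen(req, timeout=timeout).status
    except urllib.error.HTTPError as e:
        return e.code  # 404s land here
    except (urllib.error.URLError, OSError):
        return None

def partition_links(urls, status_fn=fetch_status):
    """Split URLs into (still_live, dead) based on their status code."""
    live, dead = [], []
    for url in urls:
        code = status_fn(url)
        (live if code is not None and code < 400 else dead).append(url)
    return live, dead
```

Anything that ends up in the `live` bucket is still worth a disavow entry; the `dead` ones are the 404s you're describing.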
-
Have you looked into the Link Detox tool? It supposedly helps you find the bad links that no one else monitors.
-
Thanks, Travis, but there are no crawl issues.
-
Have you attempted to crawl the site? If not, I would give it a little Screaming Frog. If I knew the URL, I would have done it already. : )
Sometimes crawl problems look and feel like penalties. That upward trend in pages blocked by robots.txt before the fall is making me a little suspicious. Plus, if it is a problem with crawlability - that's easier to fix than a penalty.
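On that robots.txt suspicion: a quick way to sanity-check which paths are blocked, before firing up a full crawler, is the standard-library parser. A minimal sketch (`robots_txt` is the raw file contents, fetched however you like; the path list is hypothetical):

```python
from urllib import robotparser

def blocked_paths(robots_txt, paths, user_agent="Googlebot"):
    """Given the raw text of a robots.txt file, return the paths it disallows."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [p for p in paths if not rp.can_fetch(user_agent, p)]
```

If important landing pages show up in that output, you're looking at a crawlability problem rather than a penalty.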
-
Can you pinpoint the approximate date that the traffic dropped significantly? Look for a Penguin update that's near that time frame using this page: http://moz.com/google-algorithm-change
If you don't see a Penguin update near your drop in traffic, AND you don't have a manual penalty, you might have another issue. Possibly a Panda penalty or just some other site health issue that caused the drop in traffic. Considering the large amount of spammy backlinks you say existed, it does sound like a Penguin penalty is likely, but it can't hurt to check that Google Algorithm Change history.
If the spammy backlinks are ALL gone now, then you're right... there's nothing you can do in regards to disavowing or manual removal. If you do have a manual penalty that was given due to unnatural inbound links, you can submit a reconsideration request and let them know that you didn't build the backlinks and they all disappeared. Let Google know you plan to continue to monitor your backlink profile and take immediate action against future negative backlinks that are found.
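For whenever those links do resurface, the disavow file format itself is simple: `#` comment lines, `domain:` lines, or full URLs, one per line. A minimal generator sketch (defaulting to whole-domain disavows is my own choice here, not a Google requirement; adjust to taste):

```python
from urllib.parse import urlparse

def build_disavow(urls, disavow_whole_domains=True):
    """Produce the text of a disavow file from a list of spammy backlink URLs."""
    lines = ["# Spammy links the site owner did not build."]
    if disavow_whole_domains:
        domains = sorted({urlparse(u).netloc for u in urls})
        lines += [f"domain:{d}" for d in domains]
    else:
        lines += sorted(set(urls))
    return "\n".join(lines) + "\n"
```

Save the output as a `.txt` file and upload it through the Disavow Links tool in Webmaster Tools.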
-
Alright...weird...the search results show some https but when I click on them I get the "SSL Connection Error"
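One way to verify that SSL error outside the browser is a quick handshake check from Python; `host` is whatever domain you're testing, and this is just a diagnostic sketch, not a fix:

```python
import socket
import ssl

def https_reachable(host, timeout=5):
    """Return True if an HTTPS handshake with a valid certificate succeeds."""
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, 443), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False
```

If this returns False while `http://` loads fine, the server simply has no working certificate for the https version Google indexed.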
-
There's a chance I'm wrong about this. Maybe EVERY site shows that "update" line, regardless of whether or not an https version was found by Google....
Try a "site:" search on Google for both variations.
- site:https://example.com
- site:http://example.com
edit: "if you have a website on HTTPS, or if some content is indexed under different subdomains." You'll see the "update" line if you've got multiple sub-domains as well. So that's likely what's happening in your scenario.
-
I don't have that version of the site...If I try to go to https://website I get an "SSL connection error"
-
That "update" line you posted a screen shot of means you have an https version of your website. Are you sure you've got the right version verified in Webmaster Tools? If you've verified http but not https, or vice versa, verify the other one. You might be able to see the backlinks in Webmaster Tools on the other version.
You can read more about this recent Google Webmaster Tools update here: http://searchenginewatch.com/article/2337524/Google-Webmaster-Tools-Gives-More-Precise-Index-Status-Data
Update us with what you find once you look into that a bit!