Experience with the Google Disavow Tool and discovering bad backlinks
-
Hi Community,
Does anyone have experience with Google's disavow tool that they can share? Any reviews?
Has it helped recover sites hit by Penguin, or penalized after an Unnatural Links message in Webmaster Tools?
Which tools and methods do you use to find bad backlinks to submit to the disavow tool?
Thanks for your feedback,
-
We have thousands of links coming from www.wn.com. I have no idea why they link to us, and our GA stats don't show any traffic from wn.com.
Could I disallow www.wn.com in our robots.txt file?
-
Yes, of course, but do you think that is enough? To clarify how the disavow tool is used: the list of links that could not be removed is submitted to disavow, so Google can treat them as nofollow. As a supplement, you can add a list of URLs where the links were already removed.
In any case, couldn't Google's webspam team measure this anyway, without this kind of protocol?
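For anyone unfamiliar with the file being discussed: a disavow file is plain UTF-8 text with one entry per line. Lines starting with `#` are comments, `domain:` entries disavow every link from an entire domain, and bare URLs disavow individual pages. A minimal example (the domains here are hypothetical):

```text
# Sites we contacted but could not get links removed from
domain:spamdirectory.example
# Individual pages that still link to us
http://blog.example.net/paid-links-page.html
```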
-
Not only do you have to file a reconsideration request, but you'll also need to show good evidence that you have tried to get as many links as possible removed manually.
-
Thanks for your reply. When you say, "You will likely not be able to remove a manual unnatural links warning by just using the disavow file on its own," do you mean that after using the disavow tool, you still have to file a reconsideration request?
-
I use the disavow tool a lot, but only for sites that have received a manual unnatural links penalty. I do as much work as I can to manually get links removed and for the unnatural links that I can't get removed then I include them in the disavow.txt file. I've had great success getting penalties removed this way.
You will likely not be able to remove a manual unnatural links warning by just using the disavow file on its own.
For Penguin, most SEOs believe that you cannot recover from Penguin until Google refreshes the Penguin algorithm. As the algorithm has not refreshed since the disavow tool came out, no one can say whether or not the disavow tool is helpful. The documentation that was released with the disavow tool hints that it may be helpful in Penguin cases but it's kind of vague.
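The workflow above ends with collecting the links you could not get removed into a disavow.txt file. A minimal sketch of that last step (the function name and input list are assumptions for illustration, not any Moz or Google tooling): collapse the leftover link URLs into unique `domain:` entries.

```python
from urllib.parse import urlparse

def disavow_lines(unremoved_urls):
    """Collapse link URLs we could not get removed into unique
    domain: entries for a Google disavow.txt file."""
    domains = set()
    for url in unremoved_urls:
        host = urlparse(url).netloc.lower()
        # treat www.example.com and example.com as the same site
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return [f"domain:{d}" for d in sorted(domains)]

links = [
    "http://www.wn.com/some/page",
    "http://wn.com/another/page",
    "http://spamdirectory.example/listing?id=7",
]
print("\n".join(disavow_lines(links)))
# prints:
# domain:spamdirectory.example
# domain:wn.com
```

Disavowing at the domain level is usually safer than listing thousands of individual URLs, since sitewide links (like the wn.com case above) can appear on pages you never crawled.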