Removing Domains From Disavow File
-
We may have accidentally included the wrong domains in our disavow file, and have since removed most domains, leaving only the most highly rated spammy links (using Moz's new Spam Score) in the file.
How long can it take for Google to recognise this change?
Thanks,
Mike
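For context, Google's documented disavow file format is plain text, one entry per line: `domain:` entries disavow a whole domain, bare URLs disavow individual pages, and lines starting with `#` are comments. A minimal file along the lines described above (domain names are placeholders) looks like:

```text
# Disavow file - only high Spam Score domains retained
domain:spammy-hotlinker.example
domain:another-spam-site.example
# Single-page disavow
http://mostly-ok-site.example/spammy-page.html
```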
-
Great! Thank you for your help.
-
Hi Mike,
I recommend you read this Spam Score guide from Moz:
https://mza.bundledseo.com/help/guides/link-explorer/spam-score
Start reading at this part: "Another site's Spam Score - Again, this doesn't mean that these sites are spammy. This percentage represents a wide variety of potential signals ranging from content concerns to low authority metrics. Since this is just based on correlation with penalization, rather than causation, the solution isn't necessarily to disregard sites or disavow links with higher Spam Scores. Instead, we'd recommend using it as a guide for kick starting investigations. Be sure to check out a site's content and its relevance in linking back to you before disregarding or disavowing."
I personally never use the Disavow Links tool. I manually remove links, or simply build new ones, to reduce the percentage of "spammy links" or the percentage of links that share the same anchor text.
But if I had to name a Spam Score at which I would use the Disavow Links tool, it would probably be somewhere above 60-80%, depending on how spammy the website looks to me personally: above 60% if the site looks very spammy, above 80% if it doesn't.
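A threshold-based workflow like this can be sketched in a few lines of Python. This is a minimal sketch, assuming you have exported your link data to a CSV with `URL` and `Spam Score` columns (those column names are assumptions, not Moz's actual export schema - adjust to match your file):

```python
import csv
from urllib.parse import urlparse

def build_disavow(rows, threshold=80):
    """Collect deduplicated 'domain:' disavow lines for links whose
    Spam Score meets or exceeds the threshold."""
    domains = set()
    for row in rows:
        if int(row["Spam Score"]) >= threshold:
            domains.add(urlparse(row["URL"]).netloc)
    return [f"domain:{d}" for d in sorted(domains)]

def write_disavow_file(csv_path, out_path, threshold=80):
    """Read a link export CSV and write a disavow file of high-score domains."""
    with open(csv_path, newline="") as f:
        lines = build_disavow(csv.DictReader(f), threshold)
    with open(out_path, "w") as f:
        f.write(f"# Domains with Spam Score >= {threshold}\n")
        f.write("\n".join(lines) + "\n")
```

As the Moz guide quoted above stresses, a score cutoff should only shortlist candidates for manual review, not be disavowed blindly.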
Hope that helps
-
Hi Pau Pl
Thank you for the response. How often do you advise using the disavow file? For example, we use the new Moz tool that provides a spam rating from 1 to 100%, and we tend to disavow links from sites that are higher than 80% with active links (99% of these are from hotlinking image sites).
Thanks,
Mike
-
Hi mlb7,
Matt Cutts explained this around 2015:
When you are disavowing links, you can know that a link in your disavow file is considered disavowed once you see that Google has cached the page where the link resides. But when it comes to reavowing, we have no way of knowing when Google is going to start counting that link again or whether it will be given the same weight.
Reavowing a link can “take a lot longer than disavowing it,” though no one knows how long that is. Google wants to be really certain that spammers are not going to try to figure out which links are helping or hurting them by doing disavow and reavow experiments.
I recommend you take a look at this video from Matt Cutts: https://www.youtube.com/watch?time_continue=1&v=393nmCYFRtA
Sources:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
https://ahrefs.com/blog/google-disavow-links/
Hope that helps, good luck!