Disavow Tool
-
Two questions; any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all using your anchor text, do you disavow the domain itself, or do you have to enter every one of its pages in the disavow file?
-
For the sake of argument: I have a website where some 120-150 spammy links were created. Basically I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text, and those are taken from authority sites in the niche as well; the others (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool if nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of opposing opinions on this. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
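Putting the answer together: a disavow file is a plain UTF-8 .txt file with one entry per line, and lines starting with # are treated as comments. A minimal sketch, using hypothetical example entries in the spirit of this thread:

```text
# Individual pages to disavow, one URL per line
http://www.spamsite.com/spam.html

# Entire domains to disavow, using the domain: prefix
domain:geeky.com
```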
-
Sorry to sound thick, but on my spreadsheet it will look like this (this is an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop the link from being reported as pointing to your site. For that to stop, you would need to physically break the link.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
Hope this helps.
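The advice above (one entry per line; use the domain: command for sites with many links) can be sketched as a small Python helper. This is a hypothetical script, not a Google tool, and the 10-URL threshold for collapsing a site into a domain: rule is an arbitrary assumption:

```python
# Sketch: build disavow-file lines from a list of spam backlink URLs,
# collapsing any domain with many offending pages into a single
# "domain:" rule. Threshold and example URLs are hypothetical.
from urllib.parse import urlparse
from collections import defaultdict

def build_disavow(spam_urls, domain_threshold=10):
    """Return disavow-file lines: one URL or one domain: rule per line."""
    by_domain = defaultdict(list)
    for url in spam_urls:
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so the domain: rule covers all subpages.
        if host.startswith("www."):
            host = host[4:]
        by_domain[host].append(url)

    lines = []
    for host, urls in sorted(by_domain.items()):
        if len(urls) >= domain_threshold:
            # Too many pages to list individually: disavow the whole site.
            lines.append(f"domain:{host}")
        else:
            # Only a few offending pages: disavow them one per line.
            lines.extend(sorted(urls))
    return lines
```

Writing `"\n".join(build_disavow(urls))` to a UTF-8 .txt file gives something ready to upload to the disavow tool, though you should always eyeball the output before submitting it.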