Disavow Tool
-
2 Questions any help appreciated:
I have had a spam url in my Disavow file with Google since last September but it still shows up as linked to my site, is this correct?
If a url has, say, 100 pages all with your anchor text and it is a spam website, do you disavow the domain url or do you have to enter all the pages in the disavow file?
-
For the sake of this argument, I have a website where some 120-150 spammy links were created. Basically I see a ton of low quality bookmarking sites that are more or less scraping content from each other. Very few of the anchor texts are keyword phrases, and those are taken from authority sites in the niche as well; the rest (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool even if nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of opposing opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
-
Sorry to sound thick but on my spreadsheet it will look like this which is an actual spam link on my site:
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
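To spell that out: the disavow file itself is just a plain UTF-8 .txt file you upload to Google, with one entry per line; lines starting with "#" are ignored as comments, which is handy for keeping notes on why you disavowed something. The entries above would look like this in the file:

```text
# Disavow one specific page
http://www.spamsite.com/spam.html
# Disavow every link from an entire domain
domain:seomoz.org
```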
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full url address of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop it from being reported as linking to your site. For that to stop, you would need to get the link itself removed.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
Hope this helps.
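If you have a big list of spammy backlink URLs (e.g. exported from Webmaster Tools), you can collapse them into unique domain: entries with a short script rather than editing by hand. A minimal sketch in Python; the function name and sample URLs here are just illustrations, not part of any official tool:

```python
from urllib.parse import urlparse

def build_disavow_lines(spam_urls):
    """Collapse a list of spammy backlink URLs into unique domain: entries."""
    domains = set()
    for url in spam_urls:
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so the whole root domain is disavowed.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    # One entry per line, as the disavow file format requires.
    return ["domain:" + d for d in sorted(domains)]

urls = [
    "http://www.geeky.com/spam.html",
    "http://www.geeky.com/more-spam.html",
    "http://spamsite.com/links/1",
]
print("\n".join(build_disavow_lines(urls)))
# → domain:geeky.com
#   domain:spamsite.com
```

Write the result to a .txt file and upload it via the disavow tool; duplicate pages on the same domain collapse into a single domain: line.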