Disavow questions
-
Pretty sure I know the answers to these, but someone asked me to make absolutely sure, so here goes; any opinions welcome:
-
If I disavow a whole domain, does it also include all sub-domains on that domain? My answer is clearly yes.
-
If I have a network of really bad links pointing to my website that are already nofollowed, but come from awful websites to be linked from, is it worth putting them in the disavow list anyway, to basically tell Google there is no association at all? I know the whole point of disavow is to essentially nofollow the link.
Opinions much appreciated, thank you guys.
-
-
Great questions. I'll give my 2 cents based on what we've witnessed at Penalty Pros:
1 - Yes, this will do the trick. Just make sure that you are referencing the non-www version in the disavow. For example "domain:site.com" and NOT "domain:www.site.com". If you want to be super safe, just include the exact subdomains as separate line items.
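For reference, a disavow file is just plain text with one entry per line, and `#` lines are comments. A minimal sketch of the two approaches described above (`blog.site.com` is a hypothetical subdomain used for illustration):

```text
# Disavow the whole domain (covers www and other subdomains):
domain:site.com

# Or, to be extra safe, list the exact subdomains as separate lines:
domain:www.site.com
domain:blog.site.com
```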
2 - Google's official word is that you don't need to worry about nofollows, but we've encountered a few situations where nofollow links were pointed out as problematic in failed reconsideration requests. This may be human error on the part of the manual team, but it's probably worth disavowing them regardless.
Hope this helps and good luck!
-
In most cases, if you disavow the root, the subdomains will be disavowed as well. But this may not work for certain large hosts: if you're trying to disavow links from wordpress.com or blogspot.com, you really should disavow the subdomains individually.
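To illustrate the large-host case, a hedged sketch (these subdomains are hypothetical examples, not real spam sources): disavowing each offending blog's subdomain avoids blocking the entire shared host.

```text
# Disavow specific subdomains on large shared hosts,
# rather than the whole root domain:
domain:badblog1.blogspot.com
domain:badblog2.blogspot.com
domain:spamsite.wordpress.com
```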
Regarding nofollow links, I am 99% sure that you can ignore them. John Mueller said that you don't need to include nofollowed links in a disavow: http://goo.gl/EhpI5O
The reason I say 99% sure and not 100% sure is because a colleague of mine was recently given a nofollowed link as an example link on a failed reconsideration request. The site is not available on archive.org, so I can't go back and check but my suspicion is that the site may have recently changed their links to nofollow. I have done MANY disavows and have had many successful reconsideration requests and I do not disavow nofollowed links.
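When a disavow involves many URLs like the ones discussed above, it can help to collapse them into unique `domain:` entries first. A minimal sketch, assuming you already have the list of bad link URLs; the helper name and example domains are hypothetical, but the `domain:` lines it emits follow the documented disavow file syntax:

```python
# Sketch: collapse a list of bad link URLs into unique "domain:" entries
# for a disavow file. Hypothetical helper, not part of any Google tooling.
from urllib.parse import urlparse

def to_disavow_entries(bad_urls):
    """Return sorted, de-duplicated 'domain:' lines for the given URLs."""
    domains = set()
    for url in bad_urls:
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so the entry covers the whole domain,
        # not just the www subdomain.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(f"domain:{host}")
    return sorted(domains)

entries = to_disavow_entries([
    "http://www.spammy-directory.example/page1",
    "http://spammy-directory.example/page2",
    "https://link-farm.example/widget",
])
print("\n".join(entries))
```

Note that this covers the common case; for multi-part public suffixes (e.g. `.co.uk`) you would need a proper public-suffix list rather than simple string handling.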
-
I totally agree with Andy here. PageRank is not the only metric - not by a long shot. Any association can be negative or positive whether PageRank is passing or not. I would disavow just to get out of the bad neighborhood.
-
I wouldn't take any chances with poor links, even if they are nofollowed. Even though a nofollowed link passes no PageRank, it is still a signal that sends a message back to Google.
I never take a chance and disavow these as a matter of course. You never fully know just how much Google is using them.
-Andy
-
According to Dr. Pete: "My understanding is that you can block the root domain that way, yes, but Google seemed to qualify that sub-domains were at their discretion. Unfortunately, we don't have much data yet. If you know that every link from the domain is bad, then I'd use the "domain:example.com" format."
Second question: Can nofollow links hurt my site? Matt Cutts said: "Typically, no unless it is abused and manual action applied."