Can I disavow links on a 301'd website?
-
So we are performing link removal for a client on his old website (A), which is being 301 redirected to his new website (B). We have identified the toxic links pointing at site A and are removing them; once that's complete, we will undo the current 301, verify a new GWT account for website A, and then submit the disavow report.
We would then like to reapply the 301 redirect to site B while we wait for Google to process the disavow report, the logic being that we can retain some of site B's current rankings while the disavow is processed on site A.
Has anyone had experience with this method? I foresee some potential issues here but am interested to hear from others on this. Thanks!
-
I tend to agree with Federico's concerns. If the 301 transfers a penalty, the impact could be long-term, and it could be harder to rescue site B. The short-term ranking gains may not be worth it.
Google hasn't been clear on how this operates with 301 redirects. John's suggestion to disavow on both sites seems safe. Worst case, it's wasted effort, but it's not much effort (once you've built one file, building two is easy). Still, you've got to wait for that to process, and if the algorithmic penalty is something like Penguin, then you'd have to wait for a data refresh. This could take months, so I'd be really hesitant to risk site B until you've cleaned up the mess.
Once the disavow has been applied to site A, the 301 redirect should be fairly safe, but it does depend on the extent of the penalty. The risk/reward trade-off is definitely a "devil is in the details" sort of situation.
-
Well, you are right, manual penalties are easier to fix, although sites with manual penalties usually fall into an algorithmic penalty too.
Steps I'd suggest:
- Don't reinstate the redirect.
- Do some cleaning, extensive cleaning.
- Use site A just as a redirect for users, not for crawlers or rankings (disallow site A in robots.txt and 302 redirect the domain to site B); a rough sketch of this setup follows below.
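A rough sketch of that setup, assuming an Apache server and placeholder domains (sitea.com and siteb.com are hypothetical):

# robots.txt on site A - keep all crawlers out
User-agent: *
Disallow: /

# .htaccess on site A - temporary (302) redirect sending visitors to site B
RewriteEngine On
RewriteRule ^(.*)$ http://siteb.com/$1 [R=302,L]

One caveat: with crawling fully disallowed, Googlebot won't follow the 302 either, which is the point of this approach: users get passed along, while rankings and any penalty are meant to stay put on site A.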
Hope that helps!
-
No, this is an algorithmic penalty. Wish it was manual, would be easier to figure out.
-
But did you get any MANUAL penalty on A or B?
-
The problem is that, despite the algorithmic penalty, site A appears to be passing heavy authority to site B and keeping decent rankings for some very competitive terms that we otherwise would not rank for with site B. If I remove the 301, I fully expect all current rankings to drop, which I am trying to avoid.
We're doing link removal now, but plan on having to use the disavow tool once we have a few removal requests out to webmasters. I actually got an answer on this from John Mueller at Google in the technical SEO community on G+.
John Mueller
"I would think about the final state you want to be in and just do that. If you want to do a domain move, then 301 and keep them. If you do a domain move + disavow links, then submit the file for both domains. This process will take quite some time (maybe even a year), so you don't want to play with it incrementally: just find out what you want in the end and set that up." -
Hey Chris,
Did site A or B receive a manual penalty?
Any penalty on A, which is 301'd to B, will ultimately pass to B, so I would suggest removing the 301 ASAP. Then clean up the A domain until it's clean (if it's a manual action, until it's revoked), and then you can think about putting the 301 back.
Removing a manual penalty can be a long process; it took us 1 year and 4 reconsideration requests to get the penalty revoked. We had to use the disavow file as a machete and disavowed almost our entire link profile, leaving aside only the domains we knew were good links; everything else was disavowed using the "domain:" operator to avoid missing any links.
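For anyone unfamiliar with the format, the disavow file is just a plain-text list submitted through the disavow tool: one full URL or "domain:" entry per line, with "#" lines as comments. A short sketch using placeholder domains:

# Webmasters contacted twice, links never removed
domain:spammy-directory-example.com
domain:paid-links-example.net
# Disavow a single URL on an otherwise good site
http://www.example-blog.com/spun-article.html

Using "domain:" is the machete approach described above: it covers every link from that domain, so nothing slips through via an individual URL you missed.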
Related Questions
-
Can a duplicate page referencing the original page on another domain in another country using the 'canonical link' still get indexed locally?
Hi, I wonder if anyone could help me with a canonical link query/indexing issue. I have given an overview, intended solution and question below. Any advice on this query will be much appreciated.
Overview: I have a client with a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both the UK and US markets, and to avoid duplicating work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog site structure recommendations.
Suggested solution: As the domain authority (DA) of the .com/.co.uk sites is in the 60+ range, it would be risky moving domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content-curation WP plugin, with the 'canonical link' referencing the original source (US or UK) so as not to create duplicate content issues.
My question: Let's say I'm a potential customer in the UK, searching with a keyword phrase that is answered by content on both the UK and US sites, with the US content as the original source. Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version, even though I have identified the US source using the canonical link?
Intermediate & Advanced SEO | JonRayner
-
Redesigning a website and losing the .html from pages! 301 needed?
I have redesigned a customer's website. I kept all pages with the same names; however, they have gone from domain.com/pagename.html to domain.com/pagename (lost the .html). Will these pages automatically be picked up as the same, or do I need to do a 301 redirect? If I need to do a redirect, is there a faster way, as there are about 250 pages? Thank you
Intermediate & Advanced SEO | AdvimateLtd
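A single pattern rule is the usual shortcut for a case like this, rather than 250 individual redirects. A minimal .htaccess sketch, assuming Apache and that each new URL is simply the old one minus the extension:

RewriteEngine On
# 301 any request for /pagename.html to /pagename with one rule
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
-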
Moving to https with a bunch of redirects my programmer can't handle
Hi Mozzers, I referred a client of mine (last time) to a programmer who could transition their site from http to https. They use a WordPress website and currently use EPS Redirects as a plugin that 301 redirects about 400 pages. Currently, the way EPS Redirects is set up (as shown in the attachment) is simple: on the left side you enter your old URL, and on the right side is the newly 301'd URL. But here's the issue: since my client made the transition to https, the whole WordPress backend is set up that way as well. What this means is that if my client finds another old http URL he wants to redirect, the plugin only allows him to redirect https to https. As of now, all old http-to-https redirects STILL work, even though the left side of the plugin switched all URLs to a default HTTPS, but my client is worried that with the next plugin update he will lose all of the http-to-https redirects. When we asked our programmer to add all 400 redirects to .htaccess, he said that's too many redirects and could slow down the website. Well, we don't want to lose all 400 301s and jeopardize our SEO. Question: what does everyone suggest as an alternative solution/plugin to redirect old http URLs to https, and future https-to-https URLs? Thank you all!
Intermediate & Advanced SEO | Shawn124
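On the 400-redirect worry above: a single blanket rule handles http-to-https for every URL at once, so individual entries are only needed where the path itself changed. A minimal .htaccess sketch, assuming Apache:

RewriteEngine On
# Send any http request to the same path on https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
-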
HTTPS website migration and internal links
Hey Moz! I read Moz's guide on migrating websites from http to https, and it seems changing all relative internal links to absolute https is recommended (we currently use relative internal links). But is doing this absolutely necessary if we will already have a redirect in our .htaccess file forcing all http pages to https? Changing all of our internal links to absolute https will be very time consuming, and I'd like to hear your thoughts as to whether it's absolutely recommended/necessary; and if so, why? Thanks!
Intermediate & Advanced SEO | TheDude
-
301 Externally Linked, But Non-Producing Pages, To Productive Pages Needing Links?
I'm working on a site that has some non-productive pages without much of an upside potential, but that are linked-to externally. The site also has some productive pages, light in external links, in a somewhat related topic. What do you think of 301ing the non-productive pages with links to the productive pages without links in order to give them more external link love? Would it make much of a difference? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
Want to merge high ranking niche websites into a new mega site, but don't want to lose authority from old top level pages
I have a few older websites that SERP well, and I am considering merging some or all of them into a new related website that I will be launching regardless. My old websites display real estate listings and not much else. Each website is devoted to showing homes for sale in a specific neighborhood. The domains are all in the form of Neighborhood1CityHomes.com, Neighborhood2CityHomes.com, etc. These sites SERP well for searches like "Neighborhood1 City homes for sale" and also "Neighborhood1 City real estate", where some or all of the query is in the domain name. Google simply points to the top of the domain, although each site has a few interior pages that are rarely used. There is next to zero backlinking to the old domains, but each links to the other with anchor text like "Neighborhood1 Cityname real estate". That's pretty much the extent of the link profile. The new website will be a more comprehensive search portal where many neighborhoods and cities can be searched. The domain name is a nonsense-word .com not related to actual keywords. The structure will be like newdomain.com/cityname/neighborhood-name/, where the neighborhood real estate listings that replace the old websites will live, and I'd 301 the old sites to the appropriate internal directories of the new site. The content on the old websites is all on the home page of each, at least the content for searches that matter to me and rank well, and I read an article suggesting that Google assigns additional authority to top-level pages (can I link to that here?). I'd be 301-ing each old domain from a top level to a 3rd-level interior page like www.newdomain.com/cityname/neighborhood1/. The new site is better than the old sites by a wide margin, especially on mobile, but I don't want to lose all my top positions for some tough phrases. I'm not running analytics on the old sites in question, but each of the old sites has an extensive past history with AdWords (which I don't run any more), so in theory Google knows these old sites are good quality.
Intermediate & Advanced SEO | Gogogomez
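Mechanically, since each old site's ranking content sits on its home page, the whole old domain can be pointed at the matching interior directory. A minimal .htaccess sketch for one old domain, using the hypothetical names from the question:

# On Neighborhood1CityHomes.com - 301 everything to its new deep directory
RewriteEngine On
RewriteRule ^(.*)$ http://www.newdomain.com/cityname/neighborhood1/ [R=301,L]
-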
Preferred domain can't be set in Webmaster Tools
I have added my domain name as xxxxxtours.com (without www) in Webmaster Tools, and I have redirected to the www version using an .htaccess file. So I want to set the preferred domain to display URLs as www.xxxxtours.com, but when I try, it gives the error shown in the attached image, even though I have verified the site. Waiting for expert help.
Intermediate & Advanced SEO | innofidelity
-
Googlebot Can't Access My Sites After I Repair My Robots File
Hello Mozzers, A colleague and I have been collectively managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools instructing us that 'Googlebot was not able to access our site due to some errors with our robots.txt file'. My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our 'cgi-bin' directory, as follows:
User-agent: *
Disallow: /cgi-bin/
After creating the robots.txt and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process, and everything aesthetically looked correct; however, I continued receiving these messages for each of the other sites I manage on a daily basis for roughly a 10-day period. Do any of you know why I may be receiving this error? Is it not possible for me to block Googlebot from crawling the 'cgi-bin'? Any and all advice/insight is very much welcome; I hope I'm being descriptive enough!
Intermediate & Advanced SEO | NiallSmith