Disavow File and SSL Conversion Question
-
Moz Community,
We have a website that we are moving to SSL. It has been four years since we submitted our disavow file to Google via GWT, and since Google treats the HTTPS version as a new site, we decided to go back through our backlinks. We found that many of the domains we are currently disavowing are no longer active (after four years this is expected).
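(For anyone doing a similar audit, a check like this can be scripted. Here is a rough Python sketch, with a placeholder filename, that assumes Google's standard disavow format of one "domain:" entry or full URL per line with "#" comments. It is an illustration, not a description of our exact process; note it treats any HTTP response, even an error status, as "live", which errs on the side of keeping entries.)

```python
# Rough sketch: flag disavowed domains that no longer respond at all.
# "disavow.txt" is a placeholder filename, not our real one.
from urllib.error import HTTPError
from urllib.parse import urlparse
from urllib.request import urlopen

def extract_domain(line):
    """Pull a bare domain out of one disavow-file line, or return None."""
    line = line.strip()
    if not line or line.startswith("#"):
        return None                       # blank line or comment
    if line.startswith("domain:"):
        return line[len("domain:"):]      # whole-domain entry
    return urlparse(line).netloc or None  # full-URL entry

def is_alive(domain, timeout=5):
    """True if the domain answers HTTP at all, even with an error status."""
    try:
        urlopen("http://" + domain, timeout=timeout)
        return True
    except HTTPError:
        return True                       # server responded (404, 500, ...)
    except (OSError, ValueError):
        return False                      # DNS failure, refused, timed out

with open("disavow.txt") as f:
    domains = {d for d in (extract_domain(line) for line in f) if d}

for domain in sorted(domains):
    print("LIVE" if is_alive(domain) else "DEAD", domain)
```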
Given that, is it OK to create a new disavow file under the new GWT profile (the SSL version of our site)? Also, is it OK if the new disavow file doesn't include URLs we previously disavowed under the non-HTTPS version?
We also found that some links in the old disavow file should never have been disavowed in the first place, and that there are new links we want to disavow.
Thanks
QL
-
Hi. I think mememax gave a very good answer.
The only thing I would submit for consideration is that making too many changes at one time can be hard to track later. When we did the switch to HTTPS, I was super paranoid we would screw something up and lose rankings, so I chose to leave the disavow file exactly the same. It turned out the switch was not as bad as I feared, and we saw no noticeable effect on rankings. Later, once I was convinced the HTTPS switch was not a factor, I could modify the disavow file. I also left the old domains from years ago in there, for the reasons mememax points out.
Good Luck!
-
Hi QuickLearner,
You are actually raising a very interesting point. To be extra safe, you should disavow links pointing to the current site as well as links pointing to any other property you own that is 301ing to it.
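(As a side note, the HTTP-to-HTTPS 301 itself is usually just one rewrite rule. This is a generic sketch assuming Apache with mod_rewrite, not your actual setup; other servers have their own equivalents. The important part is that the redirect is a permanent 301.)

```apache
# Redirect every HTTP request to its HTTPS equivalent with a 301.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```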
Remember that the disavow file should include all URLs/domains pointing to your site that you were not able to get removed yourself, even after contacting the webmasters. Based on this:
- In your HTTP property, disavow all the links you marked as spammy that point to the HTTP site.
- Since you're going to make many changes to the disavow file anyway, it's a good moment to reanalyze which links to keep and which to remove. Just make sure you do it carefully.
- The HTTPS property's disavow file should contain all the links from the HTTP file plus the ones pointing to the HTTPS site (again, only the links you actually want to disavow; see the example file sketched after this list).
- Expired sites can technically be removed safely, since they no longer link to your site, but in the past I have always kept them in the file, for two reasons:
  - Sometimes Google's index is not very up to date, especially for tiny, low-quality sites, which these probably are. The site may have disappeared, but if Google hasn't dropped it from the index, it still counts as a link to your site.
  - You never know the real reason a site is returning a 4xx/5xx, so in case it reappears, I would just keep it in the file. It's like an IP blacklist: I don't know whether a given IP is still in use, but I keep it listed just in case.
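To make that concrete, here is a minimal sketch of what the combined disavow file for the HTTPS property could look like. The domains are placeholders, but the format is the one Google accepts: one entry per line, a "domain:" prefix to disavow an entire domain, a full URL to disavow a single page, and "#" for comment lines.

```
# Disavow file for the HTTPS property
# Entries carried over from the old HTTP file:
domain:spammy-directory.example
domain:link-farm.example
# A single spammy page rather than the whole domain:
http://forum.example.net/profile/12345
# New spammy links found in the fresh audit:
domain:new-spam-network.example
```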
I hope this helps you!