Multi-Domain SSL Certs re: HTTPS Migration
-
Hi
How important is it that, when migrating sites to HTTPS, each site has its own SSL certificate, as opposed to choosing one of the much cheaper multi-domain certificate options, such as: https://www.namecheap.com/security/ssl-certificates/comodo/ev-multi-domain.aspx
I've been told we really should have one certificate per domain, and that people are generally unsure about multi-domain certificates?
All Best
Dan
-
Cheers, Alan!
-
I can't see how it could affect rankings, and I've never heard of such a thing.
-
That's great info, thanks Highland!
-
There's no difference, really. Multi-domain certificates have been around for years (UCC, which is essentially what this is, isn't new at all), and I've never heard of any problems with them. If anything, a multi-domain EV like this is better in the long run, because EV is much harder to obtain (you have to be a registered corporation to get one). I wasn't aware that anyone was allowed to issue what is basically an EV UCC certificate, but I suppose it was inevitable (I'm surprised they don't offer an EV wildcard yet).
For Google's purposes, they want a valid, signed certificate. They don't care whether it's a single-domain cert or a multi-domain EV like this. All certificates are issued and validated the same way.
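To illustrate why clients (and crawlers) treat the two the same: a multi-domain (SAN/UCC) certificate simply lists every hostname it covers in its Subject Alternative Name field, and validation passes if the requested hostname matches any entry. The sketch below is a minimal, illustrative model of that matching logic, not any CA's or browser's actual code; the hostnames are made up for the example.

```python
def hostname_matches(hostname: str, pattern: str) -> bool:
    """Match a hostname against a single SAN entry.

    Supports a wildcard only as the entire leftmost label
    (the common rule, per RFC 6125).
    """
    host_labels = hostname.lower().split(".")
    pat_labels = pattern.lower().split(".")
    if len(host_labels) != len(pat_labels):
        return False
    if pat_labels[0] == "*":
        # A wildcard covers exactly one leftmost label.
        return host_labels[1:] == pat_labels[1:]
    return host_labels == pat_labels


def covered_by_cert(hostname: str, san_entries: list[str]) -> bool:
    """True if any SAN entry on the certificate covers the hostname."""
    return any(hostname_matches(hostname, entry) for entry in san_entries)


# Hypothetical multi-domain cert covering several unrelated sites:
san = ["example.com", "www.example.com", "shop.example.org", "*.example.net"]
```

With that SAN list, `covered_by_cert("www.example.com", san)` and `covered_by_cert("m.example.net", san)` both succeed, exactly as they would if each site had its own single-domain certificate, which is the point of the answer above.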