Will using https across our entire site hurt our external backlinks?
-
Our site is secured throughout, so it loads sitewide as https. It's canonicalized properly: any attempt to load an existing page over http forces a redirect to https. My concern is with backlinks. We've put a lot of effort into social media, so we're getting some nice blog linkage. The problem is that the links generally point to http rather than https (understandable, since that's the default for most web users). The site still loads with no problem, but since a redirect doesn't transfer all of the link juice, my concern is that we're leaking some perfectly good link credit. From a backlink standpoint, are we harming ourselves by making the whole site secure by default? The site presently isn't very big, but I'm looking at adding hundreds of new pages, so if we're going to make a change, now is the time to do it. Let me know what you think!
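For reference, the usual way to force https sitewide is a permanent (301) redirect, which passes most link equity. A minimal sketch, assuming an Apache server with mod_rewrite enabled (adjust for your actual server):

```apache
# Redirect every plain-HTTP request to its HTTPS equivalent with a 301.
# Assumes mod_rewrite is enabled; place in the vhost config or .htaccess.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The `R=301` flag is what signals a permanent move to search engines; a temporary (302) redirect would not consolidate link signals the same way.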
-
We run one site entirely over https and there is no problem at all. We link build as usual and see no negative impact; in fact, we're doing very well.
It's not common practice, but as long as you're playing by the rules, it will have no SEO impact whatsoever.
-
Yes -- I actually just finished reverting from HTTPS back to HTTP because of the handshake overhead. Think about this:
- How many images does the page have? All of your images need to be served over SSL.
- How many styles and external style sheets? All of your style sheets need to be served over SSL.
- Do all of the sites you link to have SSL as well? I found that linking out can sometimes red-flag the page as containing elements that are not secure.
It's a lot of work and a lot of maintenance, and at the end the visitor gets frustrated and leaves. Even if you're at Rackspace with a dedicated SSL proxy server, load balancers, and auto-scaling, the client's browser still needs to negotiate the SSL connection for all of the images and scripts on your page.
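The asset audit described above can be partially automated. Here is a minimal sketch using only the Python standard library that scans an HTML document for `http://` subresource references (images, scripts, stylesheets, iframes) of the kind that trigger mixed-content warnings on an https page; the tag list and sample HTML are illustrative assumptions, not an exhaustive check:

```python
from html.parser import HTMLParser


class MixedContentScanner(HTMLParser):
    """Collects http:// URLs in attributes that load subresources."""

    # Tag -> attribute that references a subresource (illustrative subset).
    ASSET_ATTRS = {"img": "src", "script": "src", "link": "href", "iframe": "src"}

    def __init__(self):
        super().__init__()
        self.insecure = []  # http:// subresource URLs found so far

    def handle_starttag(self, tag, attrs):
        attr_name = self.ASSET_ATTRS.get(tag)
        if attr_name is None:
            return  # e.g. <a> links: browsed to, not loaded, so no mixed-content warning
        for name, value in attrs:
            if name == attr_name and value and value.startswith("http://"):
                self.insecure.append(value)


def find_insecure_assets(html):
    """Return the list of http:// subresource URLs in an HTML string."""
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure


if __name__ == "__main__":
    page = """
    <html><head>
      <link rel="stylesheet" href="http://cdn.example.com/site.css">
      <script src="https://cdn.example.com/app.js"></script>
    </head><body>
      <img src="http://cdn.example.com/logo.png">
      <a href="http://other-site.example.com/">outbound link (not flagged)</a>
    </body></html>
    """
    for url in find_insecure_assets(page):
        print(url)
```

Note that the scanner deliberately ignores `<a href>`: plain outbound links to http sites do not cause mixed-content warnings, only embedded resources do.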
-
Your backlinks will suffer somewhat. You need to 301-redirect each of the http pages to its https equivalent. That being said, 301s do not pass 100% of the link juice, and many people will continue to link to the http pages.
Do you really need every page to be https? Why not have just the key data-exchange pages as https and the rest as http?
-
I would seriously consider the possibility of making only as much of your site https as is really necessary.
That said, the portion of your link juice being lost due to the redirects is probably relatively insignificant. But if you could keep half the site as http, that would cut your leakage in half.
-
There's very rarely any reason to force SSL for an entire site. Any content that you're trying to SEO obviously has no need to be encrypted.
SSL puts a huge overhead on page load time.
-
We have the same issue. Our site is 100% SSL, and we use 301 redirects to send any http requests to https instead. We rank well in the SERPs for the phrases we care about, so I'm pretty sure the link juice is flowing from http to https through the 301s (many of our external links are http).
(and, SEOMoz folks: really looking forward to your crawl tool working with https sites!)
-
I don't really see a way around it. Only force HTTPS on the pages that need it. If you can operate at 80% HTTP and 20% HTTPS, that is much better, as people rarely link to HTTPS pages.
So yes, change it.