Google indexing HTTPS sites by default now, where's the Moz blog about it!
-
Hello and good morning / happy Friday!
Last night an article from, of all places, VentureBeat titled "Google Search starts indexing and letting users stream Android apps without matching web content" was sent to me. As I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL certificate.
I then quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so.
Google Webmaster Central Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites.
I wanted to hear about the eight key rules that Google will factor in when ranking/indexing HTTPS pages from now on, and see what you all think about this.
Google will now begin to index HTTPS equivalents of HTTP web pages, even when no pages link to the HTTPS versions. However, Google will only index an HTTPS URL if it meets these conditions:
- It doesn’t contain insecure dependencies.
- It isn’t blocked from crawling by robots.txt.
- It doesn’t redirect users to or through an insecure HTTP page.
- It doesn’t have a rel="canonical" link to the HTTP page.
- It doesn’t contain a noindex robots meta tag.
- It doesn’t have on-host outlinks to HTTP URLs.
- The sitemap lists the HTTPS URL, or doesn’t list the HTTP version of the URL.
- The server has a valid TLS certificate.
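As a thought experiment, four of those conditions can be checked from a page's HTML alone. Below is a rough sketch of such a check; this is my own illustration, not Google's actual logic, and the class and function names are made up for the example.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class HttpsAuditParser(HTMLParser):
    """Collects violations of four of the conditions listed above."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.page_host = urlparse(page_url).netloc
        self.issues = []

    def handle_starttag(self, tag, attrs):
        a = {k: (v or "") for k, v in attrs}
        if tag in ("script", "img") and a.get("src", "").startswith("http://"):
            # Insecure dependency loaded over plain HTTP.
            self.issues.append("insecure dependency: " + a["src"])
        elif tag == "link":
            rel, href = a.get("rel", "").lower(), a.get("href", "")
            if rel == "stylesheet" and href.startswith("http://"):
                self.issues.append("insecure dependency: " + href)
            elif rel == "canonical" and href.startswith("http://"):
                # rel="canonical" pointing back at the HTTP version.
                self.issues.append("canonical points to HTTP: " + href)
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.issues.append("noindex robots meta tag")
        elif tag == "a" and a.get("href"):
            p = urlparse(urljoin(self.page_url, a["href"]))
            if p.scheme == "http" and p.netloc == self.page_host:
                # On-host outlink to an HTTP URL.
                self.issues.append("on-host HTTP link: " + p.geturl())

def audit_html(html, page_url):
    parser = HttpsAuditParser(page_url)
    parser.feed(html)
    return parser.issues
```

The remaining conditions (robots.txt blocking, redirect chains, a valid TLS certificate) live outside the HTML, so they'd need separate checks.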
One rule that confuses me a bit is:
- **It doesn’t redirect users to or through an insecure HTTP page.**
Does this mean that if you just moved over from HTTP to HTTPS, your site won't pick up the HTTPS boost? After all, most sites have HTTP-to-HTTPS redirects in place.
Thank you!
-
Can you please give a concrete example of a keyword for which you don't rank well? Please also specify what, in your opinion, should appear in the search results, and the objective for the nextgenapk blog.
-
Thanks for your response, Peter! As I said, I could be totally wrong - glad I asked this question.
Cheers!
-
_"Or you can leave the links but route them through some URL shortener - bit.ly or t.co - until the target sites come with HTTPS versions."_
Looking at it from a technical standpoint, these shorteners are also not HTTPS when crawled. Would they not have the same effect as other non-HTTPS links?
Sorry, I could be totally wrong about this, and the question might not make sense at all.
-
Touché, good sir, these are certainly some great ways to go about this, especially number 3.
Thanks!
I wonder how long we've got until HTTP/2 implementation...
-
Or you can leave the links but route them through some URL shortener - bit.ly or t.co - until the target sites come with HTTPS versions.
Or you can make a "partners" page and link to HTTP external sites only from there.
Or you can make an internal redirector page to the HTTP site, like HTTPS -> HTTPS (the internal redirector, a dummy page) -> HTTP. In this case the redirector won't be indexed, which is why it's a dummy page.
And these are just three ideas I thought of in a minute. My favorite is probably #3, but that's IMHO.
-
So if my manufacturers don't have HTTPS sites, should I remove the links to them, since they're going to hinder indexing?
Thanks for the response about HTTP redirecting to HTTPS.
-
Some sites come with redirectors or "beacons" for detecting user presence. For example: I'm on site X, page A, and I click a link to go to page B. But thanks to the marketing department, that click passes through an HTTP redirector or a plain HTTP URL (which then 301s to HTTPS). In that case, page B may not get indexed.
This means that once you set up a sitewide 301 redirect to the encrypted connection, you must take a few more steps:
- Check that all resources load over the encrypted channel: images, CSS, JS, everything.
- Check that canonicals are set to HTTPS.
- Check that links between pages are also HTTPS.
- Check that any third-party tools use encrypted connections; that includes analytics software, "tracking pixels", heat maps, and ads.
- Check whether outgoing links from your site can point to encrypted pages on other sites, such as Wikipedia, Moz, or Google. Since everything there is already encrypted, you also skip the frustrating HTTPS -> HTTP -> HTTPS jump.
Then your site can be indexed under HTTPS. It's a tricky procedure with many traps.
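The beacon problem can be sketched as a small check over a recorded redirect chain, for example the `Location` hops you'd see from `curl -sIL`. This is my own framing of the rule, not Google's published logic, and the function name is made up.

```python
def chain_stays_secure(chain):
    """chain: redirect hops in order, first = clicked URL, last = destination.
    Returns False if any hop after the first HTTPS hop is plain HTTP,
    i.e. an HTTPS page redirecting users to or through insecure HTTP."""
    seen_https = False
    for url in chain:
        if url.startswith("https://"):
            seen_https = True
        elif seen_https and url.startswith("http://"):
            return False
    return True

# A plain HTTP -> HTTPS migration redirect: no problem under this reading.
assert chain_stays_secure(["http://example.com/", "https://example.com/"])
# The marketing "beacon" case: HTTPS bouncing through an HTTP tracker.
assert not chain_stays_secure(["https://example.com/a",
                               "http://track.example.com/r",
                               "https://example.com/b"])
```

Under this reading, a simple HTTP-to-HTTPS migration redirect is fine; it's the HTTPS -> HTTP -> HTTPS jump that trips the rule.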