Google indexing HTTPS sites by default now - where's the Moz blog about it?
-
Hello and good morning / happy Friday!
Last night an article from, of all places, VentureBeat, titled "Google Search starts indexing and letting users stream Android apps without matching web content," was sent to me. As I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL.
I then quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so.
Google Webmaster Central Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites.
I wanted to share the eight rules that Google will factor in when ranking/indexing HTTPS pages from now on, and see what you all think about this.
Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don’t have any links to them. However, Google will only index an HTTPS URL if it follows these conditions:
- It doesn’t contain insecure dependencies.
- It isn’t blocked from crawling by robots.txt.
- It doesn’t redirect users to or through an insecure HTTP page.
- It doesn’t have a rel="canonical" link to the HTTP page.
- It doesn’t contain a noindex robots meta tag.
- It doesn’t have on-host outlinks to HTTP URLs.
- The sitemap lists the HTTPS URL, or doesn’t list the HTTP version of the URL.
- The server has a valid TLS certificate.
One rule that confuses me a bit is:
- **It doesn’t redirect users to or through an insecure HTTP page.**
Does this mean that if you just moved over to HTTPS from HTTP, your site won't pick up the HTTPS boost, since most sites have HTTP-to-HTTPS redirects in place?
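For what it's worth, the rule seems to be about the path a crawler takes to reach the HTTPS URL, not about the standard HTTP-to-HTTPS 301. A rough sketch (illustrative only, not Google's actual indexing logic) of checking a redirect chain for insecure hops:

```python
# Sketch: check whether a redirect chain contains any plain-HTTP hop.
# This is an illustration of the rule, not Google's actual logic.

def chain_has_insecure_hop(redirect_chain):
    """Return True if any URL in the chain is served over plain HTTP."""
    return any(url.startswith("http://") for url in redirect_chain)

# The HTTPS URL itself, reached directly: no insecure hop.
print(chain_has_insecure_hop(["https://example.com/page"]))  # False

# A chain that bounces through HTTP fails the rule.
print(chain_has_insecure_hop([
    "https://example.com/a",
    "http://example.com/b",
    "https://example.com/c",
]))  # True
```

Under this reading, a plain HTTP-to-HTTPS 301 is fine; what hurts is an HTTPS page whose own redirects or links bounce back through HTTP, which is the case discussed further down the thread.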
Thank you!
-
Can you please give a concrete example of a keyword for which you do not rank well? Please also specify what, in your opinion, needs to appear in the search results, and the goal for the nextgenapk blog.
-
Thanks for your response, Peter! As I said, I could be totally wrong - glad I asked this question
Cheers!
-
_"Or you can leave them but change their links to pass through some URL shortener - bit.ly or t.co - until they come out with an HTTPS version."_
Looking at it from a technical standpoint, these shorteners are also not HTTPS when crawled. Would they not have the same effect as other non-HTTPS links?
Sorry, I could be totally wrong about this, and maybe the question doesn't make sense at all.
-
Touché, good sir, these are certainly some great ways to go about this, especially number 3.
Thanks!
I wonder how long we have until HTTP/2 implementation...
-
Or you can leave them but change their links to pass through some URL shortener - bit.ly or t.co - until those sites come out with HTTPS versions.
Or you can make a "partners" page where you link only to external HTTP sites.
Or you can make an internal redirector page for the HTTP site, like HTTPS -> HTTPS (internal redirector / dummy page) -> HTTP. In this case the redirector won't be indexed, which is why it's a dummy page.
And these are just three ideas I came up with in a minute. My favorite is probably #3, but that's just IMHO.
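To make idea #3 concrete, the dummy redirector could be a small noindex page that forwards the visitor on to the HTTP destination. The helper below is hypothetical, just to illustrate the noindex + redirect combination:

```python
# Hypothetical sketch of the "dummy redirector" idea: an internal HTTPS page
# that is marked noindex and forwards the visitor to an external HTTP URL.

def build_redirector_page(target_url, delay_seconds=0):
    """Return HTML for a noindex dummy page that meta-refreshes to target_url."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex, nofollow">
  <meta http-equiv="refresh" content="{delay_seconds}; url={target_url}">
  <title>Redirecting...</title>
</head>
<body>
  <p>Redirecting to <a href="{target_url}" rel="nofollow">{target_url}</a>...</p>
</body>
</html>"""

print(build_redirector_page("http://insecure-partner.example.com/"))
```

A server-side 302 with an `X-Robots-Tag: noindex` response header would accomplish the same thing; either way the redirector itself stays out of the index, so the HTTPS pages that link to it have no direct on-host HTTP outlinks.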
-
So if my manufacturers don't have HTTPS sites, should I remove the links to them, since they're going to hinder indexing?
Thanks for the response about HTTP redirecting to HTTPS.
-
Some sites use redirectors or "beacons" to detect user presence. For example, I'm on site X, page A, and I click a link to go to page B. But thanks to the marketing department, the click passes through an HTTP redirector, or a plain HTTP URL that then 301-redirects to HTTPS. In that case page B may not get indexed.
This means that once you set up a sitewide 301 redirect to the encrypted connection, you must take a few more steps:
- Check that all resources load over the encrypted channel: images, CSS, JS - everything.
- Check that the canonical is set to the HTTPS version.
- Check that links between pages are also HTTPS.
- Check that any third-party tools use encrypted connections. This includes analytics software, "tracking pixels," heat maps, and ads.
- Check whether outgoing links from your site can point to encrypted versions of other sites - Wikipedia, Moz, Google. Since everything there is already encrypted, you will also skip the frustrating HTTPS -> HTTP -> HTTPS jump.
Then your site can be indexed over HTTPS. It's a tricky procedure with many traps.
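The resource and link checks above lend themselves to a quick automated pass. A minimal sketch using only the Python standard library's HTML parser, assuming you feed it the raw HTML of each page you want to audit:

```python
from html.parser import HTMLParser

class InsecureReferenceFinder(HTMLParser):
    """Collect href/src attributes that point at plain-HTTP URLs."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            # Flag any link or resource reference that is not HTTPS.
            if name in ("href", "src") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

html = """
<img src="https://example.com/logo.png">
<script src="http://example.com/app.js"></script>
<a href="http://partner.example.com/">partner</a>
"""

finder = InsecureReferenceFinder()
finder.feed(html)
for tag, url in finder.insecure:
    print(f"insecure {tag}: {url}")
```

This only catches `href`/`src` references in the markup itself; resources pulled in by CSS or scripts (and third-party tags like tracking pixels) would need a separate check, for example via the browser's mixed-content warnings.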