Still Not Secure in Chrome
-
Hi
We migrated to HTTPS in November, but we still aren't showing as Secure.
I thought it was due to an insecure SHA-1 signature in the SSL certificate, so I'm waiting to get this fixed.
We had a few outstanding HTTP links, which have now been updated, but we're still getting the issue.
Does anyone have an idea of what it could be? https://www.key.co.uk/en/key/
-
I'm surprised to say... that SSL certificate you have is very poor quality and has a number of pretty significant security issues, in addition to the SHA-1 signature problem.
To answer your specific question: there's nothing you or your devs can do about the SHA-1 signature problem, as it exists on one of the certificates in the chain that is owned and controlled by Thawte (the cert issuer, or "Certificate Authority"), not on your own certificate. It is up to them to fix it.
As you can see from the cert security scan, there are a number of other issues with the certificate that are unacceptable, especially in a paid certificate. [Edited for clarity: some of those warnings are likely server-specific, meaning the server is being allowed to communicate with the certificate in less-than-optimal ways.]
https://www.ssllabs.com/ssltest/analyze.html?d=www.key.co.uk
It's unlikely that the SHA-1 problem is what's giving the "not secure" warning on the site at the moment (although it will become a major issue later in February), so you'll need to keep looking for resources called over HTTP if you're still getting warnings.
When I had a quick look at the home page, I didn't see any more warnings, as it appears you've fixed the image call that Andrew mentioned. You can use Chrome or Firefox Dev Tools to inspect any pages that are not secure and see exactly which element is causing the failure. It often comes down to hardcoded images (such as CSS background images) or hardcoded scripts. For example, your Quotations page is calling a script from Microsoft to validate the form, but it's failing because it's called over HTTP.
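The same check Dev Tools does can also be scripted for a quick first pass. Here's a minimal sketch using only the Python standard library that scans a page's HTML for subresources loaded over plain HTTP; the sample markup and all example.co.uk/example.com URLs are made up for illustration, not taken from the actual site:

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect subresource URLs that are loaded over plain HTTP."""
    # Attributes that trigger a resource fetch on common tags.
    RESOURCE_ATTRS = {"src", "href", "data"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # anchors are navigation, not subresources
            return
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_mixed_content(html):
    finder = MixedContentFinder()
    finder.feed(html)
    return finder.insecure

# Hypothetical page snippet: one secure stylesheet, two insecure resources.
sample = """
<link rel="stylesheet" href="https://www.example.co.uk/css/main.css">
<img src="http://www.example.co.uk/img/product.jpg">
<script src="http://ajax.example.com/validate.js"></script>
"""
for tag, url in find_mixed_content(sample):
    print(f"<{tag}> loads insecure resource: {url}")
```

Only the `http://` URLs are reported; the `https://` stylesheet passes. A real sweep would fetch each page's HTML first, and wouldn't catch resources injected by JavaScript at runtime, which is why the Dev Tools console is still the authoritative check.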
Knowing this, you'd want to check any other pages using such form validation. A thorough Screaming Frog crawl to look for any other wayward HTTP calls can also help dig out the remaining random culprits.
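A site-wide sweep like Screaming Frog's boils down to a breadth-first crawl that records every plain-HTTP resource per page. A minimal sketch of the idea, run against a tiny in-memory "site" (the URLs and page contents are hypothetical; on a live site you'd replace the dict lookup with an HTTPS fetch of each page):

```python
import re
from collections import deque

# Hypothetical in-memory site: URL -> HTML body.
SITE = {
    "https://www.example.co.uk/":
        '<a href="https://www.example.co.uk/quote">Get a quote</a>',
    "https://www.example.co.uk/quote":
        '<script src="http://ajax.example.com/validate.js"></script>',
}

# Internal links to follow, and any src/href resource on plain HTTP.
LINK_RE = re.compile(r'href="(https://www\.example\.co\.uk/[^"]*)"')
INSECURE_RE = re.compile(r'(?:src|href)="(http://[^"]+)"')

def crawl(start):
    """Breadth-first crawl; report every plain-HTTP resource found, per page."""
    seen, queue, findings = {start}, deque([start]), {}
    while queue:
        url = queue.popleft()
        html = SITE.get(url, "")          # live site: fetch url over HTTPS here
        insecure = INSECURE_RE.findall(html)
        if insecure:
            findings[url] = insecure
        for link in LINK_RE.findall(html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return findings

print(crawl("https://www.example.co.uk/"))
```

The output maps each offending page to its insecure calls, which is exactly the report you want before flipping any more pages to HTTPS.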
Hope that helps!
Paul
Sidenote: Your certificate authority is Thawte, which is connected with Symantec, which has done such a bad job of securing their certificates that Chrome and other browsers no longer trust them; in the near future those certificates are going to be officially distrusted and ignored. Symantec has in fact given up their Certificate Authority status and is transferring their business to a new company which does have a trusted infrastructure for issuing certificates. So you're going to need to deal with a new certificate in the not-too-distant future anyway.
Given the poor security of your existing cert, and the upcoming issues, if it were me, I'd be asking for a refund of my current cert, and replacing it with one from a more reliable issuer. I know that can mean a lot of extra work, but as these existing problematic certs go through the distrust process over the next 8 months, sites that haven't dealt with the issue are going to break.
It's possible that Thawte will build out a reliable process for migrating. At the very least, you need to have a frank conversation with your issuer about how to ensure you are getting the security and long-term reliability you've paid for. Sorry to be the bearer of bad news; this is a much bigger issue than a single browser warning. You can read up about it more here:
https://security.googleblog.com/2017/09/chromes-plan-to-distrust-symantec.html
-
Thank you.
Also, does anyone know whether we need to rekey the certificate to replace the SHA-1 signature algorithm, what we should rekey it with, or is this something my dev team should know?
-
I also got this report from https://www.whynopadlock.com
Soft Failure: An image with an insecure URL of "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg" was loaded on line 1 of https://www.key.co.uk/en/key.
Errors that are reported on line 1 are generally not part of the source code. This error may be caused by an external javascript file which is writing to the page, however we are unable to reliably detect these scripts in our automated test.
Please contact us using the "Need Help?" link below if you need assistance with resolving this error.