Still Not Secure in Chrome
-
Hi
We migrated to HTTPS in November, but we still aren't showing as Secure.
I thought it was due to an insecure SHA-1 signature algorithm in the SSL certificate, so I am waiting to get this fixed.
We had a few HTTP links outstanding and they have been updated, but we're still getting the issue.
Does anyone have an idea of what it could be? https://www.key.co.uk/en/key/
-
I'm surprised to say... that SSL certificate you have is very poor quality and has a number of pretty significant security issues, in addition to the SHA-1 signature problem.
To answer your specific question, there's nothing you or your devs can do about the SHA-1 signature problem, as that problem exists on one of the certificates in the chain that is owned and controlled by Thawte (the cert issuer or "Certificate Authority"), not your own certificate. It is up to them to fix it.
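If you want to confirm that your own certificate is clean and that the SHA-1 signature really is further up the chain, the leaf certificate's signature algorithm is easy to check. Here's a minimal sketch, assuming Python 3 with the third-party cryptography package installed; it only inspects the certificate served for the domain itself, so the intermediates still need a chain-level view such as SSL Labs or openssl s_client -showcerts.

```python
# Minimal sketch: print the signature algorithm of the leaf certificate a
# host presents. Assumes Python 3 and the third-party "cryptography" package
# (pip install cryptography). Only the leaf cert is fetched here; the
# intermediate certs (where the SHA-1 issue reportedly lives) still need a
# chain-level tool such as SSL Labs or `openssl s_client -showcerts`.
import socket
import ssl

from cryptography import x509

HOST = "www.key.co.uk"  # the domain from the question
PORT = 443

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        der_cert = tls.getpeercert(binary_form=True)  # DER-encoded leaf cert

cert = x509.load_der_x509_certificate(der_cert)
print("Subject:", cert.subject.rfc4514_string())
print("Issuer:", cert.issuer.rfc4514_string())
print("Signature hash:", cert.signature_hash_algorithm.name)  # e.g. "sha256"
```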
As you can see from the cert security scan, there are a number of other issues with the certificate that are unacceptable, especially in a paid certificate. [Edited for clarity: some of those warnings are likely server-specific, meaning the server is being allowed to use the certificate in less-than-optimal ways.]
https://www.ssllabs.com/ssltest/analyze.html?d=www.key.co.uk
It's unlikely that the SHA-1 signature problem is what's giving the "not secure" warning on the site at the moment (although it will become a major issue later in February), so you'll need to keep looking for resources called over HTTP if you're still getting warnings.
When I had a quick look at the home page, I didn't see any more warnings, as it appears you've fixed the image call that Andrew mentioned. You can use Chrome or Firefox Dev Tools to inspect any pages that are not secure and see exactly which element is causing the failure. It often comes down to hardcoded images (such as CSS background images) or hardcoded scripts. For example, your Quotations page is calling a script from Microsoft to validate the form, but it's failing as it's called over HTTP.
Knowing this, you'd want to check any other pages using such form validation. A thorough Screaming Frog crawl to look for any other wayward HTTP calls can also help dig out the remaining random culprits.
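If you'd rather script a quick spot-check than run a full crawl, here's a rough sketch of the idea, assuming Python 3 and standard library only. The page list is illustrative, and it only sees the raw HTML, so CSS background images and script-injected resources still need DevTools or a crawler like Screaming Frog.

```python
# Rough sketch: flag hardcoded http:// resources referenced in the HTML of a
# few HTTPS pages. Python 3 standard library only; the page list is just an
# example, and this only sees the raw markup, so CSS background images and
# script-injected resources still need DevTools or a full crawl.
from html.parser import HTMLParser
from urllib.request import urlopen

PAGES = [
    "https://www.key.co.uk/en/key/",  # home page from the question
    # add the Quotations page or any other templates you want to spot-check
]

class MixedContentFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link":                 # stylesheets, icons, preloads
            candidates = [attr_map.get("href")]
        else:                             # img, script, iframe, source, form...
            candidates = [attr_map.get("src"), attr_map.get("action")]
        for value in candidates:
            if value and value.startswith("http://"):
                self.insecure.append((tag, value))

for page in PAGES:
    html = urlopen(page).read().decode("utf-8", errors="replace")
    finder = MixedContentFinder()
    finder.feed(html)
    if finder.insecure:
        print(page)
        for tag, url in finder.insecure:
            print(f"  <{tag}> loads {url}")
    else:
        print(f"{page}: no hardcoded http:// references found in the HTML")
```

Note that plain anchor links to http:// pages are deliberately ignored, since ordinary links don't trigger the mixed-content warning; only loaded resources do.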
Hope that helps?
Paul
Sidenote: Your certificate authority is Thawte, which is connected with Symantec. Symantec has done such a bad job of securing their certificates that Chrome and other browsers no longer trust them, and in the near future their certificates are going to be officially distrusted and ignored. Symantec has in fact given up their Certificate Authority status and is transferring their business to a new company which does have a trusted infrastructure for issuing certificates. So you're going to need to deal with a new certificate in the not-too-distant future anyway.
Given the poor security of your existing cert, and the upcoming issues, if it were me, I'd be asking for a refund of my current cert, and replacing it with one from a more reliable issuer. I know that can mean a lot of extra work, but as these existing problematic certs go through the distrust process over the next 8 months, sites that haven't dealt with the issue are going to break.
It's possible that Thawte will build out a reliable process for migrating. At the very least, you need to have a strong conversation with your issuer about how to ensure you are getting the security and long-term reliability you've paid for. Sorry to be the bearer of bad news about what is a much bigger issue. You can read more about it here:
https://security.googleblog.com/2017/09/chromes-plan-to-distrust-symantec.html
-
Thank you.
Also, does anyone know if we need to rekey the certificate to replace the SHA-1 signature algorithm, what we should rekey it with, or whether my dev team should already know this?
-
I also got this report from https://www.whynopadlock.com
Soft Failure
An image with an insecure URL of "http://www.key.co.uk/img/W/KEY/F7/IC/F7-112H204-1-LX.jpg" was loaded on line 1 of https://www.key.co.uk/en/key.
Errors that are reported on line 1 are generally not part of the source code. This error may be caused by an external javascript file which is writing to the page, however we are unable to reliably detect these scripts in our automated test.
Please contact us using the "Need Help?" link below if you need assistance with resolving this error.
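Since the report suggests the insecure image URL is probably being written into the page by an external JavaScript file, one way to narrow down which script is responsible is to download each script the page references and search it for hardcoded http:// URLs. A rough sketch, assuming Python 3; if the URL is assembled dynamically at runtime, the mixed-content warnings in the Chrome DevTools console remain the authoritative way to trace it.

```python
# Rough sketch: download every external script a page references and search
# each one for hardcoded http:// URLs that could be injecting insecure
# resources. Python 3 standard library only; if the insecure URL is built up
# dynamically at runtime, the DevTools console warnings are still the most
# reliable way to trace it.
import re
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

PAGE = "https://www.key.co.uk/en/key/"   # the page flagged by whynopadlock
INSECURE_URL = re.compile(r"http://[\w./%:#?&=-]+")

class ScriptCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.scripts = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                # resolve relative and protocol-relative script URLs
                self.scripts.append(urljoin(PAGE, src))

collector = ScriptCollector()
collector.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))

for script_url in collector.scripts:
    body = urlopen(script_url).read().decode("utf-8", errors="replace")
    hits = sorted(set(INSECURE_URL.findall(body)))
    if hits:
        print(script_url)
        for url in hits:
            print("  references", url)
```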
Related Questions
-
Re: Inbound Links. Whether it's HTTP or HTTPS, does it still go towards the same inbound link count?
Re: Inbound Links. If another website links to my website, does it make a difference to my inbound link count if they use http or https? Basically, my site http://mysite.com redirects to https://mysite.com, so if another website uses the link http://mysite.com, will https://mysite.com still benefit from the inbound link count? I'm unsure if I should reach out to all my inbound links to tell them to use my https URL instead... which would be rather time-consuming, so I'm just checking whether http and https count all the same. Thanks.
Intermediate & Advanced SEO | premieresales
-
Are HTML Sitemaps Still Effective With "Noindex, Follow"?
A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?
Intermediate & Advanced SEO | mothner
-
Are video sharing sites still useful for SERPs?
Well, I am not talking about audience views; I am asking whether it is good to submit videos to multiple video sites for backlinks and any sharp movements for the keywords. I've seen that most of the sites are nofollow, which is not useful, but is that still something good for link diversification?
Intermediate & Advanced SEO | chandubaba
-
Last Panda: removed a lot of duplicated content but still no luck!
Hello here, my website virtualsheetmusic.com has been hit several times by Panda since its inception back in February 2011, so 5 weeks ago we decided to get rid of about 60,000 thin, almost-duplicate pages via noindex meta tags and canonicals (we have not physically removed those pages from our site to return a 404, because our users may search for those items on our own website), and so we expected this last Panda update (#25) to give us some traffic back... instead we lost an additional 10-12% of traffic from Google and now it looks even more badly targeted. Let me say how disappointing this is after so much work! I must admit that we still have many pages that may look like thin and duplicate content, and we are considering removing those too (but those are actually giving us sales from Google!), but I expected to recover a little bit from this last Panda and improve our positions in the index. Instead nothing, we have been hit again, and badly. I am pretty desperate, and I am afraid I have lost the compass here. I am particularly afraid that the removal of over 60,000 pages from the index via noindex meta tags has, for some unknown reason, been more damaging than beneficial. What do you think? Is it just a matter of time? Am I on the right path? Do we need to wait just a little bit more and keep removing (via noindex meta tags) duplicate content and improve all the rest as usual? Thank you in advance for any thoughts.
Intermediate & Advanced SEO | fablau
-
Disallowed Pages Still Showing Up in Google Index. What do we do?
We recently disallowed a wide variety of pages on www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically we don't want to spread our link juice around to all these pages that are never going to rank. We want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed... when really it should only be showing ~5,000 pages indexed. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" into our robots.txt properly, so this should not be happening... Google should be showing 0 results. Any ideas re: how we get Google to pay attention and re-index our site properly?
Intermediate & Advanced SEO | udemy
-
NOINDEX content still showing in SERPS after 2 months
I have a website that was likely hit by Panda or some other algorithm change. The hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content: <meta name="robots" content="NOINDEX" />. It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com. I am looking for a quicker solution. Adding this many pages to the robots.txt does not seem like a sound option. The pages have been removed from the sitemap (for about a month now). I am trying to determine the best of the following options or find better ones. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 or so pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right? Issue an HTTP 404 code on all the pages I want out of the index. The 404 code seems like the safest bet, but I am wondering if that will have a negative impact on my site with Google seeing 10,000+ 404 errors all of a sudden. Issue an HTTP 410 code on all pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I am interested if anyone has ever used a 410 code. Please advise, and thanks for reading.
Intermediate & Advanced SEO | NormanNewsome
-
Is it fine to use an iframe for video content? Will it still be indexed on your URL?
If we host a video on a third party site and use an iframe to display it on our site, when the video is indexed in SERPs will it show on our site or on the third party site?
Intermediate & Advanced SEO | nicole.healthline
-
Why is my competitor Torontoseogroup.com ranked 31 in Chrome, but position 2 in Firefox? How is this possible?
There is a website I am analyzing that ranks highly in Firefox (position #2 on the top page), but in Google Chrome they are ranked only at the top of the 4th page. How is this possible? It looks like some of the codes in the links are different. Why?
Link from Firefox: http://www.google.ca/#q=hamilton+web+design&hl=en&prmd=imvnsfd&ei=nIB3TtG_BYTe0QHm05HnCA&start=0&sa=N&bav=on.2,or.r_gc.r_pw.&fp=6448f668fd4b6f72&biw=1024&bih=627
Link from Google Chrome: http://www.google.ca/#q=hamilton+Web+Design&hl=en&prmd=imvnsfd&ei=4353TvWmAaPW0QHhq9zYBg&start=30&sa=N&bav=on.2,or.r_gc.r_pw.&fp=c8e758962267edc1&biw=1024&bih=673
Intermediate & Advanced SEO | websiteready