Deprecating HTTP

Mozilla have recently announced that they are planning to deprecate insecure HTTP, which includes withholding new features from sites that are served over HTTP connections. I believe that is a mistake.

I tweeted about it, but a longer form is in order, so here goes.

Why HTTPS everywhere is important

Let me start by saying that I strongly believe that the Web should move to HTTPS, and serving content over plain-text HTTP is a mistake. (And yes, this blog is still over HTTP. Apologies. A bug to be fixed soon. Not ironic though.)

Now, why do I think HTTPS is a must?

Well, even if you don’t think your content is worth securing for the sake of your users (AKA “so someone will know they browsed my nyan cat collection. Big deal”), not serving it over HTTPS opens your users to various risks. An attacker (which may be their ISP or the local coffee shop Wi-Fi) can inject their own ads, poison your users’ cache or just serve them the wrong information.

On top of that, if your site includes any form of login, the user’s credentials can be stolen by anyone on their network. Anyone.

So, I believe that eventual deprecation of HTTP and forcing HTTPS everywhere is a Good Thing™.

What don’t I like about Mozilla’s plan, then?

“Deprecating insecure HTTP” != “HTTPS everywhere”

Mozilla are pushing for something called “opportunistic encryption” as a replacement for insecure HTTP. The problem is that “opportunistic encryption” is a misleading name, though I get why it was picked. “Easily circumvented HTTPS” doesn’t quite have the same ring to it.

What is this “opportunistic encryption”, you ask? Well, in order for you to be certain that the encrypted TLS connection you established is with the server you think you established it with, the TLS connection keys are signed by a certificate, which is guaranteed to be issued only to the entity you think you’re talking to. So a signed certificate means your encrypted connection goes all the way through to the origin server.

With “opportunistic encryption” OTOH, the certificate can be self-signed, which means that there are no guarantees regarding who signed it, and the encrypted connection can be terminated by any router along the way. In other words, there’s no guaranteed end-to-end encryption, and intercepting, inspecting and changing packets sent to the user is trivial.
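The difference boils down to whether the client verifies the peer’s certificate. As an illustrative sketch (the hostname is a placeholder, and this is not how any browser actually implements opportunistic encryption), Python’s `ssl` module makes the two postures easy to contrast:

```python
import socket
import ssl

HOST = "example.com"  # placeholder host for illustration

# Verified context: the peer must present a certificate that chains to a
# trusted CA and matches the hostname -- what regular HTTPS requires.
verified = ssl.create_default_context()

# "Opportunistic"-style context: encryption is still negotiated, but all
# certificate checks are disabled, so a self-signed certificate presented
# by ANY middlebox along the path is silently accepted.
opportunistic = ssl.create_default_context()
opportunistic.check_hostname = False  # must be disabled before CERT_NONE
opportunistic.verify_mode = ssl.CERT_NONE

def handshake(ctx):
    # With `verified`, a MITM presenting a self-signed certificate fails
    # the handshake; with `opportunistic`, the same MITM succeeds.
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            return tls.getpeercert()
```

Both contexts produce an encrypted channel; only the verified one tells you *who* is on the other end of it.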

Since network operators are not afraid to perform downgrade attacks, there’s no reason to believe this neutered form of TLS will stop the bad actors among them from actively intercepting the user’s traffic and changing it, in order to add their own ads, super-cookies or worse.

The only promise of “opportunistic encryption” is that passive attacks would be more difficult (but not impossible).

Yet, Mozilla are pushing for that as a “cheap” replacement for HTTP, at the expense of actually secure HTTPS.

Free certs!!!

One more reason why certificate cost is soon to be a non-issue is an extremely cool new initiative called “Let’s Encrypt”, which will provide free and easy-to-install certificates to anyone.

That initiative, driven by both Mozilla and Akamai, will make the “opportunistic encryption” approach even less relevant.

(Disclaimer: I work for Akamai. I’m also not involved in the Let’s Encrypt effort in any way)

Applying pressure in the wrong place

Now going back to Mozilla’s deprecation plans, the part that I dislike the most is denying new features from HTTP sites, regardless of the features’ security sensitivity.

The Chrome security team have long restricted new security-sensitive (AKA “powerful”) features to HTTPS only. The most famous case is Service Workers.

It has arguably hurt adoption of those features, but serving these features over HTTP would have meant significantly compromising the users’ security. The limitation to HTTPS in these cases had real, security-based reasons.

Mozilla’s proposal is significantly different. Mozilla wants to limit all new features, with the hope that developers would then fall in line and implement HTTPS on their sites.

In my view, this type of thinking shows a lack of understanding of why developers haven’t moved to HTTPS yet.

Switching a large site to HTTPS requires a lot of work and can cost a lot of money. It may require time investment from developers that management needs to approve. That same management often doesn’t care about new browser features, and disabling new features on HTTP is not likely to make a huge impact on their decisions.

So, in order to justify dedicating time and effort to moving to HTTPS, you need to convince the business people that this is the right thing to do from a business perspective (i.e. that it would cost them real-life money long-term if they don’t switch). Unfortunately, keeping the users safe is not always a convincing argument.

Having free certificates is awesome for the long tail of independent developers, but for large sites, that’s hardly the main issue.

From what I hear, in many cases there’s also another blocker. Many sites include business-essential 3rd party widgets, often ads. If these widgets cannot be served over HTTPS, that’s a big issue preventing serving the entire site over HTTPS.

Since you cannot mix HTTP content inside your HTTPS site (for good reason: the chain is only as strong as its weakest link), such a site cannot move to HTTPS without suffering mixed-content blocking (or warnings, in the best case).
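For widgets that *are* available over HTTPS but are still referenced with `http://` URLs in legacy markup, the CSP `upgrade-insecure-requests` directive asks the browser to rewrite those subresource fetches to `https://`. It does not help when the third party genuinely cannot serve HTTPS (those requests simply fail). As a minimal sketch, a WSGI middleware adding the header might look like this (the middleware itself is hypothetical; the header name and value are standard CSP):

```python
def upgrade_insecure(app):
    """Wrap a WSGI app, adding a CSP header that asks browsers to
    upgrade embedded http:// subresources to https://."""
    def middleware(environ, start_response):
        def sr(status, headers, exc_info=None):
            headers = list(headers) + [
                ("Content-Security-Policy", "upgrade-insecure-requests")
            ]
            return start_response(status, headers, exc_info)
        return app(environ, sr)
    return middleware
```

This only papers over stale URLs, though; the underlying blocker the post describes is widgets with no HTTPS endpoint at all.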

What may work

We’ve established that we need to convince the business folks, rather than the developers. So, how can we do that?

The first and obvious way is SEO. Google announced last August that, all other things being equal, HTTPS sites will get a higher search ranking than HTTP sites. Since SEO is a language that business folks understand well, I think that this is a good first step in motivating businesses to move to HTTPS. The next step would be to increase the importance of that ranking signal, and penalize HTTP sites’ ranking.

Next, Mozilla’s Henri Sivonen had an interesting idea: limit cookie persistence over HTTP, rather than holding innocent features hostage. While I’m not 100% certain this won’t have side effects on unsuspecting Web developers, or break some legitimate use cases that currently work over HTTP, that method does apply pressure in the right place.

3rd party widgets often rely on cookie persistence in order to track users across sites and provide their users with a personally adapted experience. Providing that persistence only on HTTPS is a sure-fire way to get their attention, get their widgets working over HTTPS, and perhaps even to get them pushing the content sites to adopt HTTPS (by giving them better ad rates, etc.).
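To make the idea concrete, here is one way a browser could apply such a policy: clamp the `Max-Age` of any cookie set over plain HTTP. This is purely an illustrative sketch of Sivonen’s proposal, not an actual browser implementation, and the one-hour cap is an arbitrary value chosen for the example:

```python
from http.cookies import SimpleCookie

# Hypothetical cap on cookie lifetime for insecure (HTTP) origins, in
# seconds. The value is arbitrary and exists only for illustration.
HTTP_MAX_AGE_CAP = 3600

def cap_cookie_lifetime(set_cookie_header, over_https):
    """Parse a Set-Cookie header and clamp Max-Age when it was
    received over plain HTTP; leave HTTPS cookies untouched."""
    cookie = SimpleCookie()
    cookie.load(set_cookie_header)
    for morsel in cookie.values():
        max_age = morsel.get("max-age")
        if not over_https and max_age and int(max_age) > HTTP_MAX_AGE_CAP:
            morsel["max-age"] = str(HTTP_MAX_AGE_CAP)
    return cookie

# A year-long tracking cookie set over HTTP gets clamped to the cap;
# the same cookie over HTTPS keeps its full lifetime.
capped = cap_cookie_lifetime("id=abc; Max-Age=31536000", over_https=False)
```

Under a policy like this, long-lived identifiers simply stop working over HTTP, which hits exactly the parties (ad and widget networks) with the leverage to push sites toward HTTPS.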

In conclusion

Moving the Web to HTTPS is important in order to keep our users safe and maintain their long term trust in the Web. The way to do that is by convincing the business folks that they have to.

So, while HTTPS everywhere is a noble goal, denying new features from HTTP will only alienate developers and hamper new feature adoption without doing much to convince the people that need convincing.

Update:

In the comments, Patrick McManus clarified that Mozilla does not plan to consider opportunistic encryption as a secure context. Therefore, as part of the deprecation plan, new features will be denied from “opportunistically encrypted” sites as well as regular HTTP sites.

I guess my misunderstanding stems from the term “insecure HTTP”, which I assumed means that opportunistic encryption would be considered “secure HTTP”. But you know what they say about assuming. So I was wrong about that and I apologize.

I still think opportunistic encryption is a bad idea that will at best be a distraction on our way to securing the Web, but apparently it is not related to Mozilla’s deprecation plans.

Written by Yoav Weiss
