23 replies and sub-replies as of Jun 13 2018

On the other hand, I would absolutely believe someone who told me Chrome would completely disable HTTP next year, and Mozilla would never be first to do that
I'd support greylisting existing sites and blocking/defaulting navigations to new TLDs and origins to HTTPS. But that's not disabling. We aren't going to break the web.
"Break the web" has always been a metrics-driven term, though? There's an acceptable %. And that % depends on how the metrics are set up (which is the root of many grievances with the Web Audio disaster, using a carefully fudged metric that achieves the desired result)
It's not really just down to metrics. It's not only about how much usage there is, but also about how badly things break.
Does he have a fair point about a lot of innocent older unmaintained sites getting thrown under the bus because there's nobody to pay for a switch to HTTPS?
Do they still pay for DNS and hosting?
...cause certs are free, thanks to @letsencrypt!
It's important to remember that the web is all rented thanks to DNS. DNS entries cost some money and can be transferred (e.g., when an entry expires and is registered by a different org). When they stop resolving, your site "goes down".
Hosting, likewise, is dynamic. We don't have offline, durable representations of web content. We'll get closer to that with Web Packaging, but it also requires certs.
So the argument is "there's 3 things I now need, two of which I have to pay money for and was OK with, and I'm objecting to the addition of a free thing to that list". The transition may cost, ofc.
I imagine that the difficulty for some old sites is not the difficulties of obtaining a certificate, it's moving out of autopilot mode. Any change at all is harder than just paying your hosting company every year to keep on keeping on.
I agree that there's a cost. It's a cost that we've externalized (as a community) until now. Not being able to actually know who's on the other end of the line is a price that users have borne via malware and other sorts of badness.
I have had an http site MITM'd maybe 5 times in 20 years. I get noxious crap injected by javascript adtech pretty much every day. I know you care about both, but "anti-vaxxers" is not a helpful framing.
That you know of. I mean, how often did you connect over insecure wifi? Or a terrible (perhaps pwn'd) gateway?
How about extending subresource integrity to <a> and <img> links? Mitigate the http issue that way?
How does that help if the top-level document can also be MITM'd?
It would help with cross-site links and decentralised integrity. The certificate model has had its share of failures, and ACME must be quite an attractive target by now too.
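For context, here's a minimal sketch of how a Subresource Integrity value is computed today for <script> and <link> tags; the proposal above would extend the same integrity attribute to <a> and <img>. This assumes Node.js, and the file name is a placeholder:

```typescript
// Compute an SRI integrity value: a base64-encoded sha384 digest of the
// resource bytes, matching what browsers verify for <script>/<link> today.
import { createHash } from "crypto";
import { readFileSync } from "fs";

const body = readFileSync("library.js"); // placeholder: resource to pin
const digest = createHash("sha384").update(body).digest("base64");

// The value a page would embed, e.g.
//   <script src="https://cdn.example/library.js" integrity="sha384-..."></script>
console.log(`integrity="sha384-${digest}"`);
```

The hypothetical extension would let a linking page pin the bytes of a destination document or image the same way, trading the certificate model's trust in the serving host for trust in whoever authored the link.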
Lots of sites are already gone for reasons completely unrelated to certificates. That will continue to happen. The Wayback Machine can help (and can use _our_ help: archive.org/donate/)
Let's Encrypt isn't maintenance-free. You may need to move hosting to get automated renewal.
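A concrete example of the maintenance in question: a host without automated (ACME-driven) renewal needs something watching certificate expiry. A minimal sketch, assuming Node.js; the host name is a placeholder:

```typescript
// Check how many days remain on a site's TLS certificate.
import tls from "tls";

const host = "example.com"; // placeholder
const socket = tls.connect({ host, port: 443, servername: host }, () => {
  const cert = socket.getPeerCertificate();
  const msLeft = new Date(cert.valid_to).getTime() - Date.now();
  console.log(`${host}: certificate expires in ${Math.floor(msLeft / 86_400_000)} days`);
  socket.end();
});
socket.on("error", (err) => console.error(`${host}: ${err.message}`));
```

With ACME clients like Certbot, renewal typically runs on a timer and a check like this is just a safety net; without one, it's a recurring chore.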
At first. Then a reasonable person finds out the main change is a UI warning to users. The inability to use powerful new features shouldn't matter to supposedly unmaintained old content.
Right, I'm not so much asking about "drat I can't implement shiny new web feature that requires HTTPS," more about "I haven't touched my website for 7 years and now when I visit it there's a big red X. I don't understand."
This was the initial point about things we could do to detect changes and adapt. I expect browsers to continue to evolve here and trust the team to continue to be thoughtful and sensitive in balancing developer and user needs.
He already hates us, too.