I don't expect many people to agree but I think that the "small web" should reject encryption, which is the opposite direction that Gemini is taking.
I don't deny the importance of encryption; it is really what shaped the modern web, allowing for secure payments, private transfer of personal information, etc... See what I am getting at?
Removing encryption means that you can't reasonably do financial transactions, accounts and access restriction, exchange of private information, etc... You only share what you want to share publicly, with no restrictions. It seriously limits commercial potential, which is the point.
It also helps technically. If you want to make a tiny web server, say on a microcontroller, encryption is the hardest part. In addition, TLS comes with expiring certificates that require regular maintenance: you can't just set up your server, leave it alone for years, and have it still work. Dropping encryption also brings back simple caching proxies, which are great for poor connectivity.
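As a rough illustration of how small such a server can be without TLS, here is a minimal sketch using only Python's standard library (the page content, port, and cache lifetime are made-up examples, not anything from a real deployment):

```python
# Minimal sketch of a no-TLS "small web" server using only the Python
# standard library -- no certificates to renew, nothing to maintain.
# (Page content, port, and cache lifetime are made-up examples.)
import http.server
import threading

PAGE = b"<html><body><p>hello, small web</p></body></html>"

class SmallWebHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(PAGE)))
        # Plain HTTP lets shared caching proxies store this response,
        # something end-to-end TLS rules out.
        self.send_header("Cache-Control", "public, max-age=86400")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):  # silence per-request logging
        pass

def serve(port: int) -> http.server.HTTPServer:
    """Start the server on localhost in a background thread."""
    server = http.server.HTTPServer(("127.0.0.1", port), SmallWebHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

No handshakes, no key files, no renewal cron job: the whole thing is the request handler.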
Two problems remain with the lack of encryption. The first is authenticity: anyone can man-in-the-middle and change the web page, and TLS prevents that. But what I think is an even better solution is to handle it at the content level: sign the content, like a GPG signature, not the server. This way you can guarantee the authenticity of the content no matter where you are getting it from.
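To make the content-signing idea concrete, here is a toy sketch in Python (standard library only). The HMAC here is just a stand-in for a real detached signature such as GPG or Ed25519, where readers would need only the author's public key; the key and page bytes are made up:

```python
# Toy sketch of content-level authenticity: verify the page itself, not the
# transport. An HMAC stands in for a real detached signature (GPG/Ed25519);
# with a real asymmetric signature, readers never hold the signing key.
import hashlib
import hmac

AUTHOR_KEY = b"author-secret"  # stand-in for the author's private key

def sign_page(content: bytes) -> str:
    """Author runs this once, offline; the signature ships with the page."""
    return hmac.new(AUTHOR_KEY, content, hashlib.sha256).hexdigest()

def verify_page(content: bytes, sig: str) -> bool:
    """Reader checks the fetched bytes against the signature, so it does
    not matter which mirror, cache, or proxy delivered them."""
    return hmac.compare_digest(sign_page(content), sig)

page = b"# My small-web page\nplain text, no TLS\n"
sig = sign_page(page)
assert verify_page(page, sig)                       # untampered copy passes
assert not verify_page(page + b"injected ad", sig)  # MITM edit is detected
```

The point is where verification happens: at the content, after delivery, rather than on the wire during delivery.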
The other thing is the usual argument about oppressive governments, etc... Well, if you want to protect yourself, TLS won't save you: you will be given away by your IP address. They may not see exactly what you are looking at, but the simple fact that you are connecting to a server containing sensitive data may be evidence enough. Protecting your identity is what networks like Tor are for, and you can hide a plaintext server behind the Tor network, which acts as the privacy layer.
Arguably, the big thing that made encryption a requirement was ISPs injecting crap into webpages.
Governments can still track you with little issue since SNI is unencrypted. It's also very likely that Cloudflare and the like are sharing what they see as they MITM 80% of your connections.
> But what I think is an even better solution is to do it at the content level: sign the content, like a GPG signature
How would this work in reality? With the current state of browsers this is not possible because the ISP can still insert their content into the page and the browser will still load it with the modified content that does not match the signature. Nothing forces the GPG signature verification with current tech.
If you mean that browsers need to be updated to verify GPG signature, I'm not sure how realistic that is. Browsers cannot verify the GPG signature and vouch for it until you solve the problem of key revocation and key expiry. If you try to solve key revocation and key expiry, you are back to the same problems that certificates have.
Signatures do have similar problems to certificates. But Gemini doesn't avoid them either and often recommends TOFU certificates. I think the comment's point was that digital signatures ensure identity but are unsuitable for e-commerce, a leading source of enshittification.
> you are back to the same problems that certificates have.
Some of the same problems. One nice thing about verifying content rather than using an SSL connection is that plain-old HTTP caching works again.
That aside, another benefit of less-centralized, more-fine-grained trust mechanisms is that a person can decide, on a case-by-case basis, which entities should be trusted, revoked, etc., rather than relying on root CAs that vouch for huge swaths of the internet. Admittedly, most people would just use whatever's the default, which would not behave much differently from what we have now. But it would open the door to more ergonomic, fine-grained decision-making for those who wish to use it.
I've been thinking the same thing for years -- thank you for saying it. I agree completely.
Another pro is that no encryption means super low power microcontrollers and retrocomputers can browse freely. The system req's go down by orders of magnitude. I think enforcing TLS in the Gemini protocol was a huge mistake; there are so many retrocomputing enthusiasts that would love to browse Geminispace on their Amigas and 486s -- it might actually have been a significant part of the userbase -- but they're locked out because their CPUs simply cannot reasonably handle modern TLS.
> It also helps technically. If you want to make a tiny web server, like on a microcontroller, encryption is the hardest part.
> Two problems remain with the lack of encryption, first is authenticity. Anyone can man-in-the-middle and change the web page, TLS prevents that. But what I think is an even better solution is to do it at the content level: sign the content, like a GPG signature, not the server, this way you can guarantee the authenticity of the content, no matter where you are getting it from.
If your microcontroller can't do TLS then it probably won't do GPG either. But you can still serve HTTP content on port 80 if you need to support plaintext. I believe a lot of package distribution is still over HTTP.
Edit: Sorry, missed the web server part somehow and was thinking of a microcontroller based client.
> In addition, TLS comes with expiring certificates, requiring regular maintenance, you can't just have your server and leave it alone for years, still working. It can also bring back simple caching proxies, great for poor connectivity.
Yeah, TLS and DNS are two of the biggest hurdles to a completely distributed Internet. Of course, go down that road and you get IPFS, which sounds cool to me but doesn't seem to have ever taken off.
I have noticed that when I encounter an HTTP-only web site, I know I am in for a pleasant, calm, well-curated experience, and I mean that without a hint of irony.
I don't have a lot to say about the technical discussion here, other than "TLS null cipher could be fine but also a lot more infrastructure than desirable", which could subvert your intent here.
Maybe we should normalise Tor usage before it becomes a surefire signal to the FBI to raid one's home.
Anyone between you and the server can change the content of the page on unencrypted connections. I would love to live in a world where encryption is unnecessary, but unfortunately that world does not exist right now.
We do live in that world. Encryption is not at all necessary for the majority of web sites out there. The practice you speak of is not commonplace and does not need active defending against.
ICE is tracking people on social media, and the only sign it's happening to you is when they show up to your door with guns and handcuffs. If they could track who was reading subversive content, they would. It's better we don't give away more information than necessary.
The internet has become more hostile since its early days, in multiple ways, and one of those ways is that networks spy on you more. They used to inject ads into the content, but that stopped being profitable when the majority of traffic went to HTTPS. If they could do it again on a large scale, they would. The NSA is also saving all the data it can. It's important to hide the content: we now have first-world countries using information about what you read to make life-or-death decisions. Encryption should now be seen as necessary.
Maintainers of "small web" servers are not being forced to use TLS. If you want to put a plaintext HTTP server on port 80 on the Internet, you can totally do that today with no issues. Sure, modern browsers like Safari and Chrome will show a "Not Secure" message in the address bar, but users are not prevented from getting to the website. Perhaps your site won't be listed on Google, but I think for small web that's acceptable.
That said, there's more to the web than just blogs and recipes. If I want to run a small web forum for enthusiasts of some hobby, I think it'd be irresponsible of me as a webmaster to run that from a microcontroller if it prevents my users from being able to securely authenticate without their password being revealed to prying eyes.
I think a simpler argument would be that small web is not a good fit if your content is sensitive in the place you are publishing from. It’s meant for public publishing. If you need encryption, use a different distribution mechanism.
That is not the only protection that HTTPS offers. US ISPs used to inject ads into HTML HTTP responses.
Can all this performative love for unencrypted HTTP just die already. You’ve all forgotten what it was actually like, and what the drawbacks actually are. This is so tiring.
>Removing encryption means that you can't reasonably do financial transactions, accounts and access restriction, exchange of private information, etc... You only share what you want to share publicly, with no restrictions. It seriously limits commercial potential which is the point.
People will still do financial transactions on an unencrypted web because the utility outweighs the risk. Removing encryption just guarantees the risk is high.
> People will still do financial transactions on an unencrypted web because the utility outweighs the risk. Removing encryption just guarantees the risk is high.
That does not necessarily require TLS to mitigate (although TLS does help anyway). There are other issues with financial transactions, whether or not TLS is used. (I had an idea, and wrote a draft specification, for a "computer payment file" to try to improve the security of financial transactions and avoid some kinds of dishonesty; it has its own security and does not require TLS (nor any specific protocol), although using TLS with it is still helpful.) (There are potentially other ways to mitigate the problems as well, but this is one way that I think would be helpful.)
Your entire comment is paragraphs of grasping at straws.
> The other thing is the usual argument about oppressive governments, etc... Well, if you want to protect yourself, TLS won't save you: you will be given away by your IP address. They may not see exactly what you are looking at, but the simple fact that you are connecting to a server containing sensitive data may be evidence enough. Protecting your identity is what networks like Tor are for, and you can hide a plaintext server behind the Tor network, which acts as the privacy layer.
A huge ‘citation needed’ for this whole paragraph. Just admit that you don’t care about this use case and move on. Don’t present a contrived and completely unjustified hypothetical where oppressive governments behave exactly in a way that happens to mean there’s only room for the technologies you personally are into.
You’ve completely departed from reality. It’s not 2004 anymore.
I just mentioned that because I expected someone to say "but privacy...", because privacy and encryption go hand in hand. And my argument is that the encryption we usually think of in the context of the web is TLS, and it is not a good fit in that context.
The goal here is to publish information for everyone to see; it is not secret messaging. What you may want to protect is your identity. There are networks designed especially for this, and you are better off using them, but if you are not, then I believe that accessing an HTTP website through an anonymizing proxy (like Tor) protects your identity better than relying on the TLS layer of HTTPS or Gemini.
I think the small web ought to just serve everything over Tor (onion sites). No domain needed, no worries about surveillance. You can put a gateway in front of your minimalist web server or run the gateway on an OpenWRT router. This also helps the network by generating cover traffic and encouraging civilian use.
> I think that the "small web" should reject encryption, which is the opposite direction that Gemini is taking.
I think it should allow but not require encryption.
> Removing encryption means that you can't reasonably do financial transactions, accounts and access restriction, exchange of private information, etc... You only share what you want to share publicly, with no restrictions. It seriously limits commercial potential which is the point.
Note that the article linked to says "the Gemini protocol is so limited that it’s almost incapable of commercial exploitation", even though Gemini does use TLS. (Also, accounts and access restriction can sometimes be used with noncommercial stuff as well; they are not only commercial.)
> It also helps technically. If you want to make a tiny web server, like on a microcontroller, encryption is the hardest part.
This is one of the reasons I think it should not be required. (Neither the client side nor server side should require it. Both should allow it if they can, but if one or both sides cannot (or does not want to) implement encryption for whatever reason, then it should not be required.)
> Anyone can man-in-the-middle and change the web page, TLS prevents that. But what I think is an even better solution is to do it at the content level: sign the content, like a GPG signature
Using TLS only prevents spies (except Cloudflare) from seeing or altering the data in transit; it does not prevent the server operator from doing so (or protect against reassigned domain names, if you are using the standard certificate authorities for WWW; especially if you are using cookies for authentication rather than client certificates, which would avoid that issue, although the other issues would not entirely be avoided).
Cryptographic signatures of the files are helpful, especially for static files, and would help even if the files are mirrored, so they do have benefits. However, these are different benefits than those of using TLS.
In other cases, if you already know what the file is and it is not changing, then using a cryptographic hash will help, and a signature might not be needed (although you might have that too); the hash can also be used to identify the file so that you do not necessarily need to access it from one specific server if it is also available elsewhere.
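A minimal sketch of that hash-as-identifier idea, using only Python's standard library (the file contents here are made up):

```python
# Sketch: name a file by the hash of its bytes, so it can be fetched from
# any mirror and verified locally -- no trust in the transport required.
import hashlib

def content_id(data: bytes) -> str:
    """Identify a file by the SHA-256 of its contents."""
    return hashlib.sha256(data).hexdigest()

published = b"archived zine, issue 3"
cid = content_id(published)  # publish this hash alongside the link

# A reader who fetched the bytes from some untrusted mirror recomputes:
fetched = b"archived zine, issue 3"
assert content_id(fetched) == cid           # intact copy verifies

tampered = fetched + b" (with injected ads)"
assert content_id(tampered) != cid          # any modification is detected
```

Since the identifier is derived from the bytes themselves, it doesn't matter which server handed them over; this is essentially the model IPFS uses for addressing.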
> Well, if you want to protect yourself, TLS won't save you: you will be given away by your IP address. They may not see exactly what you are looking at, but the simple fact that you are connecting to a server containing sensitive data may be evidence enough.
There is also SNI. Depending on the specific server implementation, using false SNI might or might not work, but even if it does, the server might not provide a certificate with correct data in that case (my document for the Scorpion protocol mentions this possibility, and suggests what to do about it).