Which is to say, HTTP is not some "ancient" tech like an analog television. It is a modern technology, used today to do things that HTTPS can't.
I once saw my ISP injecting JavaScript ads into HTTP traffic, and the horror has stayed with me ever since.
Also, I would argue maintenance is only as complicated as you make it for yourself. Countless people keep patched, secure HTTPS web servers running with minimal effort. If it's somehow a lot of effort, introspect a little on why you are making so much work for yourself.
HTTP/3 already allows nothing but CA-issued TLS. It won't be long before browsers no longer let you click through CA TLS warnings.
If people want things to stay on the web for long periods, those things should be served over both HTTP and HTTPS.
EDIT: I have 15-year-old things at work that do not compile; you definitely have to maintain them, and the biggest problem is cryptography. I am not sure that unstable tech should ever be part of the application.
If we're talking about applications that don't actively listen on the internet, that's fine, and I would agree that we should have complete software that just works. But a web server, unless it's for personal/home use, is on the internet, and I don't see how it could run for 35 years without any updates or changes.
On the other hand, that state of the world shouldn't exist. It's incredible to me that it's not illegal.
Would you mind sharing what ISP it was and what time period this was in?
On the platforms of NTT Docomo and KDDI (au), users could opt out of this behavior. However, with SoftBank, it could not be disabled, which led to controversy.
As you might expect, this caused issues—since the image data was modified, the hash values changed. As a result, some game apps detected downloaded image files as corrupted and failed to load them properly.
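The failure mode described above is easy to reproduce: an app that verifies a download against a known hash will reject any payload a transparent proxy has recompressed. A minimal sketch (the bytes and helper name are made up for illustration):

```python
import hashlib

def download_is_intact(data: bytes, expected_sha256: str) -> bool:
    # The app ships with the expected hash; any in-flight modification
    # (e.g. ISP image recompression) changes the digest and fails the check.
    return hashlib.sha256(data).hexdigest() == expected_sha256

original = b"\x89PNG\r\n\x1a\n fake image bytes"   # pretend game asset
expected = hashlib.sha256(original).hexdigest()    # hash known to the app
recompressed = original[:-1] + b"?"                # proxy altered the payload

print(download_is_intact(original, expected))      # True
print(download_is_intact(recompressed, expected))  # False
```

Over HTTPS the proxy can't rewrite the bytes in the first place, which is why the breakage only showed up on plain HTTP.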
Needless to say, this was effectively a man-in-the-middle attack, so it did not work over HTTPS.
Within a couple of years, the feature seems to have been quietly discontinued.
There were also concerns that this might violate the secrecy of communications, but at least the government authorities responsible for telecommunications did not take any concrete action against it.
There is a Japanese Wikipedia article about this: https://ja.wikipedia.org/wiki/%E9%80%9A%E4%BF%A1%E3%81%AE%E6...
Also, ISPs were monitoring and selling browsing data years ago.
https://www.reddit.com/r/technology/comments/9b5ikd/comcastx...
You are simply arguing that insecure network requests require less work, which is obviously true. TLS did not appear out of nothing; much effort was expended to create it, and there's a reason for that.
The composability of TLS/HTTP is really a beautiful thing.
Where I live, and for people with older devices, this happens much less frequently than the HTTPS failure modes of unsupported browsers.
Or maybe your older server only speaks TLS 1.0, and that's not acceptable anymore. Or it can only use SHA-1 certs, so it can't get a current certificate.
When I can, I like to serve both HTTP and HTTPS, serve the favicon over HTTPS, and use HSTS to induce current clients to use HTTPS for everything. Finally, a use for the favicon.
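For context, HSTS is just a response header sent over HTTPS that tells compliant clients to upgrade all future requests for the site. A tiny helper sketching the header value (the function and parameter names are mine):

```python
def hsts_header(max_age_days: int = 365, include_subdomains: bool = True) -> str:
    # Strict-Transport-Security is only honored when received over HTTPS;
    # clients then rewrite http:// URLs for this origin to https://
    # for the next max-age seconds.
    value = f"max-age={max_age_days * 86400}"
    if include_subdomains:
        value += "; includeSubDomains"
    return value

print(hsts_header())  # max-age=31536000; includeSubDomains
```

This is what makes the favicon trick work: one HTTPS fetch of the favicon delivers the header, and from then on the browser upgrades everything else.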
If a server can't do TLS 1.2, a protocol from 2008, I question how it's still stable and unhacked more than anything.
Also, the lifetime isn't a problem in the setup I described: the internal server that uses the cert can do the DNS challenge itself, so it can get a new cert whenever it wants. It only needs access to the DNS API.
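For reference, the DNS challenge mentioned here (DNS-01 in ACME, RFC 8555 §8.4) boils down to publishing one TXT record whose value is derived from the challenge token and the account key thumbprint, so the internal host never needs to be publicly reachable. A sketch of that derivation (the example inputs are illustrative; a real client gets them from the ACME server):

```python
import base64
import hashlib

def dns01_txt_value(token: str, account_thumbprint: str) -> str:
    # RFC 8555 §8.4: the TXT record at _acme-challenge.<domain> holds
    # base64url( SHA-256( token + "." + account key thumbprint ) ),
    # with the trailing base64 padding stripped.
    key_authorization = f"{token}.{account_thumbprint}"
    digest = hashlib.sha256(key_authorization.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

print(dns01_txt_value("example-token", "example-thumbprint"))
```

Since only the DNS provider's API is involved in proving control, the certificate can be issued and renewed entirely from inside the private network.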
The trust and security issues of maintaining intranet resources vs. outsourcing to a dedicated professional cloud provider remain, but they are unrelated to whether any SSL certificates used are issued through DNS-based verification.
Unfortunately, it is not easy to automate either, especially if you use multiple domain providers. Not every host has an API, and Namecheap wanted $50 to enable it, if I remember correctly.
It is currently not possible to keep your internal network private and still have HTTPS without hacks or problems on standard end user devices.
Only if you consider transferring the cert from the public server to your internal server a hack. But how would it ever work otherwise? The CA needs to have some publicly accessible way to check your control of the domain, right?
And what if you aren't running a public webserver like 99% of normal people out there?
> But how would it ever work otherwise? The CA needs to have some publicly accessible way to check your control of the domain, right?
I mean that's exactly the problem: Why do you have to rely on the public CA infrastructure for local devices?
Consider the scenario of a smart wifi bulb in your local network that you want to control with your smartphone.
IMO it would be great to have your home router act as a local CA that can only issue certificates for .local domains, and have that trusted by default by user agents. It would make smart-home stuff a lot better than the current situation...
How would you talk to the router and make sure the communication is actually with the router and not someone else? The browser/lightbulb comes with trusted CAs preinstalled, but then you would have to install the router's CA cert on every device you add to the network.
Sure, if someone knows your WiFi password they could set up an "evil" router close to your house with the same SSID and credentials, or they could break into your house and install LAN wiretaps, but c'mon, if you are this paranoid you probably don't even have a smartphone in the first place.
First, I don't think that's true, because you add a lot of sketchy and unknown devices to your network over time (guests, streaming sticks, computers with a preinstalled OS...), so I wouldn't trust every device on my WiFi.
And also, if you do trust your network, you don’t really need https inside it, right?
It would also be nice if there were a hotlink to view the original site directly from the index page.
If you click the image it should take you to an info page on the service.