That's one way to say it. The more common way was that users got tired of crappy plugins crashing their browsers, and browser devs got tired of endless complaints from their users.
It wasn't "politics" of any sort that made browsers sandbox everything. It was the insane number of crashes, out-of-memories, pegged CPUs, and security vulnerabilities that pushed things over the edge. You can only sit through so many dozens of Adobe 0-days before it starts to grate.
Not all of Java is open source. The TCK, the testing suite for standard compliance, for instance, is proprietary, and only organizations with Oracle's blessing can gain access. AdoptOpenJDK was only granted access after they stopped distributing another Java runtime, OpenJ9.
The only advantage to Java applets I can think of is that they froze the browser so it could no longer be hacked.
The Java applet system was designed better than ActiveX, but in practice I've always found it to be a much worse end-user experience. This probably had to do with the fact that most ActiveX components were small integrations rather than (badly fitted) full-page UIs.
One exception is early-2000s RuneScape: that was Java in the browser, but it always loaded, with no gray screen and no hanging browser. They knew what they were doing.
It's not a perfect fit, but it works. The speed of Ruffle loading on a page is similar to that of Flash initializing, so you can arguably still make Flash websites and animations to get the old look and feel if you stick to the Ruffle compatibility range. The half-to-one-second page freeze that was the norm now feels wrong, though, so maybe it's not the best idea to put Flash components everywhere like we used to.
RuneScape proved that Java could be a pretty decent system, but so many inexperienced/bad Java developers killed the ecosystem. The same is true on the backend, where Java still suffers from the reputation left behind by the Java 7 monolithic mega-projects.
Can a person not run Flash authoring tools with an era-appropriate operating system in a VM or something?
Java was so buggy and had so many security issues about 20 years ago that my local authorities gave a security advisory to not install it at all in end user/home computers. That finally forced the hand of some banks to stop using it for online banking apps.
Flash also had a long run of security issues.
On the other hand, NASA in the past had some really great Java applets to play with some technical concept and get updated diagrams, animations and graphs etc.
They ran Windows XP, IE 8, and they stuck with a 3-4 year old JRE to support one piece of shit line of business app that was used only by about 100 (out of 50,000) users internally.
That institution had endpoints popped by drive-by exploit kits dropping banking trojans like Zeus daily.
I never understood why so many banks flocked to building their online banking in applets when it wasn't like you needed anything more advanced than HTML to view balances and make transactions.
Web is chosen because it is the fastest way to hit all platforms, not because it's a skill issue.
> a mess of JS callbacks makes it difficult to see the initiator of anything
Async/await is available in most browsers since 2017, what year are you from?
Applets could do things that JS could not. Some bank applets did client-side crypto with keys that were on the device. Good luck doing that in JS back then. My bank's applet could cope with connection losses, so I could queue a payment while dialup did its thing.
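For flavor, here's a rough sketch of the kind of local signing such an applet could do with the standard java.security API. This is not any bank's actual code: the payload string, class name, and algorithm choice are made up for illustration, and a real applet would pull the key from a local key store or smart card rather than generating one.

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class AppletSigningSketch {
    public static void main(String[] args) throws Exception {
        // In a real applet the key pair would already live on the device;
        // here we just generate a throwaway one for the demo.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair keys = gen.generateKeyPair();

        // A queued payment, signed locally; nothing needs the network yet.
        byte[] payment = "transfer:100.00:ACCT-42".getBytes(StandardCharsets.UTF_8);

        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(keys.getPrivate());
        signer.update(payment);
        byte[] sig = signer.sign();

        // The bank's server would verify with the public key once the
        // connection comes back; here we verify in-process.
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(keys.getPublic());
        verifier.update(payment);
        System.out.println("signature valid: " + verifier.verify(sig));
    }
}
```

The point is that this API shipped with every JRE of that era, while browser JS had nothing comparable until WebCrypto arrived years later.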
Adobe had big plans for the iPhone supporting Flash, and that announcement all but killed their Flash division.
Yes, Adobe supported Flash for years after that, but it was more of a life-support thing than active development. They saw the writing on the wall and knew that for Flash to survive, it had to survive in a mobile world.
With Adobe's decreased support of Flash, the other browser devs simply followed suit and killed off any route for something like Flash running in a browser.
Adobe was in a unique position to dominate the apps era, but they failed spectacularly. They could have implemented payment/monetization options for their ecosystem, to build their own walled garden. Plugins were slow but this was mostly due to hardware at the time. This changed rapidly in the following years, but without control of the hardware, they had already lost the market.
> For Flash vs iPhone case, it was indeed mostly politics.
It was politics in the sense that Flash was one of the worst causes of instability in Safari on OS X, was terrible at managing performance, and was a big drain on battery life, all of which were deal breakers on the iPhone. This is fairly well documented.
> iPhone was released in 2007 and app store in 2008. iPhone and iPad did not support then popular Flash in their browsers.
There were very good reasons for that.
> Web apps were not a thing without Flash.
That is entirely, demonstrably false. There were plenty of web apps, and they were actually the recommended (and indeed the only one) way of getting apps onto iPhones before they scrambled to release the App Store.
> Flash ecosystem was the biggest competitor and threat for the App Store at that moment.
How could it be a competitor if it was not supported?
> iPhone users stopped complaining
It was not iPhone users who were complaining. It was Android users explaining to us how prehistoric iPhones were for not supporting Flash. We were perfectly happy with our apps.
> and in 2011 Adobe stopped the development of mobile plugins.
Yeah. Without ever leaving beta status. Because it was unstable, had terrible performance, and drained batteries. Just what Jobs claimed as reasons not to support it.
> Adobe was in a unique position to dominate the apps era, but they failed spectacularly.
That much is true.
> Plugins were slow but this was mostly due to hardware at the time.
Then, how could native apps have much better performance on the same hardware, on both Android and iOS?
Web engines were honestly not great back then. WebKit was ok but JavaScriptCore was very slow, and of course that’s what iOS, Android, and BB10 were all running on that slow hardware. I have distinct (bad) memories that even “GPU-accelerated” CSS animations were barely 15fps, while native apps reliably got 60fps unless they really messed up. That’s on top of the infamous 300ms issue, where every tap took 300ms to fire off because it was waiting to see if you were trying to double-tap.
So I really think some of the blame is still shared with Apple, although it’s hard to say if that’s because of any malicious intent to prop up the App Store, or just because they were under pressure to build out the iOS platform that there wasn’t enough time to optimise. I suspect it was both.
Everyone, well almost everyone apparently, was relieved we didn't have to deal with any of that anymore.
I doubt you'd have been able to bootstrap RuneScape in any form, even rewritten in native code, on the first iPhone to support apps. Applets worked fine on desktops and tablets, which was what they were designed for.
Browser vendors killed the API because when they looked at crashes, freezes, and performance problems, the Flash/Java/etc. APIs kept standing out. Multithreaded rendering became practical only after the old extension model was refactored, and even then browsers were held back by the terrible plugin implementations they needed to work around.
Apple was the first to publicly call out native plugins (Jobs did so on stage) and outright refused to support them on iOS; then everyone else followed suit.
NPAPI's death in non-IE-browsers started around 2015. Jobs announcing mobile Safari without Flash was 2010. Unfortunately, ActiveX still works to this very day.
Chrome built a whole new API, PPAPI, to support reasonably fast Flash after the Jobs announcement. Microsoft launched a major release of Silverlight long after Jobs' speech, but Silverlight (rightfully) died with Windows Phone, which it was the main UI platform for at the time of its practical death. Had Microsoft managed to launch a decent mobile operating system, we'd probably still be running Silverlight in some fashion today. Even so, Silverlight lasted until 2021 before it actually fell out of support.
Jobs may have had a hand in the death of Flash websites, but when it came to Java Applets/Silverlight, the decision had little impact. That plugin model was slowly dying on its own already.
There was a Flash runtime on Android. It was terrible. Java applets were already dead anyway, outside of professional contexts, which are not relevant on phones anyway.
Spiritually the web ought to be more than an application development platform. We haven't been doing great about that (with heavily compiled js bundles), but there's still a lot of extensions that many users take for granted. I'm using a continual wordcount extension (50 words so far), and Dark Reader right now.
Applets are the native-app paradigm, where what the app-maker writes is what you get, never a drop more. It's not great. The internet, the land of protocols, deserved better. The web is so interesting because it is better.
It created so much uncertainty across the ecosystem that even today people repeat the "applets crash browsers, good riddance" line.
But it was deliberate action by Microsoft.
So yeah, 100% politics, because even without a court document, in modern society we cannot call this anything else.
I remember a few decades ago somebody saying the JVM was incredible technology, and as a user and programmer I still have zero clue what the hell they could have been thinking was good about the JVM.
I hear that now, decades into Java, they have figured out how to launch a program without slowing a computer down for 10+ seconds, but I'll be damned if I can tell. There are still so many rough edges around launching a .jar with classpath dependencies that they never even bothered to fix. What a mess!
Java is also the workhorse of the big data ecosystem and moves more money, whether as product revenue or as transactions, than most nations' GDP. They didn't figure out startup times for 10+ years because they were busy dealing with Oracle and its messy management. I think it will simply continue to get better, given that Java has endured through so many language fads. It has a ways to go, but it will end up like SQL: here before we were alive, and here when most of us are dead.
However:
> Java is also the workhorse of the big data ecosystem and moves more money, whether as product revenue or as transactions, than most nations' GDP.
The global financial system moves so much money around that comparisons to GDP are a bit silly. Financial transactions dwarf GDP by so much that even a bit player of a technology will facilitate more transactions than global GDP.
(And that's fine. Many of these transactions are offsetting, and it's a sign of an efficient market that the mispricings are so small that participants need giant gross flows to profit from them.
Somewhat related: a single high capacity fire hose (at about 75kg of water per second) moves about the same number of electrons as you'd need to power the total US electricity consumption at 120V. Obviously, your fire hose also sprays plenty of pesky protons which completely offset the electrical current from the electrons.)
Agreed. I guess it's comparing production capacity to distribution capacity. Distribution capacity will equal n_tx * tx_amt. Having said that, another metric to look at is how much of our software infrastructure is built on Java. Simply adding AWS to this equation proves the value added by Java-backed systems. It's hard to say that about any other language. We can also look at versatility: Java is used to write very large data-processing systems, CDN networks, API servers, and even widely used consumer apps (IntelliJ products). It's very hard to find any other language that has had such an outsized impact across domains. Of course, the counter is that Linux, written in C, powers all of the internet. True, but C doesn't have the cross-domain impact that Java has had.
So I disagree with the assessment that Java is a terrible language, performance- or productivity-wise, or it wouldn't have had this impact.
The JVM is quite different from Java language features or Scala language features. I've written entire programs in JVM bytecode, without a compiler, and I see very little of value in it. A stack based machine? Why? Not a huge blocker, it's weird, but usable. The poor engineering around the JVM for many use cases? That's a blocker for me, and where are the alternatives in implementation that don't have the atrocious launch performance and interface for specifying class path and jars?
Java may be used a lot, but so is Windows. It's an accident of history, of early adoption and network effects, rather than being inherently good technology. Java, the language, made a very wide and broad swath of programmers productive, just as Windows lets a very wide and broad set of IT people run IT systems, without having to learn as much or know as much as they would need to with, say, Linux. But Java's low-barrier-to-entry is quite distinct from the weaknesses of the JVM...
The JVM being a stack-machine is probably the least controversial thing about it. Wasm, CPython and Emacs all also have a stack-based bytecode language. The value, of course, comes from having a generic machine that you can then compile down into whatever machine code you want. Having a register machine doesn't seem very useful, as it's completely unnecessary for the front-end compiler to minimize register usage (the backend compiler will do that for you).
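To make the stack-machine point concrete, here's a toy interpreter (the opcodes and program are invented, not real JVM bytecode) showing how operands stay implicit on the stack, which is why the instructions themselves carry no register numbers:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackMachineSketch {
    public static void main(String[] args) {
        // Toy program for (2 + 3) * 4 in postfix form. Note that "add"
        // and "mul" name no operands: they take whatever is on top.
        String[] program = {"push 2", "push 3", "add", "push 4", "mul"};

        Deque<Integer> stack = new ArrayDeque<>();
        for (String insn : program) {
            String[] parts = insn.split(" ");
            if (parts[0].equals("push")) {
                stack.push(Integer.parseInt(parts[1]));
            } else if (parts[0].equals("add")) {
                stack.push(stack.pop() + stack.pop());
            } else if (parts[0].equals("mul")) {
                stack.push(stack.pop() * stack.pop());
            }
        }
        System.out.println("result: " + stack.pop());
    }
}
```

javac compiles `(a + b) * c` to a similar sequence (loads followed by `iadd` and `imul`), and the arithmetic opcodes likewise take no explicit operands, which is what keeps the encoding compact.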
Specifying classpath isn't fun, I agree with that. Launch performance isn't good, and is generally a consequence of its high degree of dynamicism and JIT compiler, though of course there are ways around that (Leyden).
> I've written entire programs in JVM bytecode, without a compiler, and I see very little of value in it
I agree, I also see very little value in manually writing JVM bytecode programs. However, compiling into the JVM classfile format? Pretty darn useful.
A register machine requires fewer instructions, so potentially faster evaluation, which is good for short-lived programs that end before the JIT kicks in.
Stack machines require less space per instruction, however, which reduces the size of the program (faster to load).
Going on a tangent: Windows is an interesting example to bring up, because the Windows versions everyone uses today have about as much to do with the 'accident of history / early adoption' versions that were based on DOS as using Wine on Linux does.
It would perhaps be like today's JVM being register based, when the first version were stack based.
I don't actually know how much the JVM has changed over time.
I see what you mean. In that case we can add Scala-backed systems to the JVM balance sheet as well. If we simply look at the JVM and the systems it backs, there's very little evidence that it isn't a marvel of technology. It powers more impactful systems than almost any other technology.
I wonder how long Teams or Slack would take to launch when it's on a 5400rpm disk on a 2000 era computer...
You got a buffer-overflow-safe language without compromising speed. After it has been loaded, of course. But that's why Java had such a tremendous effect on web services, where load times are negligible relative to run time.
With universities almost immediately jumping to Java as an introductory language you got way more potential employees.
You can design a VM that still allows for buffer overflows. E.g., you can compile C via LLVM and still get buffer overflows.
Any combination of VM (Yes/No) and buffer-overflows (Yes/No) is possible.
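As a concrete example of the "VM, no buffer overflows" quadrant, the JVM bounds-checks every array access at runtime. A minimal demonstration (the class name and values are made up for the example):

```java
public class BoundsCheckDemo {
    public static void main(String[] args) {
        int[] buf = new int[4];
        try {
            // One element past the end. In C this write could silently
            // corrupt adjacent memory; the JVM refuses it at runtime.
            buf[4] = 42;
            System.out.println("overflow succeeded");
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("overflow blocked: " + e.getMessage());
        }
    }
}
```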
I agree that using a VM is one possible way to prevent buffer overflows.
Feels like you are still living in the year 2010?
Running one packaged program across every platform. "Write once, run anywhere" was Sun's slogan for Java. (Though oftentimes it ended up being "debug anywhere.") As for the slow-start part, programs are either often-launched and short-running or seldom-launched and forever-running. Presumably because enterprise software falls into the latter camp (and runtime performance > startup time + memory use), the focus was there.
And, on its Android cousin... pick any S60-based Symbian phone (or anything else) and try telling us the same. The lag, the latency, the bullshit of Java we suffered because, you know, for phone developers the switch from J2ME to another Java stack was pretty much an easy task, but it was hell for the user. Even Inferno would have been better, had it been free and had a mobile ecosystem developed for it.
"Instant" is a strange choice of words to describe JVM startup performance. I recall the UX of encountering an applet involving watching a Java splash screen while the browser is frozen.
However, ActiveX usually required you to install components, while Java could just run first time.
In the era of LLM assistants like Claude Code, any engineer can write frontend code using popular stacks like React and TypeScript. This use case is when those tools shine.
I'm not sure if I'd use it for a website or anything, but if my goal was to embed a simulation or complex widget, I wouldn't ignore it as an option.
Takes a few seconds longer to load because it loads all of Java Spring, but it still performs just fine on my phone (though the lack of on screen keyboard activation makes it rather unfortunate for use in modern web apps).
Multiple people can work on different things in the Java ecosystem.
Compiling Rust to WASM doesn't really distract anyone from compiling Rust to x86 or ARM, either.
TeaVM and similar toolchains show that the original idea behind applets wasn’t wrong — the implementation model was. Moving Java to JS/WASM with tree-shaking, minification, and real browser APIs gives you all the benefits without the security nightmare.
The interesting takeaway isn’t nostalgia for applets, but how mature the web stack has become: the browser is finally the runtime applets always wanted.
"Properly"? I'm not convinced. Browsers have grown incredibly large and complex. Every plugin it replaced by JS-based APIs has become a security nightmare in almost exactly the same way as the plugin based approach. Browsers have a lot of features, but if they were implemented "properly", the user would have far more control than they do now. The cynic in me thinks that many of these features are mostly there to facilitate ad-tech and user tracking. From fingerprintable canvas to access to devices.
Applets gave you opaque binaries with full system access by default; the web at least gives you mechanisms to audit, limit, and turn off capabilities.
And yes, a lot of the modern API surface exists because ad-tech pushed the envelope — no disagreement there. But the same standardization that helped them also enabled PWAs, WASM, privacy-preserving modes, and tools like TeaVM.
So maybe not “properly” as in “perfect,” but “properly” as in “the ecosystem finally has levers for users and developers that plugins never offered.”
Here's one of many: https://ookigame.com/game/flappy-bug/
Servlets on the server-side survived a bit longer than applets, by evolving into JSP.
Compared to Java, maybe.
It is a far cry from modern frontend development with vite.
From a high level, WASM and JVM bytecode seem incredibly similar (though I'm sure the sandboxing and IO are radically different). I never really understood why WASM wasn't some JVM subset/extension.
Not an expert at all in this, so genuinely curious to hear from someone who understands this space well
So instead you got TeaVM which is essentially a whole JVM in WASM.
Says that like it's a good thing.