Perhaps that's what they're hinting at with the note about a "subset of Rosetta". So maybe there is hope that the core x86_64 binary translator will stick around for things like VMs and emulation of generic (Linux? Wine?) binaries, but they don't want to maintain a whole x86_64 macOS userspace going forward.
The space savings from not shipping fat binaries for everything will probably be not insignificant either. Or it makes room for a new fat binary for a future "arm64v2" :)
In this iteration, it might also allow some simplification of the silicon, since M-series chips include some black magic to mimic x86 behavior (mostly x86's memory-ordering model, IIRC) to let Rosetta run that fast. IOW, Rosetta 2 is not software-only magic this time.
I remember using the first Rosetta to play Starcraft on my Intel Mac. It also got deprecated after a year or two.
So leaving things behind despite some pains is Apple's way to push people forward (e.g.: Optical media, ports, Rosetta 1, Adobe Flash, etc.).
So, even though I feel what you are saying, we can't have every nice thing we want at the same time.
The user-visible layer of an operating system is generally one of the simpler layers to code and maintain, since it's built upon abstractions. However, the libraries powering those layers, especially the math-heavy and hardware-facing ones, are much more complex due to the innate complexity of the hardware itself.
Keeping multiple copies of a library for two different architectures (even if they differ only in bit-width), where that simple bit-width change requires different implementation strategies to work correctly, is a pain in itself (for more information, ask Linux kernel devs, who are also phasing out 32-bit x86).
Moreover, x86 and x86_64 are completely different modes on the processor. On top of that, the x86-only mode is called "protected mode" and the x86_64 mode is called "long mode"; running x86 under x86_64 is a sub-mode of long mode, and that alone is already complex enough at the silicon level.
The same complexities apply to ARM and other processor architectures; silicon doesn't care much which ISA it implements.
We have already seen how the push for performance on superscalar, out-of-order processors opened up a new, untapped family of side-channel/speculative-execution attacks. So processors are complex, software is complex, and multiple architectures on the same hardware are exponentially complex. If you want to see how the sausage is made, you can also research how Windows handles the backwards-compatibility problem (hint: in ELI5 terms, by keeping complete copies of Windows inside a single Windows installation).
So the impressive thing was keeping these multi-arch installations running for quite some time. We need to be able to let things go and free up some software and hardware budget for new innovations and improvements.
Addendum: Funnily, games are one of the harder targets for multi-arch systems, since they're both math-heavy and somewhat closer to the hardware than most applications, which makes them very sensitive to architecture changes. Scientific/computational software is another such family, and it interestingly includes databases and office software. Excel had a nasty floating-point bug back in the day, and the 32- and 64-bit installations of Microsoft Office have had feature differences since the beginning.
We should have a path to run legacy software when it’s practical but Halo is just not a good example to make that case.
I'd also personally be more interested in firing up the Master Chief Collection or seeing if the upcoming campaign remake will be any good.
I don't know if this is the situation with Rosetta 2.
So, considering its silicon parts, Rosetta 2 is more of an Apple endeavor and technology.
On the other hand, 5-7 years is a very typical timespan for Apple. So I don't think licensing fees were that important in the decision to end support.
It was five years, from 2006 to 2011. Rosetta 2 will have been there for seven years (currently at five).
So I effectively got 2 years out of Rosetta 1, but I didn't mean to say Apple supported it for only two years.
Sorry for the confusion.
Looks like I can't edit my comment anymore to clarify.
This feels wrong. Apple sold Intel-based Macs until early June 2023. The last one was the 2019 Mac Pro model.
Ending support for Rosetta in macOS around 2028 also means ending support for any x86_64 versions of software. This means that those unfortunate users who bought an Intel Mac Pro in 2023 only got five years of active usability.
That doesn't mean I expect these things to be updated or supported 15 years after I bought them. I am absolutely certain I made back the $850 I originally paid (edu discount) + the ~$250 in upgrades over the years, and I'm entirely ok with just letting it limp along until it physically dies. I think most people have similar expectations.
The hardware can be ok, the walled garden is not.
Rosetta is the technology that allows Apple Silicon hardware to execute Intel software. When they introduced Apple Silicon with the M1 processor, not many binaries existed for Apple Silicon, so Rosetta 2 was a bridge for that problem.
They used the same technology (Rosetta 1) when they switched from PowerPC to Intel.
Pretty much every binary for macOS is distributed as a "Universal Binary", which contains binaries for both x86 and Apple Silicon, so x86 isn't being abandoned; only the ability to run applications on Apple Silicon that haven't been redistributed/recompiled in 6-7 years is going away.
Unless you’re doing something special, you can be fairly certain that universal binaries will behave well on both platforms, that’s what Apple guarantees. They expose one API, which can be executed on multiple hardware architectures.
If you’re doing something special, like an image editor, or a game, you might need to test performance, but you couldn’t really do that with Rosetta either.
Universal binaries work well. And as long as they exist, apps will most likely run just fine on both Intel hardware and Apple silicon.
Another aspect: a Mac stops getting software updates after ~7 years, and then its API level starts to drift behind the latest macOS releases.
So after the 10-year mark you can't get the latest versions of applications anyway, since the features developers use aren't available in the older macOS versions.
They could just revert all that large change with no loss to the users.
It's mostly for their Game Porting Toolkit. They have an active interest in Windows-centric game developers porting their games to Mac, and that generally doesn't happen without the compatibility layer.
Or, one can dream: RVA23
Realistically, people are still going to be deploying on x64 platforms for a long time, and given that Apple's whole shtick was to serve "professionals", it's really a shame that they're dropping the ball on developers like this. Their new containerization stuff was the best workflow improvement for me in quite a while.
I was _so_ hopeful when I asked the devs to revive the Nx-UI code so that FH/MX could have been a native "Cocoa" app....
Freehand still works on Windows 11? I’m happy for you, I never found a true replacement for it.
> a digital revival of a hot metal typeface created by my favourite type designer/illustrator who passed in 1991, but whose widow was gracious enough to give me permission to revive
Any reason you haven’t shared the name of the designer or the typeface? That story sounds interesting, I’d really welcome learning more.
The designer/typeface are Warren Chappell's Trajanus, and his unreleased Eichenauer --- I read _The Living Alphabet_ (and his cousin Oscar Ogg's _The 26 Letters_) when I was very young, and met him briefly on a school field trip back when he was Artist-in-Residence at UVA and did a fair bit of research in their Rare Book Room, and even had a sample of the metal type (missing one character unfortunately).
It is currently stalled at my having scanned and drawn up one of each letter at each size which I have available, but only having two letters, _N_ and _n_ in all sizes --- probably shouldn't worry that much about the optical axis, since it was cut in metal in one master size and the other sizes made using a pantograph, but there were _some_ adjustments which I'd like to preserve. There is a digital version of Trajanus available, but it's based on the phototype. I've been working at recreating each character using METAFONT, encompassing the optical size variation in that programmatically, but it's been slow going (and once I'm done, I then have to work out how to make it into outlines....)
Granted, that's less of an issue now with most new software being written in JS to run in any browser, but old institutions like banks, insurers, industrial and automation companies, retail chains, etc. still run ancient Java/C#/C++ programs that they don't want to, or can't, update for various reasons, but that keep the lights on.
Which is why I find it adorable when people in this bubble think all those industries will suddenly switch to Macs.
(Ironically, Windows 11 + corporate bloatware made the laptops super laggy. Go figure.)
https://www.accio.com/business/operating-system-market-share...
Might be, because the number is even lower when we differentiate between company and home use.
That may be surprising for people here, but technology is not synonymous with software.
Only if you count food-delivery apps, crypto Ponzi-scheme unicorns, ad services, and SaaS startups as the only "tech-forward" companies, because you're omitting a lot of other tech companies your daily life in the civilized world depends on, which operate mainly on Windows, like where I work now.
Is designing and building semiconductors not "technology"? Or MRI machines? Or jets? Or car engines?
Is there a separate part of Rosetta that is implemented for the VM stuff? I was under the impression Rosetta was some kind of XPC service that would translate executable pages for Hypervisor Framework as they were faulted in, did I just misunderstand how the thing works under the hood? Are there two Rosettas?
On the Linux side, Rosetta is an executable that you hook up via binfmt_misc to run AMD64 binaries, much like how you might use Wine for Windows binaries.
However, to get performance benefits, you still need to have hardware support, and have Rosetta installed on macOS [1].
TFA is quite vague about what is being deprecated.
[1] https://developer.apple.com/documentation/virtualization/run...
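For reference, the setup inside the Linux guest looks roughly like the sketch below. The share tag, mount point, and the magic/mask bytes are my recollection of the documentation, so treat them as assumptions and verify against [1]:

  # Mount the Rosetta share exposed by Virtualization.framework (the tag and
  # mount point depend on your VM configuration), then register the binary
  # as the binfmt_misc interpreter for x86-64 ELF executables.
  mkdir -p /media/rosetta
  mount -t virtiofs rosetta /media/rosetta
  /usr/sbin/update-binfmts --install rosetta /media/rosetta/rosetta \
    --magic "\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x3e\x00" \
    --mask "\xff\xff\xff\xff\xff\xfe\xfe\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff" \
    --credentials yes --preserve no --fix-binary yes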
When was the last time this was true? I think I gave up on the platform around the time of the new keyboards, which clearly weren't made for typing, and the non-stop "Upgrade" and "Upgrade" notifications that you couldn't disable, only push forward into the future. Everything they've done since then seems to have been to impress the Average Joe, not to serve professionals.
"CIOs say Apple is now mission critical for the enterprise" [1]
[1]: https://9to5mac.com/2025/10/25/cios-say-apple-is-now-mission...
Happen to have some less biased source saying anything similar, ideally not sponsored content?
It's crazy to me that Apple would put one guy on a project this important. At my company (another FAANG), I would have the CEO asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without it, I don't think I could ever do something like this... I feel like I do when I watch guitar YouTubers: just terrible.
I hope you were at least compensated like a team of 20 engineers :P
https://www.quora.com/Apple-company/How-does-Apple-keep-secr...
> Back then, Apple had a sabbatical program that encouraged (mandated?) employees to take six consecutive weeks off every five years.
This is really a good take. I can't imagine companies offering sabbatical programs nowadays. You still have your vacation on top, so JK took 12 weeks (OP mentioned this in the same comment). It was a boon for any systems programmer who needed to clear his mind or deepen his thoughts.

They released this a while ago, which hints at supporting amd64 beyond the Rosetta end date.
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.
I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.
The only hold out is GraalVM which doesn’t trivially support cross compilation (yet).
And it looks like Rosetta 2 for containers will continue to be supported past macOS 28 just fine. It's Rosetta 2 for Mac apps that's being phased out, and not even all of that (they'll keep it for games that don't need macOS frameworks to be kept around in Intel format).
I've never seen this make a practical difference. I'm sure you can spot differences if you look for them (particularly at the hardware-interface level), but qemu has done this for decades, and so has Apple.
The low-level Rosetta as a translation layer (which is what containers use) will be kept, and they will even keep it for Intel games, as they say in the OP.
The deprecation is mentioned in the context of the Rosetta translation environment [1]. Rosetta for Linux uses the same wording [2].
For example, Docker at least used to use this same binary translation internally years ago (the same tech whose deprecation was announced). I don't know how it is today.
[1]: https://developer.apple.com/documentation/apple-silicon/abou...
[2]: https://developer.apple.com/documentation/virtualization/run...
You can of course always use qemu inside that vm to run non-native code (eg x86 on Apple Silicon), however this is perceived as much slower than using Rosetta (instead of qemu).
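With Docker specifically, you can see which path you're on by pinning the platform (alpine is just a convenient test image):

  # Force the amd64 variant of an image on an Apple Silicon host. With
  # Rosetta enabled in the Linux VM this runs much faster than the qemu
  # fallback; either way it prints "x86_64".
  docker run --rm --platform linux/amd64 alpine uname -m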
Is it slow? Absolutely. But you'd be insane to run it in production anyway.
A test suite that becomes 10x slower is already a huge issue.
That said, it doesn't seem like Rosetta for container use is going anywhere. Rosetta for legacy Mac applications (the macOS-level layer) is.
If you were instead asking for hardware documentation, or open-sourcing of Rosetta once sunset, then we're on the same team.
Open-sourcing is one solution, but knowing Apple it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely; it's not sustainable outside the mobile-dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should be able to retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on merits, not because its competitor was forcibly mutilated.
Apple has done this exact same thing for every architecture change and every API they've sunset, but you gave them your money anyway. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.
I doubt such a thing has ever happened in the history of consumer-facing computing.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).
Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.
Linux users do it all the time with WINE/Proton. :-)
Before you complain about the term "major OEM operating system": Ubuntu ships on major OEMs' hardware and is listed in the supported requirements of many pieces of hardware and software.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Comments like this show how low standards have fallen. Mac OS X releases have short support lifespans. The hardware is locked down; you need a massive reverse-engineering effort just to get Linux to work. The last few generations of x86 Mac hardware weren't locked down quite as much, but they were still locked down. M3 and M4 still don't have a working installer. And as far as I know, none of the effort to get Linux or Windows on ARM running on them is funded by Apple.
In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. Windows 10 alone gave it 10 years of support from Microsoft; counting from 7, that stretches to 13+ years of support?
And there’s a near 100% chance you’ll have to recompile/download pre-re-compiled binaries if moving to a completely different architecture. Same here.
In "Not the same here”, "here" is people moving to M1. So no they very much didn't, that was the whole point of rosetta 2.
A few years ago, I installed Windows 10 on a cheap laptop from 2004—the laptop was running Windows XP, had 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.
To be fair, the computer ran horrendously slow, but nothing ever crashed on me, and I actually think that it ran a little bit faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this wasn't just based off of a brief impression.
FWIW, Windows running on a 64-bit host no longer runs 16-bit binaries.
E.g. I have half of the macOS games in my Steam library as 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed. The best way is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.
To be fair, it's emulating x86-32 on the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are the best portables, in terms of efficiency, that you can buy right now.
All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code on x86-64.
Which isn't an issue since Windows 95 was not a 16-bit OS, that was MS-DOS. For 16-bit DOS apps there's virtualization things like DOSbox or even HW emulators.
If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.
A lot of software is for x64 only.
If Rosetta 2 goes away, Parallels' support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels is not neglected software either.
This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.
https://docs.parallels.com/parallels-desktop-developers-guid...
Rosetta 2 never supported emulating a full VM, only individual applications.
https://www.parallels.com/blogs/parallels-desktop-20-2-0/
Nevertheless, running x64 software including Docker containers on aarch64 VMs does use Rosetta. There's still a significant valid use case that has nothing to do with neglected software.
Edited my post above. Thanks for the correction.
It would be different if the feature wasn't popular at all but that doesn't seem to be the case.
Apple doesn't want to maintain it forever, and a handful of legacy apps will never be bothered to update to native Apple Silicon support unless it means losing access to their user base. Apple has given them plenty of time to do it naturally, and now Apple is giving them a stronger reason and a couple more years to get it done. Apple is not randomly discontinuing it with no notice; two years is plenty of time for maintained software to get over the finish line.
At the end of the day, Apple doesn't want to pay to maintain this compatibility layer for forever, and Apple's customers will have a better experience in the long run if the software they are using is not running through an extra translation layer.
There will always be some niche users who want this feature to remain forever, but it's clearly not a significant enough percentage of users for Apple to be worried about that, or else Apple would maintain it forever.
1. Open the Apple menu and choose "About This Mac."
2. In the resulting window, click the "More Info..." button. This will open the System Settings window.
3. Scroll to the bottom of that window and click "System Report."
4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all apps that are x86 will be listed with the kind, "Intel."
1. Go into Activity Monitor
2. From the CPU or memory tab, look at the “Kind” column. It’ll either say “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
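If you'd rather check from the command line, the same information is available there too; the grep context count below is a heuristic, and the app path is just an example:

  # List installed apps whose Kind is Intel (same data as System Report);
  # -B 8 prints the preceding lines so the app name stays visible.
  system_profiler SPApplicationsDataType | grep -B 8 "Kind: Intel"

  # Or inspect a single app's executable directly:
  lipo -archs "/Applications/Example.app/Contents/MacOS/Example"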
I'm super aware of the issues involved--I oversaw the transition from PPC to Intel at a university back in the day, using OG Rosetta. Even then, we had users who would only stop using their PPC apps when you took them from their cold, dead hands.
There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.
My naive assumption is that, by the time 2027 comes around, they might have some sort of slower, software-only emulation that is on par with, say, Rosetta performance on the M1.
> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.
[1] https://github.com/apple/container -- uses Rosetta translation for x64 images.
Schematically "Rosetta 2" is multiple things:
- hardware support (e.g TSO)
- binary translation (AOT + JIT)
- fat binaries (dylibs, frameworks, executables)
- UI (inspector checkbox, arch(1) command, ...)
My bet is that, beyond the fancy high-level "Rosetta 2" branding, what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries+frameworks[0], while the rest remains.
[0]: or rather, heavily cull
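Two of those pieces are easy to poke at from a terminal today, assuming an Apple Silicon Mac with Rosetta installed (/bin/ls and /usr/bin/true are just convenient examples):

  # Fat binaries: most system binaries still ship with multiple slices.
  lipo -archs /bin/ls    # e.g. "x86_64 arm64e"

  # Binary translation + arch(1): force the x86_64 slice through Rosetta.
  arch -x86_64 /usr/bin/true && echo "translated run succeeded"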
There is hardware acceleration in place that only exists to, as you just stated, give it acceptable performance.
It does take up die space, but they're going to keep it around, because what they've actually decided is to reduce the set of applications that Rosetta 2 (and the hardware that exists only for it) will support.
So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.
The one I have my eye on is Minecraft. While not mission-critical in any way, they were fairly quick to update the game itself but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It's almost like the devs installed Rosetta 2 and don't realize their launcher is using it.
It happens to be ok for me as a SWE with basic home use, i.e. their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with endless legacy support, but at least provide some kind of emulation, even if it's slow.
If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.
The problem I have with it is Apple unilaterally deciding when support ends. I don't see the harm in no longer supporting it but leaving it available for legacy use: no guarantee that anything will work, and no support offered. They've done this with their hardware before, but here it's just a cudgel to force devs to update their apps.
It’s really unclear what it means to support old games but not old apps in general.
I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).
So then why say only games when the minimum to support the games probably covers a lot of non games too?
I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?
That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.
Bear in mind that a large chunk of Mac gaming right now that needs translation are windows games translated via crossover.
So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.
I’m assuming Apple isn’t going to arbitrarily restrict what runs but will remove things to just the subset that they believe are needed for games such that other stuff just implicitly won’t work.
I grant it’s probably possible to do, but I think that is a lot more work and more error prone than just continuing to ship the major frameworks as they were.
From Apple’s perspective I’m sure they have a few big goals here:
1. Encourage anyone who wants to continue offering software on Mac to update their builds to include arm64.
2. Reduce download size, on disk size, and memory use of macOS.
3. Reduce QA burden of testing ancient 3rd party software.
These are also the same motivations Apple had when they eliminated 32 bit Intel and when they eliminated Rosetta 1, but they were criticized especially for leaving behind game libraries.
Arguably, arbitrarily restricting what runs gets them the biggest slice of their goals with the minimum work. Devs are given the stick. People typically only play 1 game at a time and then quit it, so there isn’t a bunch of Intel code in RAM all the time because of a few small apps hanging out, and they have less to test because it’s a finite set of games. It just will chafe because if they do that then you know that some unblessed software could run but Apple is just preventing it to make their lives easier.
They already have the frameworks supporting intel. They can just start pruning away.
Some teams will draw the short straw of what needs to continue being supported, but it’s likely a very small subset of what they already maintain today.
And then the next question is: why? It's not like they've ever promised much compatibility for old software on new macOS. Why not let it be just best-effort: if it runs, it runs?
It's a myth that Snow Leopard was a bug-fix release. Mac OS X 10.6.0 was much buggier than 10.5.8 and indeed brought several new severe bugs. However, Mac OS X 10.6 received two years of minor bug-fix updates afterward, which eventually made it the OS that people reminisce about now.
Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
> Mac OS X 10.6.0 was much buggier than 10.5.8
Somebody who worked on Snow Leopard has already disagreed with you here about those things:
> As the person who personally ran 10.6 v1.1 at Apple (and 10.5.8), you are wrong(ish).
> Snow Leopard's stated goal internally was reducing bugs and increasing quality. If you wanted to ship a feature you had to get explicit approval. In feature releases it was bottom up "here is what we are planning to ship" and in Snow Leopard it was top down "can we ship this?".
> During that time period my team and I triaged every single Mac OS X bug coming into the company every morning. Trust me, SL was of higher quality than Leopard.
— https://news.ycombinator.com/item?id=43431675#43439348
> Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
I don’t think the schedule matters. They just over-commit every time. I said elsewhere:
> [Apple] were never building and have never built software at a sustainable pace, even before the yearly cadence. They race ahead with tech debt then never pay it off, so the problem gets progressively worse.
> A while back, that merely manifested as more and more defects over time.
> More recently, they began failing to ship on time and started pre-announcing features that would ship later.
> And now they’ve progressed to failing to ship on time, pre-announcing features that would ship later, and then failing to ship those features later.
> This is not the yearly cadence. This is consistently committing to more than they are capable of, which results in linear growth of tech debt, which results in rising defects and lower productivity over time. It would happen with any cadence.
It's instructive to read the entire thread, not just the few sentences you quoted. For example, that person later admits, "So yeah, if you are comparing the most stable polished/fixed/stagnant last major version with the brand new 1.0 major version branch, the newer major is going to be buggier. That would be the case with every y.0 vs x.8."
> I don’t think the schedule matters. They just over-commit every time.
That's a distinction without a difference. Apple has committed to releasing major OS updates every year on schedule. That's a recipe for over-commitment, because they need to produce enough changes to market each one as a major release.
The "no new features" gimmick of Snow Leopard was a marketing lie but was also unique. It's a gimmick that Apple pulled only once, and it couldn't be repeated frequently by Apple without making a mockery of the whole annual schedule. Maybe they could do it a second time now, but in general the annual schedule is still a major problem for a number of reasons.
It should also be noted that Snow Leopard itself took 2 years to produce after Leopard.
Snow Leopard brought a huge number of under-the-covers features. It was a massive release. The only reason it had that marketing was that they didn't have a ton of user-facing stuff to show.
lapcat loves his straw man about OS X 10.6.0 having plenty of bugs, but that misses the point of Snow Leopard. Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs. But it fixed a bunch of stubborn bugs and architectural limitations, and the new bugs mostly got ironed out in a reasonable time frame. (Snow Leopard was probably one of the better examples of Apple practicing what they preach: cleaning out legacy code and modernizing the OS and bundled apps the way they usually want third-party developers to do to their own apps.)
Fixing architectural bugs is still fixing bugs—just at a deeper level than a rapid release schedule driven by marketable end-user features easily allows for.
There have actually been quite a few of those releases. Some of the California-themed updates have been practically indistinguishable from the previous versions. Of course Tahoe and Big Sur brought huge UI changes, but those are the exceptions, not the norm.
> focuses on deep-seated and long-standing issues under the hood
Which issues would those be, specifically?
> If the right thing for the OS in the long term is to replace an entire subsystem
Which subsystems need replacement? You claim that this is what people mean by wanting another Snow Leopard, but which subsystems do people want replaced?
> misses the point of Snow Leopard
I haven't missed the point of Snow Leopard. You're conflating two entirely different things: (1) the point of Snow Leopard as conceived by Apple in 2008-ish and (2) why people in 2025 look back fondly at Snow Leopard. My claim is that the fond memories are the result of the quality and stability that were themselves the result of 2 full years of bug fixes AFTER the initial release of Snow Leopard. Whereas the initial quality of Snow Leopard was not great, just like the initial quality of all major OS updates is not great. Major updates invariably make software buggier, and the quality comes only after much time spent refining the new stuff.
My contention is that the marketing lie of "no new features", which is naturally very memorable, is the reason that a lot of people associate Snow Leopard with bug fixes and quality, but that's not actually what 10.6.0 brought, and the quality came much later in time.
I'm not saying that Snow Leopard didn't bring valuable changes. I'm just saying that Snow Leopard existed in various stages over 2 years, and the high quality version of Snow Leopard that we remember fondly now is actually late-stage Snow Leopard, not early-stage Snow Leopard, and those 2 years of minor bug fix releases were crucial. Moreover, that's what we need now, a long series of minor bug fix updates, not any new major updates. The bug backlog has become a mountain.
> Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs.
Which is why I think it's very wrong to claim that people want "another Snow Leopard". Snow Leopard II released in 2026 would be much buggier than even macOS Tahoe, which is precisely what people do NOT want, a bunch more bugs.
> But it fixed a bunch of stubborn bugs
Which bugs exactly?
> Fixing architectural bugs is still fixing bugs
Which architectural bugs do you have in mind, or more relevantly, which architectural bugs do people in general have in mind when saying that they want another Snow Leopard?
Modern day Apple cannot. A bugfix-only release is not going to sell anything.
It feels like keeping it alive could really help long-term x64 support on Apple Silicon, even if Apple decides to move on.
I also think the current Native Instruments launcher, "Native Access", still requires Rosetta for installation :)))
-- EDIT --
or just move back to Windows, but I can't imagine it with the current state of AI bloat
‘We fully support the Studio.’
Edit: After hunting around without success, I’m now doubting my memory. I thought I could remember Jobs dismissively replying to a question about Adobe Flash that Apple supported flash (memory). Maybe I made that up?
I haven't dabbled with Hackintoshes in nearly a decade; I stepped away around the time iMessage started needing those extensive hacks to work. Things seemed to shift from driver/bootloader gaps to faking Apple hardware. Years earlier, I had an Asus Eee PC (remember "netbooks"?) that ran macOS without any major issues. I even built a machine that I believed I could hackintosh easily, though it never quite worked as well as I hoped.
The era of random companies selling pre-built Hackintoshes was so cool. Kids these days probably wouldn’t even believe it if you told them, like how Netflix used to actually send you a DVD in the mail. :)
I never liked the idea, either get Apple, or get one of the other OSes.
It was like getting a Fiat Coupe with a Ferrari logo.
I do have sympathy for those that still use this in their daily work flow, but also... this is Apple. This is how they have always rolled.
User mode emulation for PPC and Intel Mac apps.
This is simply not true.
Ok, then try to run a pre-compiled, M1-compatible macOS application on your new Sequoia system, such as https://github.com/rochus-keller/oberonsystem3/ or https://github.com/rochus-keller/leancreator/. It requires quite some tricks to get at least some applications running without Apple's benediction, but the tricks don't work for all such applications; and by the looks of it, they will also remove the last remaining workarounds in the future.
LeanCreator:
- Unzip archive
- Try double-click, security error
- Go to Privacy & Security and click "Open anyway"
- Try double-click again, it opens fine.
OberonSystem:
- Open DMG and copy all the files to a folder
- Try double-click, program opens fine but errors because of missing files.
- Used the instructions given in the README, running with './OberonSystem', and the program opens without errors.
macOS 26.0.1
Thanks. Unfortunately this no longer works on Sequoia; you first have to run "spctl --global-disable" in a terminal and then, within a few seconds, go to Privacy & Security and select the new option in the popup menu (which was not available before). See also https://news.ycombinator.com/item?id=41184553. That's what I meant by "tricks". And even then, some apps still didn't work. But fortunately I still have a Mac with an older OS version which I'm not going to upgrade.
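Depending on the macOS version, another trick worth trying before the spctl route is clearing the quarantine attribute on the app bundle (the path below is just an example):

  # Recursively remove the quarantine flag Gatekeeper checks on downloads:
  xattr -dr com.apple.quarantine /Applications/Example.app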
But this is another way for Apple to say "do not trust us for your gaming needs no matter what PR says".
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
> The system prevents you from mixing arm64 code and x86_64 code in the same process. Rosetta translation applies to an entire process, including all code modules that the process loads dynamically.
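A quick way to see that translation really is per-process, assuming an Apple Silicon Mac with Rosetta installed (and that the sysctl binary is universal, which system binaries generally are):

  # Apple exposes a sysctl reporting whether the current process is being
  # translated by Rosetta: 1 = translated, 0 = native.
  sysctl -n sysctl.proc_translated                           # 0 in a native shell
  arch -x86_64 /usr/sbin/sysctl -n sysctl.proc_translated    # 1 under Rosetta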
I've been using this VST from Arturia (Minimoog V) since they distributed it for free back in like 2011 or 2012, and it runs as well on my M1 Mac as it did on my previous Intel Macs. I mean, it's literally the same DMG from way back when, so there's no chance it isn't running under Rosetta, even though I run Ableton natively!
Not sure this will be of any help to my projects once Rosetta 2 gets sunsetted...
You would hope that apple would open source it, but they are one of the worst companies in the world for open sourcing things. Shame on all their engineers.
Whereas a good graphics card alone is still insane money.
Just a few days ago something updated, and now my virtual desktop switching behaves erratically. I press <Super>+<1>, it changes to desktop 1 with VS Code open, and immediately it starts typing "1" into VS Code. It seems to be a bug affecting all X applications. I fixed it for VS Code by making it run under Wayland, but now it doesn't draw a border around the VS Code window. Another irritation, and I have other X apps.
It works, it's free, I love it. But it's so not polished and it'll never be. I miss macOS polish, where basic things just work.
Funny, since iOS 26 my iPhone has been failing to bring up the screenshot UI half the time, completely broke Guided Access, and now I can't figure out how to close all tabs in Safari because all the buttons make no sense anymore.
Oh yeah and my battery life sucks now.
You should look into atomic Linux distros. They take getting used to but they’re awesome for being stable and easy to revert changes.
Stuff in Linux changes. Not quite as frequently, but it does change and in major ways that require significant amounts of relearning.
Example 1: audio
OSS -> ALSA -> random layers on top of ALSA -> PulseAudio -> PipeWire
Example 2: init
SysV -> OpenRC || runit || s6 || upstart -> systemd
Example 3: desktops
KDE 1/2/3 -> KDE 4/Plasma
GNOME 1 -> GNOME 2 -> GNOME3+
Example 4: networking
ifconfig -> ip
Example 5: display servers
XFree86 -> Xorg -> Wayland
Now, it's important to note that people were attempting to resolve real issues. The transitions weren't always clean, but the results are usually great. For example, moving to PipeWire is possibly the greatest advancement in Linux audio ever; Linux audio finally doesn't suck. XFree86 to Xorg was likewise great. For the last few years of X11, I usually didn't have to modify the config at all. I kind of don't care about init systems most of the time. The only major complaint about systemd is that disk I/O on embedded systems is kind of an issue, but things like Alpine are better there, and Alpine doesn't use systemd.
With that said, I think the real issue is that people dislike advancements that break things. Early in Pulse's life, people absolutely hated it. Early in Wayland's life people absolutely hated it too, but it wasn't the default, so no one complained. With Windows and macOS, stuff changes seemingly constantly and randomly and breaks things, so people hate it. Saying, however, that Linux doesn't change seems a little daft to me. It changes faster than anything else at the small scale, and different distributions take breaking changes at different rates.
Good job, Poettering.
And now I'm getting an Apple Silicon machine in a few months to replace my Intel Mac and I'm out of luck.