I think the issue stems from too many people making their living off reviews that require something exciting to get views. When updates are more evolution than revolution, it makes for a more boring article/video. I always worry that these types of responses will lead Apple to do silly things, like leaving old chips out there too long, or adding pointless features just so there is something new to talk about.
Also: incremental updates add up.
A 7% increase from one year to the next isn't a big deal, but +7%, +7%, +7%, … adds up by the time you finally come up for a tech refresh after 3-5 years.
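The arithmetic checks out (a quick sanity check, nothing assumed):

    # Compounding 7%/year gains over a typical refresh cycle
    for years in (3, 4, 5):
        gain = 1.07 ** years - 1
        print(f"after {years} years: +{gain:.0%}")
    # after 3 years: +23%
    # after 4 years: +31%
    # after 5 years: +40%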
I have 64GB of RAM in my MacBook Pro. I load a 48GB DuckDB database into RAM and run real-time, split-second, complex, ad-hoc analysis using Polars and Superset. Nothing like this was possible before unless I had a supercomputer.
The only x86 CPU that does this is the Xeon Max: https://www.intel.com/content/www/us/en/products/details/pro...
There are other possible solutions but they are expensive.
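For a concrete sense of the workflow the parent describes, here's a minimal sketch; the database file, table, and column names are invented for illustration:

    import duckdb

    # Open the on-disk database read-only; with 64GB of RAM the working
    # set stays resident in the page cache, so repeated scans are
    # effectively in-memory.
    con = duckdb.connect("analytics.duckdb", read_only=True)

    # Push the heavy aggregation down to DuckDB, pull the result out as
    # a Polars DataFrame (via Arrow).
    df = con.execute("""
        SELECT user_id, count(*) AS events, avg(latency_ms) AS avg_latency
        FROM events
        GROUP BY user_id
    """).pl()

    # Follow-up slicing/transforms stay in Polars for the interactive part.
    top = df.sort("events", descending=True).head(100)
    print(top)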
The problem with Macs, among other things, is the lack of ECC RAM.
We're using Macs as servers. But it's a small operation.
That's about 23% after 3 years and 40% after 5 years.
I don't think they appreciate the cost of redesigning and retooling. I echo your thoughts and hope Apple doesn't listen to this feedback. Imagine more expensive laptops because some people want more frequent design changes!
Perhaps it’s just a language slip, but how are people forced to upgrade every year? My experience is the opposite: iOS 15 is still supported[0] and my 2016 iPhone still lets me access the World Wide Web.
The pressure you're talking about comes instead from developers (like me) who keep implementing features and systems that are ever more CPU/GPU hungry.
[0] Security patched as recently as last month: https://news.ycombinator.com/item?id=45270108
While also not getting that they're NOT the target market.
For the person whose iPhone finally (after half a decade or more) falls out of major version support, a 5-6 generation jump in hardware is amazing.
They are the target market.
When I teach people how to talk to reporters I always emphasize this. If it’s the 10th time something happened, you need to explain it in terms of what’s -new- or your info won’t go beyond the pitch meeting.
That’s why your town’s street fair makes a big deal that it’s the 10th anniversary event. It’s “news” that you’ve hit a round number. That’s why Trump breaks the law in a little way before doing it in a big way… the second time isn’t interesting.
People who upgrade every year don't do it for technical needs. We're long past the times when phones were inadequate and yearly improvements were big leaps that made them less unusable.
Yearly phone upgrades are just about sporting the latest model; it symbolizes status. Or there's some deal where you can do it for close to no cost, which beats a long upgrade cycle, but I don't think "free upgrades" are common.
The capitalist class truly are leeches.
If ever there was a case of "be careful what you wish for" - whether it's the Touch Bar, deleting ports or the butterfly keyboard, a redesign isn't necessarily a positive.
When you used the Terminal app, there was literally a "man" button that would open the relevant man page (for whatever command you currently had typed) in a new window.
Actually an awesome feature if application authors got on board.
Making the power button part of the bar instead of a physical button sucked though.
If they had done that from the beginning, I think the reception to Touch Bar would have been a lot more positive.
The chips they did release in that time period were mostly minor revisions of the same architecture.
Apple was pretty clearly building chassis designs for the CPUs that Intel was promising to release, and those struggled with thermal management of the chips that Intel actually had on the market. And Apple got tired of waiting for Intel and having their hardware designs out of sync with the available chips.
An ironic mirror of the PowerPC era when every version of the G5 was struggling with high power consumption and heat generation when operated at any competitive frequency/performance level. The top end models like the 2.5GHz quad-G5 needed water cooling, consumed 250W when idle, and needed a 1kW PSU.
Intel's offering at the time was as revolutionary as the M-series chips.
These days they're still somewhat beholden to TSMC continuing to make progress on nodes etc, but I think they have a closer partnership and a lot more insight into that roadmap so they can keep their hardware plans in sync.
I’m sure Intel had some releases each year, but did they have the right ones to make it possible for Apple to release an update?
And then Skylake's successors, which were broadly the same as Skylake for about four years.
Meanwhile back in the pre-M1 days I remember stalking Mac rumor sites for months trying to make sure I wasn’t going to buy right before their once-in-a-blue-moon product refresh. You could buy a Mac and get most of its useful life before they upgraded the chip, if you timed it right, so an upgrade right after you bought was a real kick in the pants.
  it makes for a more boring article/video. I always worry that these types of responses will lead Apple to do silly things
The review ecosystem is really toxic in that regard, as manufacturers will cater to it.
We had the silly unboxing-video fad, and it meant gorgeous packaging flying in the face of recyclability and cost reduction.
I wonder if the glass backs and the utterly shiny but heavy and PITA-to-repair designs also come partly from there. A reviewer doesn't care much that it costs half the phone to repair the back panel.
"Maker" has a specific connotation, but technically it still fits what the GP described.
Examples include Apple, Samsung, Lenovo, etc etc.
Every car company in the world realized that yearly product updates were the way to go, and no one whines that this year's model isn't good enough to justify upgrading from the previous year.
The problem is that our hardware as we know it has lost a lot of its stretch. It used to be that we got 100% performance gains from one generation to the next. Then it became 50%, 30%... Like in the GPU market, the last generation that actually got me excited was the 1000 series (the 1070 specifically).
Now it's "boring" 10 to 15% upgrades per generation (if we don't count naming/pricing rearrangements).
When was the last time any of us went "hey, I am excited to potentially buy this tech, really"? The Apple M1 comes to mind, and that is 5 years ago.
Nvidia tried to push the whole ray tracing thing (a bit too early), but again, it's just an incremental update to graphics (as we already had a lot of tricks to simulate lighting effects with good performance). So again, kind of a boring gain if we look back.
Mobile gaming handhelds were thrilling, the Steam Deck... Then we got competitors but with high price tags, and the excitement faded. And now nobody blinks when a new generation gets released, because the CPU/iGPU gains are the same boring 15 to 20%... So who wants to put down 700 or 900 euros for a 15% gain?
What has really gotten you excited? Where you were just willing to throw money at something? AI? And we see the same issue with LLMs... what used to be a big step/gain has, in barely a year, gone from massive gains to incremental gains. 10% better on this benchmark, 5% better there... So it becomes boring (the GPT-5 launch and reaction, the Sora 2 launch and reaction).
> When updates are more evolution than revolution, it makes for a more boring article/video.
If you think about it, there is a reason why tech channels struggle and are even more clickbaity than ever. Those people live on views, so when the tech they follow/review is boring to the audience, they start pushing more and more clickbait. But that eventually burns the channels out.
Unfortunately, we have an entire industry that is designed around making parts smaller and smaller every generation to get those gains. And we have lost the ability to make large gains by making those parts smaller...
It's ironic, as we knew this was coming, and yet it seems nobody made any breakthrough at all. Quantum computing was a field that everybody knew had no road to general computing at home (materials issues).
So what is left is the same old: let's make the die a bit smaller, gain a bit, do some optimizing left and right, and call it a new product. But customers are getting product 2.1 marketed as "this is our product 3.0!!!! Buy buy"... and they can see it's just 2.1, 2.2, 2.3...
We are in a boring time because companies sat too darn long on their behinds, milking their existing products but never really figuring out how to make new products. I think the only one that took a risk was Intel years ago, and it blew up in their face.
So yes, unless some smart cookie comes up with a new invention that can revolutionize how we make chips (and that can be mass-produced), boring is the standard now. No matter how companies try to repackage it.
[I get your point; I just refuse to consider a ridiculous reskin no one asked for to be a “feature.”]
Worth noting Snow Leopard also had new features, most notably the App Store. But it was marketed as a performance upgrade. v26 / Tahoe’s new features (excluding the UI reskin) are comparably small, but it is instead a massive slowdown & bloat release :(
When Snow Leopard came out it was very buggy, and many apps simply did not run on it. I've been a Mac user since 1993, and I think it's the only version of macOS I ever downgraded from. Don't get me wrong, it eventually became rock solid, the apps I needed were eventually upgraded, and it became a great OS.
But let's not mistake MacOS 10.6.8 for MacOS 10.6.0. And maybe let's not compare macOS 26.0 to MacOS 10.6.8 either, it's not quite fair. Ever since Snow Leopard I've been waiting at least 6 months before upgrading macOS. I don't intend to change that rule anytime soon...
Sprinkle with crashes and bugs that are never fixed and charge a premium.
Go to Spotlight -> Type “Settings” -> Locate the settings -> In settings, go to Accessibility -> Wait no, it’s Mouse -> Gestures -> Activate the right-click.
^ That’s the experience for beginners. That screen should be in the installation wizard if Apple wants to make it optional. “Customize your mouse gestures”.
Side note: Rossmann has stopped talking about Apple because he is no longer focused on Apple repair and is turning his attention to other causes, not because of Apple's "repairability" changes, which are still a token gesture.
I'm already salivating at the thought of a foldable tablet in any form. But not at the thought of paying $3000 for one at current pricing.
There's definitely an adjustment period when moving to a new keyboard after a decade.
I can run local LLMs fine on my M1 Pro from 2021. I play games on it too. Why would I spend multiple thousands on an M(n) MacBook if there's no real reason to? It's not like when I upgraded from a 386DX to a Pentium.
I have a similar argument for phones right now. There are some AI-related reasons to upgrade, but there's not really a clear killer app yet besides frontier model chat apps. Why should I spend thousands of euros on an upgrade when I don't get anything significantly different for it?
You shouldn't and nobody is asking you to. Apple can sell their new computers to billions of prospective customers who wish to upgrade from x86 or buy their first computer.
- builds are noticeably faster on later chips as multicore performance has increased a lot. When I replaced my M1 MBP with an M4, builds in Xcode, cargo, and LaTeX (I'll switch to Typst one of these days, but haven't yet) took about 60% of the time they had previously. That adds up to real productivity gains
- when running e.g. qwen3 on LM Studio, I was getting 3-5 tok/s on the M1 and 10-15 on the M4, which to me at least crosses the fuzzy barrier between "interesting toy to tinker with sometimes" and "can actually use for real work"
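Those fuzzy tok/s numbers are easy to measure yourself. LM Studio exposes an OpenAI-compatible server on localhost:1234 by default; a rough sketch, with the model name standing in for whatever you have loaded:

    import time
    from openai import OpenAI

    # LM Studio's local server speaks the OpenAI API on port 1234 by default
    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    start = time.time()
    resp = client.chat.completions.create(
        model="qwen3",  # placeholder: whichever model you have loaded
        messages=[{"role": "user", "content": "Explain branch prediction."}],
        max_tokens=256,
    )
    elapsed = time.time() - start
    tokens = resp.usage.completion_tokens
    # Crude: includes prompt-processing time, so pure decode tok/s is a bit higher
    print(f"{tokens} tokens in {elapsed:.1f}s = {tokens / elapsed:.1f} tok/s")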
They tend to add lag in major OS releases. Gets people to consider refreshing their hardware. Just by sheer coincidence, they have a new model out this year! :-)
There is significant improvement from the M4 to the M5, but how much of it comes from TSMC and how much from Apple? They have exclusivity on the latest processes, so it's harder to compare with what Qualcomm or AMD is doing, for instance, but right now Strix Halo is basically on par with the M3~4, developed on the same node density.
On the other hardware parts, form factor has mostly stagnated, and the last big jump was the Vision Pro...
I own an AVP, and I agree. Now I bought it secondhand for half the price, so I acknowledge that necessarily means there is at least one counterparty out there who disagrees.
Using the AVP for one work day, once I got the right fit and optical inserts, was such an eye opener. It’s like using an ultraportable laptop after living an entire life with large CRT monitors & desktop rigs tied to an actual desk. An experience, btw, which I also lived through. It just radically opened my eyes to fresh new possibilities and interaction mechanisms I never before thought possible.
But at $3.5k? No sane company exec could have been serious in thinking that would take off.
Zoom calls with mandatory camera on were already barbaric; asking employees to strap on a headset for a team meeting sounds like a generally cruel idea to me.
Most of the people actually using it for daily work are using the Mac Virtual Display. I work on my couch or bed, touch typing on my MacBook while my entire vision is filled with a projected, wraparound virtual display.
Immensely productive. But I'm basically coding on my MacBook while using a $3.5k external monitor, just in an unusual form factor.
If it were won through design excellence, truly providing a better proposition, that would sweeten the pill; but as of now it would only be because the far better products come from a company everyone hates.
In a weird way, Meta has been good at balancing hardware lockdown, and I'd see a better future with them leading the pack and allowing better alternatives to come up along the way. Basically the same way the Quest allowed for exploration and extended the PCVR market enough for it to survive up to this point. That wouldn't happen with Apple dominating the field.
They also made that new wireless chip recently, the chips for the headphones, and some for the Vision Pro. The camera in the iPhone also gets a lot of attention, which takes a lot of hardware engineering. In the iPhone more generally we saw fairly big changes just a month or so ago with the new Pro phone and the Air. The Pro models of the MacBook and iPad are almost as thin as, if not thinner than, the Air line, which I’m sure took a considerable amount of work, to the point of making the Air branding a little silly.
These decisions IMHO fall on the hardware team, and they're not doing a good job. Meta's hardware team is arguably pulling more weight, as much as we can hate Meta for being Meta.
> headphones
Here again, the reception wasn't that great. The most recent AirPods Pro were a mixed bag, the AirPods Max had most of the flaws of the Vision Pro, and they didn't learn anything from it.
> camera
The best smartphone cameras aren't the iPhone's by far now; they're losing to the Chinese makers, but don't have to compete as the market is segmented.
> MacBook and iPad are almost as thin
I wouldn't put the relentless focus on thinness as a net positive though.
All in all I'm not saying they're slacking, I'm arguing they lost the plot on many fronts and their product design is lagging behind in many ways. Apple will stay the top dog by sheer money (even just keeping TSMC in their pocket) and inertia, but I wouldn't be praising their teams as much as you do.
- 5G connectivity
- WiFi 7
- Tandem OLED screen
- Better webcam
- FaceID
- Cheaper RAM (RAM is more important to me these days than CPU speed)
- More ports
- Better/cheaper monitors
- Make a proper tablet OS
- Maybe a touchscreen, but I really don't want one
just to get started
With a phone with a fingerprint scanner, I can have it unlocked as I pull it out of my pocket, and I don't have to bury my face in my phone, e.g. to pay. I can unlock it while it's sitting on a desk.
Similarly with the fingerprint scanner on the MacBooks, I don't need to have my face squarely in the center in front of the screen. It's a very bad experience unlocking an iPad Pro with FaceID, but I have no problem unlocking an iPad Air with TouchID.
But I think I'm in a minority here, so at least I can save some money when the long-rumored FaceID MacBook comes to fruition :D
But as a regular guy who just has a lot of files and tends to keep tons of browser tabs open... it really sucks that I'm in the situation of getting extorted for $3k of pure profit for Apple, or have to settle for subpar hardware from other companies (but at a reasonable price). Wasn't an issue when the RAM & SSD weren't soldered on, but now you can't upgrade them yourself.
I have no idea what the hip PC laptop is these days, is it still the Lenovo Carbon X1? I went to their website and picked the pre-configured laptop with the most RAM (32GB), best CPU, and 1TB SSD. This was $3k: https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadx1/t...
Roughly the same size and specs as the most expensive pre-configured MacBook Pro of the same screen size (the MBP has 36GB RAM, +4GB over the Lenovo, and a much better processor & GPU for $3.2k).
It's all market segmentation. Apple is just being upfront about it and giving you a clean, simple purchase page that shows the tradeoffs. Whereas Lenovo is using car salesman techniques to disorient you with a bewildering array of options and models all of which have decision paralysis-inducing tradeoffs not entirely in your favor.
I’m rather happy I don’t have to upgrade from my M1. More performance is nice, but making it the baseline to run an OS would just be silly.
I can’t imagine leaving Resolve to go back even though I still wayyyy prefer the FCPX UI.
Maybe you need AI, but maybe you just need some AI agent app that uses AppleScript under the hood.
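That's more plausible than it sounds, since AppleScript is reachable from anything that can shell out to osascript. A toy sketch of the plumbing such an agent app might use (the Finder query is just an example):

    import subprocess

    def run_applescript(script: str) -> str:
        """Run a snippet through osascript and return its stdout."""
        result = subprocess.run(
            ["osascript", "-e", script],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    # An agent would generate scripts like this from a natural-language request
    print(run_applescript('tell application "Finder" to get name of every window'))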
I'd rather have buttery smooth, secure, fast, and bug-free; let me do my work.
I still have to install a third party terminal like Kitty or Ghostty for basic, modern rendering.
Rectangle is great, Ghostty is great, I too install something to tweak the mouse speed/acceleration curve (don't remember which one).
Do we need all these bundled in? Generally a dedicated developer can make those much better than whatever they'd do in-house.
I'd say it would only be an issue if you cannot make an app that gives you the behavior you want because the OS is missing the necessary APIs and configuration toggles.
Every other OS I use does all of this built in, so yes.
Obviously, it’s fine to prefer another terminal app. I’ve personally been quite disappointed by the much-hyped alternatives to the default.
But what's "marketable"... well, I guess we need to drizzle whatever we come up with in AI. or douse it.
I also can’t snap windows, and Cmd-tab still can’t tab between different windows of the same application.
There’s lots more usability that can be improved IMO
If you want the OS with all the shit you do (and don't) need, then maybe Windows is for you. ;-)
The backtick thing is just a constant annoyance. My workflow is to open windows doing the things I want, and I want to quickly switch to the window with my next work item. Instead, I need to keep track of extra mental state and figure out if backtick is the right keystroke or if tab and then backtick is the right thing to do.
It's...fine. I'm thankful I have better options at home, but it's tolerable at work with a few third-party apps.
What things are you finding that aren’t that way?
Apologies that my memory fails me here! This was a few years ago, I only have my zsh history (and the name of a now-deleted script) to go by.
Infinite, just like in any complex UI. All the basic interaction primitives built into the OS are somewhat broken, from app/window management and text editing to keybindings and mouse gestures
1) Sign Nvidia's drivers again, at least for compute (there's no excuse)
2) Implement Vulkan 1.2 compliance (even Asahi did it, c'mon)
3) Stop using notifications to send me advertisements
3.1) Stop using native apps to display advertisement modals
4) Do not install subscription services on my machine by-default
5) Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
6) Document the APFS filesystem so the primary volume isn't inscrutable, akin to what M$ did for NTFS
If they're trying to get me to switch off Linux, those would be a nice start. I don't think any of that is too much to ask from a premium platform, but maybe my expectations are misaligned.
The de facto answer is Homebrew — even internally at Apple. They just can’t publicly say it without liability issues.
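For point 5 specifically, the usual stopgap is a couple of commands; these are standard Homebrew formulae, and the gnubin PATH trick (which shadows the BSD tools) is optional:

    # GNU userland via Homebrew
    brew install coreutils findutils gnu-sed grep

    # Optional: put the unprefixed GNU names (ls, sed, grep, ...) first in PATH
    export PATH="$(brew --prefix)/opt/coreutils/libexec/gnubin:$PATH"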
> If they're trying to get me to switch off Linux
It’s important to know that Apple is not trying to get you to switch from Linux. Converting “UNIX workstation” people was an effort of theirs circa 2001 but that marketing campaign is long over with.
Their targets are consumer, prosumers, and media/influencer people. They give app developers just enough attention to keep their App Store revenue healthy.
Plan your long-term computing needs accordingly. You’ll see what I mean in the next 12-24 months.
You're better off using natively built macOS Unix binaries, plus a VM or Docker.
I never noticed ads in notifications, unlike Windows, which is ad-infested everywhere now.
I agree that better GPU support would be nice, but also better Metal support in common open source would be nice, since I'm a laptop user.
They shipped something similar in macOS 26 - native Linux container support.
With Tahoe it's different: its ugliness is uncanny. I've just given up on it.
It totally sucks but you can see why they wouldn't ever bother with this.
I don't see that at all. I'm not unique. There are many Linux users, and we also tend to be technically competent power users with a significant influence in organizations. There is a usable port of Linux to Apple hardware now, created without support from Apple. What other cadre of people are both capable of and inspired to do such a thing? None, as far as I know.
If there were even tacit support, Apple could sell millions of MacBooks to us.
This has never been true since inception, depending on your definition of usable. Worse yet, the entire project was staffed by people with poor mental health seeking attention for using Rust and validation as "hacker" engineers. Once failure was more than apparent and the fame never materialized, these people will move on to the next attention-seeking and high profile project that they can latch onto with Rust.
Only the M1 and M2 were bootable and even then, not the entirety of hardware. No support for anything after M3 and we're already at M5. And then there's Apple actively antagonizing any and all attempts with hardware changes designed specifically to mitigate these hacks. Apple's latest business model is to weaponize vendor lock-in as a means of extracting the remaining cash left on their platform, assuming they aren't able to sell more devices.
> If there were even tacit support, Apple could sell millions of MacBooks to us.
No power users of Linux want anything to do with Apple. Any that want anything to do with Apple's hardware only want the performance gains and efficiency. Tacit support from a quality manufacturer that would embrace Linux without hostility will capture sales. Most of the Linux enthusiasts want a classic ThinkPad with the performance and battery life of the M5 Max. A nice display and keyboard would be the enterprise offering that could carry that company for the next generation. Apple had their chance.
Not really. Even if there were more, Apple wants its customers to use macOS. They don't want people using Linux.
That's probably a 4x markup, and the $200 to go from 256GB to 512GB is even worse.
Every time a user considers jumping from Windows but balks at the storage costs, that's thousands in potential revenue left on the table. I just can't believe it really makes economic sense for them, except in short-term cashflow terms.
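Back-of-the-envelope on that markup; the retail NVMe price here is my assumption, not a quote:

    # The $200 jump buys +256GB
    upgrade_usd, upgrade_gb = 200, 256
    apple_per_gb = upgrade_usd / upgrade_gb       # ~ $0.78/GB
    retail_per_gb = 0.08                          # assumed commodity NVMe price
    print(f"{apple_per_gb / retail_per_gb:.0f}x over retail")  # ~ 10x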
It's funny that my iPad has a more current CPU than my two laptops.
Yes and no. Sometimes Intel did not move as fast as Apple wanted, and sometimes Apple didn't feel like it. The Mac Pro (trash can and old cheese-grater) and the Mac mini (2012-2018) were especially neglected.
Today, the Mac Pro ships with the M2 Ultra, the Mac Studio ships with the M3 Ultra, and it's not certain whether the Mac mini and the iMac will get the M5 or will continue shipping with the M4 for the foreseeable future.
My M4 iPad Pro is amazing but feels totally overpowered for what it's capable of.
I guess what I'm saying is.......I don't need faster CPUs. I want longer battery life, 5G connectivity, WiFi 7, lighter weight, a better screen, a better keyboard, etc.
I guess it's odd that Apple spends so much time making faster computers when that is practically an already solved problem.
More performance (especially for local AI models) is always great, but I'm trying to imagine what I'd want out of a design change!
I think slightly thinner would be nice, but not if it runs hotter or throttles.
Smaller bezels on the screen maybe?
I'm one of those who liked the Touch Bar (because I think applications that labelled their shortcuts in the Touch Bar were awesome), so I think some innovation around things like that would be nice. But not if it compromises the perfect keyboard.
I do think MacOS would be improved with touchscreen support.
On the contrary, I appreciate the Mac UI not being forced into touch friendliness. The whitespace increase in Big Sur is already bad enough, at least to me.
I'm uncertain whether any M-series Mac will be performant enough (the M1/M2 Mac minis specifically), or whether there are features in the M3/M4/M5 architecture that make it worth my while to buy new.
Are these incremental updates actually massive in the model performance and latency space, or are they just as small or smaller?
I have an M4 and it is plenty fast enough. But honestly the local models are just not anywhere near the hosted models in quality, due to the lower parameter count, so I haven’t had much success yet.
  - Some developer buys a new laptop
  - Developer writes software (a browser)
  - When the software works "fast enough" on their new laptop, they ship it
  - The software was designed to work on the dev's new laptop, not my old laptop
  - Soon the software is too bloated to work on my old laptop
  - So I have to buy a new laptop to run the software
If our computers never got faster, we would still be able to do everything the same that we can do today. But we wouldn't have to put down a grand every couple years to replace a perfectly good machine.
If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
I have noticed what you mention over longer timescales (e.g. a decade). But it's mostly "flashy" software - games, trendy things... Which also includes many websites sadly - the minimum RAM usage for a mainstream website tab these days seems to be around 200MB.
Anecdata: My 12 year old desktop still runs Ubuntu+latest Firefox fine (granted, it probably wouldn't be happy with Windows, and laptops are generally weaker). Counter-anecdata: A friend's Mac Pro from many years ago can't run latest Safari and many other apps, so is quite useless.
I am so fed up with hearing this. I would love to optimise my code, but management will always prioritise features over optimisations because that is what drives sales. This has happened at almost every company I've worked at.
Also, more often than not, I have a huge problem even getting stuff working, and I have to wrangle co-workers who cannot do basic jobs, do not write tests, and in some cases, I've found, don't even run the code before submitting PRs. That code then gets merged because "it looks good" when there are obvious problems that I can spot, in some cases, literally from the other side of the room.
The solution to that is a few decades old: plug in a 3D rendering card. (Of course there's the whole system-bus issue, but that's largely solved by a bigger bus rather than a faster CPU and more system memory. 3D programs requiring more CPU/memory is largely software bloat.)
A few decades ago there was a lot of research into system-level parallel processing. The idea was to just add more machines to scale up processing power (if needed). But because machines got faster, there was less need for it, so the research was mostly abandoned. We would all be using distributed OSes today if it weren't for faster machines.
Name software that won’t run comfortably on my M1 MacBook Air, now 5 years old.
The new pretty stuff feels a lot less magical when it lags or the UI glitches out. Apple sells fluidity and a seamless user experience. They need those bug fixes and an obsessive attention to detail to deliver on what is expected of their products.
That's gonna be wild starting 2026, with the first implementations of RVA23, such as Tenstorrent Ascalon devboards TBA Q2.
That being said, I do kind of head-tilt at the folks screaming that this sort of “boring” cycle of hardware isn’t sustainable, that somehow, someone must create the next major improvement to justify all new spend or otherwise this is a worthless exercise. In reality, it’s always been the opposite: Moore’s Law wasn’t infinitely scalable, and anyone who suffered through the Pentium 4 era was painfully aware of its limitations. Sure, we can find other areas to scale (like going from clock speed to core counts, and core counts to core types), but Moore’s Law is not infallible or infinite; eventually, a plateau will be reached that cannot be overcome without serious R&D or a fundamental sea-change in the marketplace (like moving from x86 to ARM), often a combination of both.
Apple, at least, has the unenviable position of being among the first in addressing this challenge: how do you sell more products when power or efficiency gains are increasingly thin, year over year? Their approach has been to leverage services for recurring revenue and gradually slowing down product refreshes over time, while tempering expectations of massive gains for those product lines seeing yearly refreshes. I suspect that will be the norm for a lot of companies going forward, hence the drive to close walled gardens everywhere and lock-in customers (see also the Android sideloading discourse).
The hardware cycle at present is fairly boring, and I quite like it. My M1 iPad Pro and M1 Pro Macbook Pro dutifully serve me well, and I have no need to replace either until they break.
If you think this is a boring architecture, more power to you. It's not boring enough for me.
Valve is spending a lot of resources, and AFAIK so are all the AI companies in the Asian market.
There are plenty of people who want an open-source alternative that breaks the monopoly Nvidia has with CUDA.
They’re not open source, for sure. But even setting that aside, they don’t offer anything like CUDA for their system. Nobody is taking an honest stab at this.
https://triton-lang.org/main/python-api/triton.language.html
Mojo has support for Apple Silicon kernels: https://forum.modular.com/t/apple-silicon-gpu-support-in-moj...
Edit: Checked on Youtube. Yeah, Windows 7 seems to be fast enough on an Apple silicon Macbook in full emulated mode. For example: https://www.youtube.com/watch?v=B9zqfv54CzI
But did customers want it?
I'll leave it here, as the point is made.
A Macbook with some of the best processors available in a laptop with the battery life and thermal characteristics of an iPhone or iPad is a pretty compelling product for many people.
M5 has performance/watt below Panther Lake.
Is that really what you want?
I want a laptop that gives me amazing performance, thermals, build quality, and battery life. It’s gonna take a while to see what manufacturers will do with Panther Lake.
I got an M3 Pro Macbook Pro on clearance recently for $1,600, 16 inch screen brighter than any PC laptop's I've ever seen, that's the fastest computer I have ever used, hands down and it's 2 generations out of date already. OR I can have a PC gaming laptop where the fit and finish isn't as nice, where the screen is blurrier, the battery life maxes out at 4 hours if I do absolutely nothing with it, and any time I do anything of remote consequence the fans kick up and make it sound like it's trying to take off.
And that's without even taking into account the awful mess Windows is lately, especially around power management. It makes every laptop experience frustrating, with the same issues that were there when I was in fucking high school.
Like if you just hate Mac, fine, obviously a Mac is a bad fit for you then and I wouldn't try and tell you otherwise. But I absolutely reserve the right to giggle when those same people are turning their logical brains into pretzels to justify hating a Mac when it has utterly left the PC behind in all things apart from gaming.
I have an M4 Mac Mini on my desk. At full tilt it pulls 30W. It scores higher in benchmarks than my gaming PC. It cost less than my 4090 did on its own, and that's including an upgraded third-party iBoff storage upgrade.
Of course, trade offs and process size differences abound; the M4 is newer, I can pack way more RAM into my PC years after I built it. I can swap cards. I can add another internal SSD. It can handle different kinds of load better, but at a cost of FAR more power draw and heat, and its in a full tower case with 4 180mm fans moving air over it (enough airflow to flap papers around on my desk). It's huge. Lumbering. A compute golem, straining under the weight of its own appetite, coils whining at the load of amps coursing through them.
Meanwhile, at idle, my Mac mini uses less power than the monitors connected to it, and eats up most of the same tasks without ruffling its suit. At full tilt, it uses less power than my air purifier. It's preposterous how good it is for what it costs to buy and run. I don't even regret not getting the M4 Pro.
OTOH, if you replaced your 7800X3D with Raptor Lake, you'd get similar performance for less power.
That was the point I was trying to make: 2025 processors have flipped the script. x86 used to be the power hogs and apple M processors were efficient. In 2025, Apple chose to increase power to increase performance, Intel is the manufacturer with substantially increased efficiency in 2025.
Yes, Macs have incredible compute/watt, display quality, and design. However, I like to think of myself as logical, and I would not buy a Mac.
Given the choice between a M5 Mac and a latest-gen ThinkPad, I would not take the Mac. That is fine, and so are people who would do the opposite. We are just looking for different qualities in our computer.
It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that. (No Windows means that the battery life hit is relatively light. FWIW, there's no chance I would buy a computer locked down to Windows either.)
I also value non-commitment to a particular ecosystem so I prefer not to buy Apple, because I think a significant amount of the device's value is in how seamlessly it integrates with other Apple devices.
However, one day in the future when many of my beliefs have become "bought out", perhaps my priorities will change and I will go all in on the ecosystem. That's OK as well.
> It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that.
Genuine question: what do you mean, locked down? By default the Mac won't run unsigned software, but even today in macOS 26 that's not an unsolvable issue. I run all kinds of software not signed by Apple daily. There are further nuances, like sometimes if you want to install kernel-level stuff or tweak certain settings you have to disable SIP, which is definitely a bit of a faff, but that's a Google-able thing that any tech-literate person could accomplish inside of 30 minutes.
I would bow to the technical limitations, as you're rather locked to ARM64 compiled software, but I don't recall the last time I saw a piece of software getting current updates that doesn't include a binary for that.
Or I could drive across town and have a monitor today and pay $60 for the aluminum shell that hides dust better.
I hope that Intel does well in the future. It's better for us all if more than one company can push the boundaries on fabrication.
I also remember the days when the shoe was on the other foot. Motorola or IBM was going to put out a processor that would decimate Intel - it was always a year away. Meanwhile, Intel kept pushing the P6 architecture (Pentium Pro to Pentium 3) and then NetBurst (Pentium 4) and then Core. Apple keeps improving its M-series processors and single-core speed is up 80% since the M1 and 25% faster than the fastest desktop processor from AMD and 31% faster than the fastest desktop processor from Intel.
I'd love for Panther Lake to be amazing. It will put pressure on Apple to offer better performance for my dollar. Part of performance is how much CPU a company is willing to give me at a price point and what margins they'll accept. If an amazing Panther Lake pushes Apple to offer more cores at a cheaper price, that's a win for Apple users. If an amazing Panther Lake pushes Apple to offer 2nm processors sooner (at higher cost to them), that's a win for Apple users.
But I'm also skeptical of Intel. They kept promising 10nm for years and failed. They've done a bit better lately, but they've also stumbled a lot and they're way behind their roadmap. What kind of volume will we see for Panther Lake? What prices? It's hard to compare a hopeful product to something that actually exists today. Part of it isn't just whether Intel can make 18A chips, but how fast they can produce them. If most of Intel's laptop, desktop, and server processors in 2026 aren't 18A, then it isn't the same win. And before someone says "Apple is just a niche manufacturer," they aren't anymore. Apple is making CPUs for every iPhone in addition to Macs, so it has to be able to get CPUs manufactured at a very high scale - around the same scale as Intel's CPU market.
I hope Intel can do wonderfully, but given how much Intel has overpromised and underdelivered, I'm definitely not taking their word for it.
I want Intel to catch up this month. And then next month I want AMD to overtake them. And then ARM to make them all look slow. And then Apple to show them how it's done.
The absolute last thing I'd want is for Apple to have special magic chips that nobody else even comes close to.
But it’s like a margin call. Everything is great until it completely sucks. Of course a lot of that comes down to TSMC. So if Apple falls it’s likely others will too.
Volume takes time. That's why we're seeing 2026. And before someone says "that just gives Apple an advantage because they're smaller," Apple is shipping a comparable volume of CPUs - and they're doing basically all their volume on the latest fabrication tech.
There are no benchmarked samples yet.
I'd love Intel to do well with this, but Intel has disappointed before.
Comet Lake, Elkhart Lake, Cooper Lake, Rocket Lake, Alder Lake, Raptor Lake, Meteor Lake.
Though it sounds like it won't be a 400W desktop part at least.