OPs article heavily covers Vertex Amplification. I didn't realize VisionOS used Vertex Amplification for this, but it makes sense since the hardware supports it.
For those interested in Mesh Shading (a few good videos released this week [1][2][3]), Vertex Amplification is a key tech for Mesh Shading, where one writes Object/Mesh functions (the names in the Metal API; other APIs call them Task/Mesh/Amplification shaders). Introduced by Nvidia in 2018 and only widely available for the last couple of years, Vertex Amplification was the first time a GPU could create vertices (or "destroy" them by not emitting anything), versus consuming fixed mesh inputs. It's so cool and powerful, and a different way of thinking about the pipeline.
This article shows the same concept, but in vertex shaders for multiple render targets. Even if you never make a Vision app, it could be a worthwhile read to further understand this architecture. I've spent a few months in Metal Mesh Shading and hadn't realized this application of it at all.
[1] https://news.ycombinator.com/item?id=41839190
[2] https://www.youtube.com/watch?v=3EMdMD1PsgY
[3] https://www.youtube.com/watch?v=EtX7WnFhxtQ (good explanation and demo)
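For those curious what the object/mesh pipeline actually looks like, here's a minimal MSL sketch (not from the article — the payload struct, function names, and meshlet limits are made up for illustration, and the actual meshlet fetch is elided):

```metal
#include <metal_stdlib>
using namespace metal;

// Object ("task") stage: decides how many mesh threadgroups to launch
// and hands each one a small payload.
struct payload_t { uint meshletIndex; };

[[object]]
void objectMain(object_data payload_t &payload [[payload]],
                mesh_grid_properties grid,
                uint tid [[thread_position_in_grid]])
{
    payload.meshletIndex = tid;
    // Culling happens here: launching zero threadgroups "destroys"
    // the geometry entirely, before any vertex work runs.
    grid.set_threadgroups_per_grid(uint3(1, 1, 1));
}

// Mesh stage: emits vertices and primitives programmatically instead of
// consuming a fixed vertex buffer layout.
struct v_out { float4 position [[position]]; };
using tri_mesh_t = mesh<v_out, void, /*maxVerts*/ 64, /*maxPrims*/ 126,
                        topology::triangle>;

[[mesh]]
void meshMain(tri_mesh_t outMesh,
              const object_data payload_t &payload [[payload]])
{
    // ...fetch meshlet data via payload.meshletIndex, then:
    // outMesh.set_vertex(i, v); outMesh.set_index(i, idx);
    // outMesh.set_primitive_count(n);
}
```

The point of the thread above in code form: the mesh stage invents its vertex count at runtime, and the object stage can emit nothing at all.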
Your description is of mesh shaders, but an amplification shader can basically reuse vertex data from a vertex shader pass without the use of a mesh shader.
I was trying to see if the other graphics APIs (Vulkan, DirectX) have this vertex-amplification-in-a-vertex-shader feature, but it doesn't seem so? Maybe it was easier for Apple to inject the concept into Metal (an advantage of controlling the whole stack).
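For reference, this is roughly what it looks like on the Metal side — a plain vertex function that runs once per amplification (e.g. once per eye), no mesh shader involved. A hand-written sketch, not from the article; buffer layout and matrix setup are assumed:

```metal
#include <metal_stdlib>
using namespace metal;

struct v_out {
    float4 position [[position]];
    uint   layer    [[render_target_array_index]];
};

// With vertex amplification enabled, this function is invoked once per
// amplification for each input vertex; [[amplification_id]] tells you
// which copy you are (0 = left eye, 1 = right eye, say).
vertex v_out ampVertex(uint vid [[vertex_id]],
                       uint amp [[amplification_id]],
                       constant float4x4 *viewProj  [[buffer(0)]],
                       const device packed_float3 *positions [[buffer(1)]])
{
    v_out out;
    out.position = viewProj[amp] * float4(float3(positions[vid]), 1.0);
    out.layer = amp; // simplification: route each copy to its own slice
    return out;
}
```

On the host side you call `setVertexAmplificationCount(_:viewMappings:)` on the render command encoder; in real code the per-view render target / viewport routing usually goes through `MTLVertexAmplificationViewMapping` rather than writing the slice index directly as above.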
> The [M3] GPU is faster and more efficient, and introduces a new technology called Dynamic Caching, while bringing new rendering features like hardware-accelerated ray tracing and mesh shading to Mac for the first time.
https://www.apple.com/ne/newsroom/2023/10/apple-unveils-m3-m...
That means the M2 (which is inside the Apple Vision Pro) must have, at best, some sort of partial mesh shading support.
The GPU family Apple7 was the first with mesh shader support per the feature set tables [1], but as you noted, it might be software fallbacks down there.
I just checked with my Vision Pro and I get MTLGPUFamily [2] up to apple8, which corresponds to other M2 devices. I probed a max amplification count of 8.
[1] https://developer.apple.com/metal/Metal-Feature-Set-Tables.p...
[2] https://developer.apple.com/documentation/metal/mtlgpufamily
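In case anyone wants to reproduce the probe, something like this Swift snippet should do it (needs an actual Metal device to run; the family list and the brute-force amplification loop are just my way of probing, not an official recipe):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

// Walk the Apple GPU families from newest to oldest and report the
// highest one this device claims.
let families: [(MTLGPUFamily, String)] = [
    (.apple9, "apple9"), (.apple8, "apple8"), (.apple7, "apple7"),
    (.apple6, "apple6"), (.apple5, "apple5"),
]
for (family, name) in families where device.supportsFamily(family) {
    print("Highest supported Apple family: \(name)")
    break
}

// supportsVertexAmplificationCount(_:) reports whether a given
// amplification factor is available, so just count upward.
var maxAmp = 1
while device.supportsVertexAmplificationCount(maxAmp + 1) {
    maxAmp += 1
}
print("Max vertex amplification count: \(maxAmp)")
```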
Don't bother developing for this platform. The customer base is minuscule.
I know that doesn't pay the bills, but I appreciate the hard work you do.
If it's good enough for me, it's probably already worth my time—though https://xkcd.com/1319/ comes to mind
Then maybe some subset of the population that is closer in interests to me would also find it useful, in which case the app might get more traction. That'd be nice, but not a necessary requirement for my definition of success
The big issue here is justifying spending thousands of dollars on the product. I'd go for it at $500 or so. Definitely not $2k or $5k.
Your advice makes sense in general but doesn’t apply to this specific case.
All I'm saying is "don't build an app for the masses if the market is small. build something that solves a real need that you have personally identified"
I released a version of my existing app because customers requested it. I had no independent interest in the platform.
> instead of solving a problem he identified would be best tackled with a Vision Pro than with a desktop app.
I can't think of any problems that would be best tackled with a Vision Pro than with a desktop app.
Then you also better hope Apple permits you to use the APIs you need to build it.
(Just a guy who has seen all proprietary APIs in this space being left in the dust when a standard arrived)
A good hockey player doesn't skate to where the puck is. He skates to where it will be.
I'm not a hockey player.
> He skates to where it will be.
The time scale for hockey players is seconds, not years.
This is such a useless cliché.
Most developers have very limited resources. Apple can afford to invest in pipe dreams (reportedly they spent billions of dollars on the now-cancelled car project), but I certainly can't.
But, a good hockey player doesn’t skate in the direction that absolutely nobody else is heading, somewhere that might even be outside the rink (in the sense that we haven’t really shown if a good UI for VR can actually be created, yet, so it might not even be a possible “part of the game,” so to speak).
That doesn't make sense in this context. Unless you consider Meta, Loft Dynamics, Valve, HTC, and dozens of other companies to be "absolutely nobody."
A better business model would probably be to pay people a couple hundred dollars to take their headsets. At least that might result in some install base.
And yes I'm talking about most of the "native apps".
https://www.uploadvr.com/non-pro-apple-vision-headset-report...
Meta is having enough trouble attracting developers with hardware that's an order of magnitude cheaper and has two orders of magnitude more units in the wild. They got through the first hurdle of convincing people to try VR but keep falling at the second hurdle of user retention; headsets that are gathering dust in a closet don't make any money for developers.
Apple's decision to go all-in on hand tracking compounds the software problem even further, because the vast majority of VR software that has already been developed is designed around controllers with IMUs/buttons/sticks/triggers/haptics, and would be very difficult to port over to the Vision Pro's much more limited control scheme.
That same rumor mill has always said that an affordable consumer version is the long term goal.
Apple has started development on a consumer version of the tech in a glasses form factor several times over the years, only to decide the current tech still doesn't allow for what they want.
For instance, an iteration tethered to an iPhone made it to test production in 2019:
> Apple could begin production of its long-rumored augmented reality glasses as early as the end of this year, noted analyst Ming-Chi Kuo has said.
https://www.theverge.com/2019/3/8/18256256/apple-ar-glasses-...
They clearly haven’t executed it correctly yet. But at least they do have some hope of beating the bootstrapping problem that everybody else seems to have—they could try and get users first, and then I bet developers would quickly follow.
The Apple headset thing is $5000 and much bigger than a pair of sunglasses. I just don’t think the tech is here yet to make something that most people actually want. So nobody has the problem that Apple can solve yet: good enough hardware in search of useful apps.
No VR device has even gotten close to having to answer a question like “how is this useful” because the current janky hardware is only appealing to those of us who are happy to just play games, haha.
So that's how they want to kickstart the app ecosystem: by not doing anything.
Any non-ported app with any non-trivial functionality is likely to break in both obvious and subtle ways, especially around UX. Who's gonna support the app on the platform and deal with customer complaints? Apple? You? Or the dev, who might not even have the device to debug it on?
The main roadblock would be OS support, and Meta already has that. They are of course still facing another big roadblock, which is the store.
They just announced they're doing that: https://developers.meta.com/horizon/blog/building-2d-apps-on...
AFAICT there won't be Play Store support though, so devs will have to publish their 2D Android apps specifically to the Quest store.
I maintain an Android app which doesn't require Play Services. Might as well get it on the store if Meta is encouraging it
That said, visionOS can run iPad and iPhone apps unmodified. Meta will not bundle Google Play Services and a few other Android APIs, so APKs won’t be publishable on their store without some amount of work to use their alternate SDK.
It's just not officially supported, since they have been fighting with Google for years now. Meta claims that Google is the one that doesn't want to see the Play Store on the platform.
Developers have to manually check a box to enable running the iPad version of their app on visionOS so it’s entirely out of Apple’s hands. Not sure why they took this approach
> Your iPhone and iPad apps will be made available to users on Apple Vision Pro, unless you edit your apps’ availability in the App Store Connect
https://developer.apple.com/help/app-store-connect/manage-yo...
And it makes sense for some devs to not provide their app on the platform if they use unsupported APIs or if their experience is otherwise degraded for some unknowable reason. Though the side effect is that Netflix and YouTube can just withhold apps that presumably would otherwise work.
Every step of development is a reason not to write software for this thing, and it doesn't exist in a vacuum, either.
I have trouble believing that even a successor to the Vision Pro would look significantly better than my $800 Pimax Crystal Light.
This is a fictitious number. Apple never had the manufacturing capacity to reach that so-called target.
> the reported shareholder near-revolts
Citation needed.
> On track to having sold less than 15% of their Y1 target (400,000 vs 3 MM)
All the reports online put actual vs. expected at more like 450-500k vs. 800k-1M, not 3 million.
> the reported shareholder near-revolts
The stock market's reaction to the Vision Pro announcement saw Apple's shares dip by 3% [1], indicating a mix of investor optimism and concern, not a revolt.
[1] https://www.wsj.com/tech/personal-tech/apple-vision-pro-soft...
Regarding the stock, Apple almost always dips by a significant amount during an event. It's very much traded as "buy on the hype, sell on the news."
3MM was never possible, not sure where you got these numbers.
Thanks for putting this up!
Metal is roughly equivalent to Vulkan, which runs on Windows and Linux. But Apple just had to Think Different. So all the major rendering libraries either have a layer that abstracts over Vulkan and Metal, or they don't support macOS at all.
That's what wgpu does. If it were not for Apple ignoring the standard, much of that would be unnecessary. It adds so much excess baggage that the consensus is to go direct to Vulkan or DX12 and blow off macOS support.
The funny thing is that with the rise of Proton, and the Steam Deck driving its development, there's essentially no reason anymore for a game developer to target any desktop PC platform except x86_64 Portable Executable files using DirectX, because Proton will handle the port with nearly zero effort. Even on ARM64 devices like Snapdragon, the CPU is often not the limiting factor (and binary-translation overhead is only in the realm of 15%), so emulation for games is totally viable if the GPU can handle the load.
Windows/Xbox uses Direct3D, Mac/iDevices use Metal, PlayStation uses GNM/GNMX, Nintendo Switch uses NVN.
Compared to DirectX, Vulkan is barely used at all in gaming, except on Linux and the most recent subset of Android devices.
If it weren't for Proton's DirectX emulation running on top of Vulkan, Vulkan would be largely irrelevant, from a "widely used in real world gaming" perspective.
Metal is 1.7 years older than Vulkan. (June 2014 versus February 2016)
Metal has a couple of 'frame transient' objects (like MTLRenderPassDescriptor and MTLRenderCommandEncoder) that get a new instance created each frame, and I guess the main job of the autoreleasepool is to clean up those transient objects (along with any other short-lived junk that might be returned from Metal API methods).
And my guess for why this is still needed in the age of ARC: ARC has a kind of 'autorelease blindness', i.e. it cannot figure out at compile time which objects are registered with autorelease pools (especially when objects are passed across DLL boundaries); it can only add retain/release calls on top. Just speculation on my part, though.
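Concretely, the pattern looks like this in Swift — wrap each frame's encoding in an explicit `autoreleasepool` so the frame transients get reclaimed per frame instead of piling up (a generic sketch of the common pattern, not code from the article; the clear-only render pass is just a placeholder):

```swift
import Metal
import QuartzCore

// Needs a real Metal device/layer to run; this just shows the shape.
func renderFrame(layer: CAMetalLayer, queue: MTLCommandQueue) {
    autoreleasepool {
        // nextDrawable() and the objects below may be autoreleased;
        // without the explicit pool they'd linger until the thread's
        // outer pool drains, which can be "never" on a render thread.
        guard let drawable = layer.nextDrawable() else { return }

        let desc = MTLRenderPassDescriptor()
        desc.colorAttachments[0].texture = drawable.texture
        desc.colorAttachments[0].loadAction = .clear
        desc.colorAttachments[0].storeAction = .store

        guard let cmdBuf = queue.makeCommandBuffer(),
              let encoder = cmdBuf.makeRenderCommandEncoder(descriptor: desc)
        else { return }

        // ... draw calls ...
        encoder.endEncoding()
        cmdBuf.present(drawable)
        cmdBuf.commit()
    } // transients registered with the pool are released here, every frame
}
```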