Though I agree that language is not intelligence, suggesting that the AI boom is only about LLMs, or that LLMs do not create value, is incredibly misleading.
I disagree with the premise that to justify the current investment, we must arrive at AGI.
Estimates put total AI investment at $1.5T.
Is it a lot? Sure.
But what is going to come out of it?
Faster and likely improved results from medical imaging. Lower cost and widespread ability to create images and videos, denting the multi-trillion dollar marketing industry. Self-driving and advanced driver assistance, lives saved, costs reduced. Improvements in education and low-cost tutoring available to more people.
Let's say these investments just break even with spend. Isn't that better for society?
I know people are going to say "but what about the radiologists?" We have a shortage of GPs and doctors to care for an aging population. These highly trained doctors can do more good elsewhere in the medical community.
What about the actors, directors, sound engineers, etc. in the media industry? It will likely shrink, but we can't ignore their expertise entirely, and the industry won't go away. A friend is a voice actor. He isn't being replaced, but he is getting less work, not because of the final production, but because in the pre-production phases they don't need his expertise when AI is "good enough".
The lens I look at this through is my grandfather, who developed film for a living. That job disappeared. Nobody cried out for the film developers when we made the shift to digital. We create more imagery now than ever before, and more people are employed in the imaging business than ever before.
As a (former) software engineer myself, do I believe AI will replace engineers? I think this is true only as much as packages replaced engineers. When I started programming, there weren't a ton of open-source packages, and there weren't a ton of tools like NPM or Cargo for managing them. I saw the transition from "write most of the code yourself" to "lego-brick" style programming. It didn't reduce the number of programmers; it increased what we could do and allowed us to do more interesting work than boilerplate.
It feels like a sort of smug escapism that ignores that, for many tasks, LLM output (however you want to define it) is enough. It may not replace humans or our thought entirely, but many otherwise human tasks simply do not require that. Instead of facing that reality, authors like this article's rejoice in the idea that our thought is special, our output unique and unmatched, and that the AI marketing fools' efforts are for naught because we cannot be replaced, all because they tried sticking LLM output into a very human-shaped box... if only it were that simple.