> In a productivity boom such as this, a rise in unemployment may not indicate increased slack. As such, our normal demand-side monetary policy may not be able to ameliorate an AI-caused unemployment spell without also increasing inflationary pressure.
I'm not saying AI isn't impacting the employment market, but this statement isn't really about AI so much as it is an advance warning that inflationary monetary policy is unavoidable if all the people saying that software engineering is dead are correct.
AI is helping produce more software, right? Including more software that is for sale?[1] Or more online services that are for sale?
[1] One of the interesting things here is going to be liability. You can vibecode an app. You can throw together a corporation to sell it. But if it malfunctions and causes damage, your thrown-together corporation won't have the resources to pay for it. Yeah, you can just have the company declare bankruptcy and walk away, leaving the user high and dry.
After that happens a few times, the commercial market for vibecoded apps may get kind of thin. In fact, the market for software sold by any kind of startup may also get thin.
Today I am planning an exit strategy. Anyone else considering what they’ll do in a post-AI software engineering world?
If the doom really comes to pass then what future is there for us? I fear a life of impecunious servitude and poverty more than death.
I don't have time to post significantly about it but I'd love to trade thoughts and figure this out.
Email?
A lot of companies will use the speed of AI to wallpaper over the fact that they don't know what to make or how to prioritize.
AI can code - but can it understand what is missing from the organisation and persuade it to change - to spend years at industry conferences?
Look at Starliner. NASA just announced that Boeing stuffed up, not with an engineering mistake (no one still knows exactly what broke) but because the whole organisation is so screwed up and so political that NASA just doesn’t believe Boeing can fix it.
AI cannot fix our turf wars. That’s not a problem of intelligence (humans know going to war is bad, but Putin still exists). It’s the systems we live in, and work in.
Changing those is feasible - once they are coded, transparent and open to inspection in a democracy.
We need programmable introspective systems of organisation - democracies in other words.
The engineering was not the problem - the problem was that the organisation was more or less toxic and incapable of doing engineering. Writing code that won’t get used because of politics is a job we and AI can both do.
Another way to frame it is what would you do in a low trust environment where corporations and the government were not to be trusted. You would likely avoid things like bubble bursting AI stock investments, jostling for rank in a company, etc.
But writing code was never much more than 35-40% of my job while working for companies/others. Most of my time has always gone towards communication, design, and validation. None of those three is particularly vulnerable to mass AI automation except in the most trivial of scenarios, and I have not seen evidence that this has changed in over 2 years of so-called "improvements".
My "exit plan" ultimately is to be one of the engineers capable of using these tools to scale my impact accordingly so I can focus on higher order problem solving, which ultimately is what is most valuable. I would be more concerned if I was in marketing/sales or frankly middle management.
Maybe this is just "copium" on my part, who knows, this sector is moving fast.
As for what happens after that, I'd really prefer not to have to do physical labor or trades. And it doesn't seem like any other white collar occupation is really going to be insulated, other than perhaps medical. So my strategy is to basically wait and see what society looks like after the transition and I guess I'll try and decide on something then?
https://www.reuters.com/world/us/us-third-quarter-productivi...
Productivity up 5%.
Productivity per dollar up 3% in Q2 and 2% in Q3, even as labor costs rose 1%.
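The relationship behind those numbers can be sketched with the standard BLS definition: unit labor costs are hourly compensation divided by labor productivity, so their growth rate is approximately compensation growth minus productivity growth. A minimal illustration (the figures in the example are hypothetical, not taken from the Reuters report):

```python
def unit_labor_cost_growth(compensation_growth_pct, productivity_growth_pct):
    """Approximate % change in unit labor costs.

    BLS defines unit labor costs as hourly compensation / labor
    productivity; for small rates, the growth of a ratio is roughly
    the numerator's growth minus the denominator's growth.
    """
    return compensation_growth_pct - productivity_growth_pct

# Illustrative only: if compensation rises 6% while productivity
# rises 5%, unit labor costs rise only about 1% - strong productivity
# growth absorbs most of the wage pressure.
print(unit_labor_cost_growth(6.0, 5.0))
```

This is why a productivity boom matters for the inflation argument upthread: the same wage growth produces less cost pressure when output per hour is rising quickly.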
https://www.stlouisfed.org/on-the-economy/2025/nov/state-gen...
> ... on average, industries with 1 percentage point higher time savings experienced 2.7 percentage points higher productivity growth relative to their prepandemic trend. We stress that this correlation cannot be interpreted as causal, and that labor productivity is determined by many factors. However, the current results are suggestive that generative AI may already be noticeably affecting industry-level productivity.
> People in much more important and powerful positions than her
I said "understanding," you said "power." There's a difference: presidents and CEOs say lots of dumb stuff.
It would be helpful if this were articulated in depth. It's used as a shibboleth alongside "productivity", but it's rarely followed with concrete details.
This isn't the first time that new technology has reshaped society, or even just the economy. How well were the results of prior shifts predicted ahead of time?
More importantly, are they planning to do anything about it?
https://www.washingtonpost.com/technology/2026/02/23/ai-econ...
This is about labor productivity, a standard national-level economic indicator (see https://www.bls.gov/news.release/pdf/prod2.pdf and https://fred.stlouisfed.org/series/OPHNFB) going up 4.9%, as reported in this article linked in TFA: https://www.reuters.com/world/us/us-third-quarter-productivi...
That’s it. An eroding tax base necessitates one of those or a combination.
How comforting. Sounds to me like "ZIRP won't fix this one folks, it's gonna take something other than money to fix what's coming."
The closest thing we've seen in terms of scope/velocity is probably the introduction of the web in the late 90s to the broader world. Very few jobs were killed by that, though, relatively speaking.
Today we use Luddite as an epithet, but they were right about the effect of automation on their jobs.