Say I have a startup that vibe codes “AI for real estate”. What about customer acquisition?
On the other hand, if I’m Zillow, why can’t I just throw a developer on the same feature and automatically have a customer base for it?
If you look at most of the YC-funded startups these days, they are just prompt engineers with no go-to-market strategy. Some don’t even have any technical people and are looking for “technical cofounders” they can underpay with a promise of equity that will statistically be meaningless.
Tech is dividing society and driving the wedge deeper. There is a huge population being thrown to the wayside by the high-speed tech highway, which means tech is getting more and more unreachable.
AI assistants are only going to make this worse, by removing the direct touch between users and the tech. Tech becomes simply unmanageable for the average person.
It's just like how you could do all the repairs on your bike as a kid, but you can't do the same for your car now. Tech never gets easier or more reachable.
RTX doesn't count to me either, because that's some bullcrap pushed by GPU manufacturers that requires the aforementioned upscaling and frame-generation techniques to fake being anywhere close to what GPU manufacturers want gamers to believe.
The generational gains haven’t been as great as in past generations, but it’s getting silly to claim that GPUs aren’t getting faster for gaming.
Intentionally ignoring frame generation and DLSS upscaling also feels petty. Using those features to get 150-200 fps at 4K is a genuinely amazing experience, even if the purists turn their noses up at it.
The used GPU market is relatively good at calibrating for relative gaming performance. If new GPUs weren’t actually faster then old GPUs wouldn’t be depreciating much. Yet you can pick up 3000 series GPUs very cheaply right now (except maybe the 3090 which is prized for its large VRAM, though still cheap). Even 4000 series are getting cheap.
Doing it for a whole screenful of pixels, for the majority of frames (with multi-frame generation) is even less of it.
It does help that he has a small screen and regular DPI. Seems like everyone wants to run with 4x the pixels in the same space, which needs about 4x the GPU.
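The arithmetic behind that "about 4x" is easy to check; a quick sketch, nothing more:

```python
# Quick check of the "4x the pixels" claim: raw pixel counts per frame.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0 -> roughly 4x the shading work per frame
```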
There was an article[1] going around about that recently, and I'm sure there are more, but it's also a trend I've seen first-hand. (I don't particularly care for the article's framing, I'm just linking to it to illustrate the underlying data.)
[1]: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...
Don't confuse technical deflation with the Osborne effect:
But at least with iPhones there is a deflationary effect, because Apple has, since the 3GS in 2009, kept the old phone around at a reduced price. For instance, my son wanted an iPhone 16 Plus. I told him to wait until the 17 was announced, and he bought one cheaper from T-Mobile.
Why not? Sounds like a pretty reasonable strategy.
> Nobody seems to be putting off buying GPUs
Many people are doing exactly that.
Now a new computer barely does anything faster for me.
I disagree with this statement. It has become simpler, provided you don't care about it actually being correct, you don't care whether you really have tests that test what you think you asked for, you don't care about security, and so on.
Building the same thing involves doing the things that LLMs have proved time and again they cannot do. Instead of writing it properly in the first place, you now need to look for the needle in the haystack: the subtle bug that was invariably inserted, every single time I tried to use them. That requires you to deeply understand the code anyway, which you would have gotten automatically (and more easily) if you had written the code yourself in the first place. Developing the same thing at the same level of quality is harder with an LLM.
And the "table stakes" stuff is exactly the thing I would not trust an LLM with for sure, because the risk of getting it wrong could potentially be fatal (to the company, not the dev. Depends on his boss' temperament) with those.
Let's say there are 10 subtasks that need to be done.
Let's say a human has 99% chance of getting each one of them right, by doing the proper testing etc. And let's say that the AI has a 95% chance of getting it right (being very generous here).
0.99^10 ≈ 0.90, so the human has a 90% chance of getting the whole thing to work properly. 0.95^10 ≈ 0.60: only a 60% chance. Almost a coin toss.
Even with a 98% per-task success rate, the compounded success rate still drops to about 82%.
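For anyone who wants to poke at the assumptions, here's a throwaway sketch of the compounding math (the per-step rates and the 10-step count are the made-up numbers from above):

```python
# Compounding per-step success rates: small per-step gaps widen fast.
def overall_success(per_step: float, steps: int = 10) -> float:
    """Probability that all `steps` independent subtasks succeed."""
    return per_step ** steps

for rate in (0.99, 0.98, 0.95):
    print(f"{rate:.0%} per step -> {overall_success(rate):.0%} overall")
# 99% -> 90%, 98% -> 82%, 95% -> 60%
```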
The thing is that LLMs aren't just "a little bit" worse than humans. In comparison they're cavemen.
> writing functioning application code has grown easier thanks to AI.
> It's getting easier and easier for startups to do stuff.
> Another answer might be to use the fact that software is becoming free and disposable to your advantage.
For me, the logical conclusion here is: don't build a software startup!
I left an AI startup to do tech consulting. What do I do? Build custom AI systems for clients. (Specifically clients that decided against going with startups' solutions.) Sometimes I build it for them, but I prefer to work with their own devs to teach them how to build it.
Fast forward 3+ years and we're going to see more everyday SMBs hiring a dev to just build them the stuff in-house that they were stuck paying vendors for. It won't happen everywhere. Painful enough problems and worthwhile enough solutions probably won't see much of a shift.
But startups that think the market will lap up whatever they have to offer as long as it looks and sounds slick may be in for a rude surprise.
You aren’t doing it to get customers; it’s for investors and maybe a decent acquisition.
I'm not so sure that's the reason. I mean, to believe LLMs replace engineers you first need to believe engineers spend the bulk of their time typing frantically, churning out code in greenfield projects. That's not compatible with reality. Although LLMs excel at generating new code from scratch, that scenario is the exception. Introducing significant changes to existing projects still requires long iterations, which ultimately consume more development time than rolling out the changes yourself.
The truth of the matter is that we are not observing an economic boom. The US is either stagnant or in a recession, and LLMs are not the cause. In an economic downturn you don't see spikes in demand for skilled workers.
The price charged for a frontier model is ~200x lower than two years ago, and the ones we are using now are much better, although measuring that, and by how much, is challenging. Building a "better than GPT-4" model is also vastly cheaper than building GPT-4 was... perhaps 1/100th?
I have the legal structure, I know my colleagues, I potentially have employees and more capacity.
The problem is not that a startup is starting after you; it's that you don't give yourself time to keep an eye on AI and don't leverage it when it's helpful.
We leverage AI and ML progress constantly and keep an eye on advances. Segment Anything? Yep, we use it. Claude? Yes sir!
Don't conflate easy with simple. I'd argue they are actually easier and far more complex.
Or, you know, technological improvements that increase the efficiency of production, or bountiful harvests, or generally anything else that suddenly expands supply at the current price level across the economy. Thankfully, we have mechanisms in place that keep prices inflating even when those unlikely events happen.
Anyway, WTF, economics communication has a huge problem. I've seen the article's explanation repeated in plenty of places, it's completely wrong and borderline nonsense.
The reason deflation is bad is not because it makes people postpone buying things. It's because some prices, like salaries or rent just refuse to go down. That causes rationing of those things.
The reverse of this is that high inflation tends to cause a lot of strikes, because salaries also refuse to go up, and very high levels of inflation require salary repricing every month or even every week.
It got old really quick having to negotiate with the boss every 6 months.
A common argument, but one that doesn't bear out in the absence of regulation enforcing it.
That is why we are all waiting to buy our first personal computers and our first cell phones.
Economists have managed to be ludicrous for a very long time and yet we still trust them.
Ugh. I don't like that kind of 'desktop' apps. Huge bloat with a blip of actual app.
Let’s say you have a fusion rocket and can hit 5% of the speed of light. You want to migrate to the stars for some reason.
So do you build a generational ship now, which is possible, or… do you wait?
Because if you build it now someone with a much better drive may just fly right past you at 20% the speed of light.
In this case the answer is to work it out under the assumption that there is no totally undiscovered major physics that would allow, say, FTL, and plot the curves of advancement against that.
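As a toy version of that plot (every number here is invented: an assumed 3%/year improvement in achievable speed, capped by known physics):

```python
# A minimal sketch of the "wait calculation": pick the departure year that
# minimizes arrival time, assuming cruise speed improves exponentially
# until it hits a physics-imposed ceiling. Illustrative numbers only.
DISTANCE_LY = 4.25   # e.g. Proxima Centauri, in light-years
V0 = 0.05            # speed available today, as a fraction of c
GROWTH = 0.03        # assumed 3%/year improvement in achievable speed
V_MAX = 0.20         # assumed ceiling from known physics

def arrival_year(depart_year: int) -> float:
    v = min(V0 * (1 + GROWTH) ** depart_year, V_MAX)
    return depart_year + DISTANCE_LY / v

best = min(range(200), key=arrival_year)
print(f"Depart in year {best}, arrive around year {arrival_year(best):.0f}")
# With these numbers: departing around year 31 beats both launching now
# (arrive ~year 85) and waiting for the full 20% c drive (arrive ~year 68).
```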
So can we do this with software? We have the progress of hardware, which is somewhat deterministic, and we know something about the progress of software from stats we can make via GitHub.
The software equivalent of someone discovering some “fantasy” physics and building a warp drive would be runaway self-improving AGI/ASI. I’d argue this is impossible for information theoretical reasons, but what if I’m wrong?
Maybe the time value of time is only increasing as we go.
Knowing that GTK $n-1 will soon be obsolete is enough reason to not put effort into learning it.
> Used to be, you had to find a customer in SO much pain that they'd settle for a point solution to their most painful problem, while you slowly built the rest of the stuff. Now, you can still do that one thing really well, but you can also quickly build a bunch of the table stakes features really fast, making it more of a no-brainer to adopt your product.
The answer to should we just sit around and wait for better technology is obviously no. We gain a lot of knowledge by building with what we have; builders now inform where technology improves. (The front page has an article about Voyager being a light day away...)
I think the more interesting question is what would happen if we induced some kind of 2% "technological inflation" - every year it gets harder to make anything. Would that push more orgs to build more things? Everyone pours everything they have into making products now because their resources will go less far next year.
Government bonds already do this for absolutely everything. If I can put my money in a guaranteed bond at X%/year then your startup that's a risky investment has to make much better returns to make it worth my while. That's why the stock market is always chasing growth.
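To make that concrete with made-up numbers (a toy comparison, not financial advice):

```python
# A risky bet has to clear the guaranteed compounding of a bond to be
# worth taking. All figures here are invented for illustration.
PRINCIPAL = 100_000
RISK_FREE = 0.045    # assumed guaranteed bond yield, 4.5%/year
YEARS = 5

bond_value = PRINCIPAL * (1 + RISK_FREE) ** YEARS   # ~$124,600, guaranteed
# Hypothetical startup stake: 20% chance of a 3x return, otherwise zero.
startup_ev = 0.20 * PRINCIPAL * 3                   # $60,000 expected value

print(f"Bond, guaranteed:  ${bond_value:,.0f}")
print(f"Startup, expected: ${startup_ev:,.0f}")  # loses to the bond despite the 3x upside
```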
Yeah, and it will be done by somebody else. I think this is the main problem, and if you get rid of it, you'll have a completely sensible strategy. I mean, there are many government contractors who, through corrupt connections, can guarantee that work will be awarded to them, and they very often do just that.