Nvidia Stock Crash Prediction
250 points
5 hours ago
| 34 comments
| entropicthoughts.com
| HN
_fat_santa
4 hours ago
[-]
This article goes more into the technical analysis of the stock than into the underlying business fundamentals that would lead to a stock dump.

My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.

The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years, and we're already seeing this as Google and others are extending their depreciation of GPUs to something like 5-7 years.

reply
agentcoops
2 hours ago
[-]
I hear your argument, but short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon. Of course I could easily be wrong, but regardless I think the most predictable cause for a drop in the NVIDIA price would be that the CHIPS act/recent decisions by the CCP lead a Chinese firm to bring to market a CUDA-compatible and reliable GPU at a fraction of the cost. It should be remembered that NVIDIA's /current/ value is based on their being locked out of their second-largest market (China), with no investor expectation of that changing in the future. Given the current geopolitical landscape, in the hypothetical case where a Chinese firm markets such a chip, we should expect that US firms would be prohibited from purchasing them, while it's less clear that Europeans or Saudis would be. Even so, if NVIDIA did not lower their prices at all, US firms would be at a tremendous cost disadvantage while their competitors would no longer have one with respect to compute.

All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.

reply
iLoveOncall
1 hour ago
[-]
> short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon

Or, you know, when LLMs don't pay off.

reply
selfhoster11
1 hour ago
[-]
They already are paying off. The nature of LLMs means that they will require expensive, fast hardware, which is a large capex.
reply
kortilla
1 hour ago
[-]
They aren’t yet because the big providers that paid for all of this GPU capacity aren’t profitable yet.

They continually leapfrog each other and shift customers around, which indicates that current capacity is already higher than what is required for what people actually pay for.

reply
MrDarcy
13 minutes ago
[-]
Google, Amazon, and Microsoft aren’t profitable?
reply
notyourwork
19 seconds ago
[-]
I assume the reference was that AI use cases are not profitable. Those companies are subsidizing them, and OpenAI/Grok are burning money.
reply
Forgeties79
31 minutes ago
[-]
Where? Who’s in the black?
reply
lairv
2 hours ago
[-]
NVIDIA stock tanked in 2025 when people learned that Google used TPUs to train Gemini, which everyone in the community has known since at least 2021. So I think it's very likely that NVIDIA stock could crash for non-rational reasons.

edit: 2025* not 2024

reply
Der_Einzige
2 minutes ago
[-]
Google did not use TPUs for literally every bit of compute that led to Gemini. GCP has millions of high-end Nvidia GPUs, and programming for them is an order of magnitude easier, even for Googlers.

Any claim from Google that all of Gemini (including previous experiments) was trained entirely on TPUs is a lie. What they are truthfully saying is that the final training run was done entirely on TPUs. The market shouldn't react heavily to this, but instead should react positively to the fact that Google is now finally selling TPUs externally and their fab yields are better than expected.

reply
readthenotes1
1 hour ago
[-]
It also tanked to ~$90 when Trump announced tariffs on all goods from Taiwan except semiconductors.

I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.

reply
gertlex
28 minutes ago
[-]
This was also on top of claims (Jan 2025) that Deepseek showed that "we don't actually need as much GPU, thus NVidia is less needed"; at least it was my impression this was one of the (now silly-seeming) reasons NVDA dropped then.
reply
Loudergood
1 hour ago
[-]
The market is full of people trying to anticipate how other people are going to react and exploit that by getting there first. There's a layer aimed at forecasting what that layer is going to do as well.

It's guesswork all the way down.

reply
recursive
45 minutes ago
[-]
Personally, I try to predict how others are going to predict that yet others will react.
reply
mnky9800n
3 hours ago
[-]
I really don't understand the argument that nvidia GPUs only work for 1-3 years. I am currently using A100s and H100s every day. Those aren't exactly new anymore.
reply
mbrumlow
2 hours ago
[-]
It’s not that they don’t work. It’s how businesses handle hardware.

I worked at a few data centers on and off in my career. I got lots of hardware for free or on the cheap simply because it was considered "EOL" after about 3 years, often when support contracts with the vendor end.

There are a few things to consider.

Hardware that ages produces more errors, and those errors cost, one way or another.

Rack space is limited. A perfectly fine machine that consumes 2x the power for half the output costs you. It's cheaper to upgrade a perfectly fine working system simply because a newer one performs better per watt in the same space.

Lastly, there are tax implications in buying new hardware that can often favor replacement.

reply
fooker
2 hours ago
[-]
I'll be so happy to buy an EOL H100!

But no, there are none to be found; it's a four-year-old, two-generations-old card at this point, and you can't buy one used at a rate cheaper than new.

reply
pixl97
36 minutes ago
[-]
Well, demand is so high currently that this cycle likely doesn't exist yet for fast cards.

For servers, I've seen slightly used equipment sold in bulk to a bidder, who may have a single large client buy all of it.

Then, around the time the second cycle comes around, it's split up into lots and a bunch ends up at places like eBay.

reply
aswegs8
1 hour ago
[-]
Not sure why this "GPUs are obsolete after 3 years" claim gets thrown around all the time. Sounds completely nonsensical.
reply
bmurphy1976
1 hour ago
[-]
It's because they run 24/7 in a challenging environment. They will start dying at some point, and if you aren't replacing them you will have a big problem when they all die en masse.

These things are like cars; they don't last forever and break down with usage. Yes, they can last 7 years in your home computer when you run it 1% of the time. They won't last that long in a data center where they are running 90% of the time.

reply
Der_Einzige
41 seconds ago
[-]
With good enough cooling they can run indefinitely! The vast majority of failures are either at the beginning due to defects or at the end due to cooling. The idea that hardware with no moving parts (except the HVAC) is somehow unreliable comes out of thin air!
reply
zozbot234
12 minutes ago
[-]
A makeshift cryptomining rig is absolutely a "challenging environment", and by far most GPUs that went through that are just fine. The idea that the hardware might just die after 3 years of usage is bonkers.
reply
belval
1 hour ago
[-]
Especially since AWS still has p4 instances, which are six-year-old A100s. Clearly, even for hyperscalers, these have a useful life longer than 3 years.
reply
JMiao
1 hour ago
[-]
Do you know how support contract lengths are determined? Seems like a path to force hardware refreshes with boilerplate failure data carried over from who knows when.
reply
linkregister
3 hours ago
[-]
The common factoid raised in financial reports is that GPUs used in model training degrade thermally due to their high utilization. The GPUs ostensibly fail. I have heard anecdotal reports of GPUs used for cryptocurrency mining having similar wear patterns.

I have not seen hard data, so this could be an oft-repeated but false "fact".

reply
Melatonic
3 hours ago
[-]
It's the opposite, actually: most GPUs used for mining are run at a consistent temp and load, which is good for long-term wear. Peaky loads, where the GPU goes from cold to hot and back, lead to more degradation because of changes in thermal expansion. This has been known for some time now.
reply
Yizahi
2 hours ago
[-]
That is a commonly repeated idea, but it doesn't take into account countless token farms that are smaller than a datacenter. Basically anything from a single motherboard with 8 cards to a small shed with rigs, all of which tend to disregard common engineering practices and run hardware into the ground to maximize output until the next police raid or difficulty bump. Plenty of photos on the internet of crappy rigs like that, and no one can guarantee where any given GPU came from.

Another commonly forgotten issue is that many electrical components are rated by hours of operation, and cheaper boards tend to have components with smaller margins. That rated time is actually a graph, where hours decrease with higher temperature. There have been instances of batches of cards failing due to failed MOSFETs, for example.

reply
Melatonic
17 minutes ago
[-]
While I'm sure there are small amateur setups done poorly that push cards to their limits, this seems like a rarer and inefficient use. GPUs (even used) are expensive, and running them at maximum would mean large costs and time spent replacing them regularly, not to mention the increased cost of cooling and power.

Not sure I understand the police raid mentality - why are the police raiding amateur crypto mining setups?

I can totally see cards used by casual amateurs being very worn/used though - especially your example of single-mobo miners who were likely also using the card for gaming and other tasks.

I would imagine that anyone purposely running hardware into the ground would be running cheaper/more efficient ASICs vs expensive Nvidia GPUs, since they are much easier and cheaper to replace. I would still be surprised, however, if most were not prioritising temps and cooling.

reply
coryrc
1 hour ago
[-]
Specifically, we expect a halving of lifetime per 10K increase in temperature.
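
For reference, a back-of-envelope sketch of that heuristic in Python (the baseline temperature and lifetime below are made-up illustration values, not measured data):

    # Rule of thumb from above: lifetime halves per 10 K temperature increase.
    def lifetime_years(temp_c, base_temp_c=60.0, base_life_years=7.0):
        # Assumed baseline: 7 years at a 60 C operating temperature.
        return base_life_years * 2 ** ((base_temp_c - temp_c) / 10.0)

    for t in (60, 70, 80, 90):
        print(f"{t} C -> {lifetime_years(t):.1f} years")
    # 60 C -> 7.0, 70 C -> 3.5, 80 C -> 1.8, 90 C -> 0.9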
reply
whaleofatw2022
1 hour ago
[-]
Let's also not forget the set of miners that either overclock or don't really care about the long term in how they set up thermals.
reply
belval
1 hour ago
[-]
Miners usually don't overclock, though. If anything, underclocking is the best way to improve your ROI, because it significantly reduces power consumption while retaining most of the hashrate.
reply
Melatonic
21 minutes ago
[-]
Exactly - more specifically undervolting. You want the minimum volts going to the card with it still performing decently.

Even in amateur setups the amount of power used is a huge factor (because of the huge draw from the cards themselves and AC units to cool the room) so minimising heat is key.

From what I remember most cards (even CPUs as well) hit peak efficiency when undervolted and hitting somewhere around 70-80% max load (this also depends on cooling setup). First thing to wear out would probably be the fan / cooler itself (repasting occasionally would of course help with this as thermal paste dries out with both time and heat)

reply
zozbot234
9 minutes ago
[-]
Wouldn't the exact same considerations apply to AI training/inference shops, given that gigawatts are usually the key constraint?
reply
mbesto
1 hour ago
[-]
Source?
reply
zozbot234
3 hours ago
[-]
> I have heard anecdotal reports of GPUs used for cryptocurrency mining having similar wear patterns.

If this were anywhere close to a common failure mode, I'm pretty sure we'd know that already, given how crypto mining GPUs were usually run to the max in makeshift settings with woefully inadequate cooling and environmental control. The overwhelming anecdotal evidence from people who have bought them is that even a "worn" crypto GPU is absolutely fine.

reply
munk-a
3 hours ago
[-]
I can't confirm that fact, but it's important to acknowledge that consumer usage is very different from the high continuous utilization in mining and training. It is credible that the wear on cards under such extreme usage is as high as reported: consumers may use their cards at peak perhaps 5% of waking hours, so even a roughly 3x drop-off in endurance at near-100% utilization is a believable scale for the loss.
reply
denimnerd42
3 hours ago
[-]
1-3 years is too short, but they aren't making new A100s. There are 8 in a server, and when one goes bad, what do you do? You won't be able to renew a support contract. If you want to DIY, you eventually have to start consolidating pick-and-pulls. Maybe the vendors will buy them back from people who want to upgrade and resell them. This is the issue we are seeing with A100s, and we are trying to see what our vendor will offer for support.
reply
iancmceachern
3 hours ago
[-]
They're no longer energy competitive, i.e. their power draw per unit of compute exceeds that of what's available now.

It's like if your taxi company bought taxis that were more fuel efficient every year.

reply
bob1029
3 hours ago
[-]
Margins are typically not so razor thin that you cannot operate with technology from one generation ago. 15 vs 17 mpg is going to add up over time, but for a taxi company it's probably not a lethal situation to be in.
reply
iancmceachern
54 minutes ago
[-]
Tell that to the airline industry
reply
bob1029
52 minutes ago
[-]
I don't think the airline industry is a great example from an IT perspective, but I agree with regard to the aircraft.
reply
mikkupikku
3 hours ago
[-]
If a taxi company did that every year, they'd be losing a lot of money. Of course new cars and cards are cheaper to operate than old ones, but is that difference enough to offset buying a new one every one to three years?
reply
gruez
3 hours ago
[-]
>If a taxi company did that every year, they'd be losing a lot of money. Of course new cars and cards are cheaper to operate than old ones, but is that difference enough to offset buying a new one every one to three years?

That's where the analogy breaks. There are massive efficiency gains from new process nodes, which new GPUs use. Efficiency improvements for cars are glacial, aside from "breakthroughs" like hybrid/EV cars.

reply
dylan604
3 hours ago
[-]
>offset buying a new one every one to three years?

Isn't that precisely how leasing works? Also, don't companies prefer not to own hardware for tax purposes? I've worked for several places where they leased compute equipment with upgrades coming at the end of each lease.

reply
mikkupikku
1 hour ago
[-]
Who wants to buy GPUs that were redlined for three years in a data center? Maybe there's a market for those, but most people already seem wary of lightly used GPUs from other consumers, let alone GPUs that were burning in a crypto farm or AI data center for years.
reply
pixl97
22 minutes ago
[-]
Depending on the discount, a lot of people.
reply
coryrc
1 hour ago
[-]
Depends on the price, of course. I'm wary of paying 50% of new for something run hard for 3 years. Seems an NVIDIA H100 is going for $20k+ on eBay. I'm not taking that risk.
reply
gowld
2 hours ago
[-]
That works either because someone wants to buy old hardware from the manufacturer/lessor, or because the hardware is EOL in 3 years but it's easier to let the lessor deal with recycling/valuable parts recovery.
reply
wordpad
3 hours ago
[-]
If your competitor refreshes their cards and you don't, they will win on margin.

You kind of have to.

reply
lazide
3 hours ago
[-]
Not necessarily if you count capital costs vs operating costs/margins.

Replacing cars every 3 years vs. a couple percent in efficiency is not an obvious trade-off. Especially if you can do it in 5 years instead of 3.

reply
zozbot234
3 hours ago
[-]
You can sell the old, less efficient GPUs to folks who will be running them with markedly lower duty cycles (so, less emphasis on direct operational costs), e.g. for on-prem inference or even just typical workstation/consumer use. It ends up being a win-win trade.
reply
lazide
1 hour ago
[-]
Then you’re dealing with a lot of labor to do the switches (and arrange sales of used equipment), plus capital float costs while you do it.

It can make sense at a certain scale, but it's a non-trivial amount of cost and effort for potentially marginal returns.

reply
pixl97
18 minutes ago
[-]
Building a new data center and getting power takes years to double your capacity. Swapping out a rack that is twice as fast takes very little time in comparison.
reply
philwelch
3 hours ago
[-]
If there was a new taxi every other year that could handle twice as many fares, they might. That’s not how taxis work but that is how chips work.
reply
echelon
3 hours ago
[-]
Nvidia has plenty of time and money to adjust. They're already buying out upstart competitors to their throne.

It's not like the CUDA advantage is going anywhere overnight, either.

Also, if Nvidia invests in its users and in the infrastructure layouts, it gets to see upside no matter what happens.

reply
mbesto
2 hours ago
[-]
Not saying you're wrong. A few things to consider:

(1) We simply don't know what the useful life is going to be, because AI-focused GPUs used for training and inference are such a new development.

(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). Past those contracts, extended support can typically get really pricey.

(3) Power efficiency. If new GPUs are more power efficient, the energy savings could be huge enough to necessitate upgrades.

reply
epolanski
2 hours ago
[-]
Nvidia is moving to a 1-year release cycle for data center parts, and in Jensen's words, once a new gen is released you lose money by being on the older hardware. It no longer makes financial sense to run it.
reply
pixl97
16 minutes ago
[-]
That will come back to bite them in the ass if money leaves the AI race.
reply
swalsh
36 minutes ago
[-]
If power is the bottleneck, it may make business sense to rotate to a GPU that better utilizes the same power if the newer generation gives you a significant advantage.
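
To make that concrete, here is a hedged sketch of the break-even logic under a fixed power budget (all numbers are made-up illustration values, not real prices or specs):

    # Does swapping to newer GPUs pay off when power is the binding constraint?
    POWER_BUDGET_KW = 1000.0        # fixed datacenter power envelope (assumed)
    OLD_PERF_PER_KW = 1.0           # normalized throughput per kW, old gen
    NEW_PERF_PER_KW = 2.5           # normalized throughput per kW, new gen
    REVENUE_PER_PERF_YEAR = 800.0   # $/yr per unit of normalized throughput
    UPGRADE_COST = 1_500_000.0      # $ to swap out the fleet

    extra_perf = POWER_BUDGET_KW * (NEW_PERF_PER_KW - OLD_PERF_PER_KW)
    extra_revenue_per_year = extra_perf * REVENUE_PER_PERF_YEAR
    payback_years = UPGRADE_COST / extra_revenue_per_year
    print(f"payback: {payback_years:.2f} years")  # ~1.25 years with these inputs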
reply
legitster
3 hours ago
[-]
From an accounting standpoint, it probably makes sense to have their depreciation be 3 years. But yeah, my understanding is that either they have long service lives, or the customers sell them back to the distributor so they can buy the latest and greatest. (The distributor would sell them as refurbished)
reply
savorypiano
3 hours ago
[-]
You aren't trying to support ad-based demand like OpenAI is.
reply
linuxftw
3 hours ago
[-]
I think the story is less about the GPUs themselves, and more about the interconnects for building massive GPU clusters. Nvidia just announced a massive switch for linking GPUs inside a rack. So the next couple of generations of GPU clusters will be capable of things that were previously impossible or impractical.

This doesn't mean much for inference, but for training, it is going to be huge.

reply
nospice
3 hours ago
[-]
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.

Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or inflation will catch up), but that's a poor argument for betting against them right now.

Meanwhile, the investors who were "wrong" anticipating a cryptocurrency revolution and who bought NVDA don't have much to complain about today.

reply
ericmcer
2 hours ago
[-]
Crypto & AI can both be linked to part of a broader trend though, that we need processors capable of running compute on massive sets of data quickly. I don't think that will ever go down, whether some new tech emerges or we just continue shoveling LLMs into everything. Imagine the compute needed to allow every person on earth to run a couple million tokens through a model like Anthropic Opus every day.
reply
pixl97
15 minutes ago
[-]
Agreed, single thread performance increases are dead and things are moving to massively parallel processing.
reply
mysteria
3 hours ago
[-]
Personally, I wonder whether, even if the LLM hype dies down, we'll get a new boom in AI for robotics and the "digital twin" technology Nvidia has been hyping up to train them. That's going to need GPUs for both the ML component and 3D visualization. Robots haven't yet had their SD 1.1 or GPT-3 moment; we're still in the early days of Pythia, GPT-J, AI Dungeon, etc., in LLM terms.
reply
iwontberude
1 hour ago
[-]
Exactly, they will pivot back to AR/VR
reply
mysteria
1 hour ago
[-]
That's going to tank the stock price, though, as that's a much smaller market than AI, even if it won't kill the company. Hence why I'm talking about something like robotics, which has a lot of opportunity to grow and make use of all those chips and datacenters they're building.

Now, there is one thing with AR/VR that might need this kind of infrastructure, and that's basically AI-driven games or Holodeck-like stuff: have the frames be generated rather than modeled and rendered traditionally.

reply
bigyabai
57 minutes ago
[-]
Nvidia's not your average bear, they can walk and chew bubblegum at the same time. CUDA was developed off money made from GeForce products, and now RTX products are being subsidized by the money made on CUDA compute. If an enormous demand for efficient raster compute arises, Nvidia doesn't have to pivot much further than increasing their GPU supply.

Robotics is a bit of a "flying car" application that gets people to think outside the box. Right now, both Russia and Ukraine are using Nvidia hardware in drones and cruise missiles and C2 as well. The United States will join them if a peer conflict breaks out, and if push comes to shove then Europe will too. This is the kind of volatility that crazy people love to go long on.

reply
munk-a
3 hours ago
[-]
That's the rub - it's clearly overvalued and will readjust... the question is when. If you can figure out precisely when, you've won the lottery; for everyone else it's a game of chicken where, for "a while", money that you put into it will have a good return. Everyone would love for that to last forever, so there is a strong momentum preventing that market correction.
reply
jama211
2 hours ago
[-]
It was overvalued when crypto was happening too, but another boom took its place. Of course, lightning rarely strikes twice and all that, but it seems to prove that overvalued doesn't mean the price is guaranteed to go down. Predicting the future is hard.
reply
pixl97
13 minutes ago
[-]
As they say, the market can remain irrational far longer than you can remain solvent.
reply
sidrag22
2 hours ago
[-]
If there was anything I was going to bet against between 2019 and now, it was Nvidia... and wow, it feels wild how far in the opposite direction it went.

I do wonder what reasoning people back then would have given for it to increase in value this much; probably they would just have assumed it was still crypto-related.

reply
AnotherGoodName
3 hours ago
[-]
I'll also point out there were insane takes a few years ago, before Nvidia's run-up, based on similar technical analysis and very limited-scope fundamental analysis.

Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "Nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.

I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on Nvidia based on technical analysis and very narrow-scoped fundamental analysis. They showed a 12x gain in quarterly earnings at the time, but the P/E (which looks at past quarters only) was 260x due to this sudden change in earnings, and pretty much all of Reddit couldn't get past this.

I did well on this, yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.
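
A quick worked illustration of that trailing-P/E distortion (Python; the 12x earnings step and 260x trailing P/E come from the numbers above, the rest is assumed for illustration):

    # If the latest quarter's earnings are 12x each of the prior three quarters,
    # the trailing-twelve-month P/E badly overstates the run-rate multiple.
    latest_q = 12.0               # latest quarterly earnings (arbitrary units)
    prior_q = 1.0                 # each of the three earlier quarters
    ttm = latest_q + 3 * prior_q  # trailing twelve months = 15
    run_rate = 4 * latest_q       # latest quarter annualized = 48

    trailing_pe = 260.0
    run_rate_pe = trailing_pe * ttm / run_rate
    print(run_rate_pe)            # ~81x on a run-rate basis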

reply
richardw
1 hour ago
[-]
I'm sad about Groq going to them, because the market needs the competition. But ASIC inference seems to require a simpler design than training does, so it's easier for multiple companies to enter. It seems inevitable that competition emerges. And, e.g., a Chinese company will not be sold to Nvidia.

What’s wrong with this logic? Any insiders willing to weigh in?

reply
bigyabai
1 hour ago
[-]
I'm not an insider, but ASICs come with their own suite of issues and might be obsolete if a different architecture becomes popular. They'll have a much shorter lifespan than Nvidia hardware in all likelihood, and will probably struggle to find fab capacity that puts them on equal footing in performance. For example, look at the GPU shortage that hit crypto despite hundreds of ASIC designs existing.

The industry badly needs to cooperate on an actual competitor to CUDA, and unfortunately they're more hostile to each other today than they were 10 years ago.

reply
cortesoft
2 hours ago
[-]
> The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years

Isn’t this entirely dependent on the economic value of the AI workloads? It all depends on whether AI work is more valuable than that cost. I can easily see arguments why it won’t be that valuable, but if it is, then that cost will be sustainable.

reply
alfalfasprout
2 hours ago
[-]
100% this. All of this spending is predicated on a stratospheric ROI on AI investments at the proposed investment levels. If that doesn't pan out, we'll see a lot of people left holding the cards, including chip fabs, designers like Nvidia, and of course anyone that ponied up for that much compute.
reply
jpadkins
17 minutes ago
[-]
How much did you short the stock?
reply
KeplerBoy
4 hours ago
[-]
Also there's no way Nvidia's market share isn't shrinking. Especially in inference.
reply
gpapilion
3 hours ago
[-]
The large API/token providers and large consumers are all investing in their own hardware. So Nvidia is in an interesting position where the market is growing and NVIDIA is taking the lion's share of enterprise, but is shrinking on the hyperscaler side (Google is a good example as they shift more and more compute to TPUs). So they have a shrinking market share, but it's not super visible.
reply
zozbot234
2 hours ago
[-]
> The large api/token providers, and large consumers are all investing in their own hardware.

Which is absolutely the right move when your latest datacenter's power bill is literally measured in gigawatts. Power-efficient training/inference hardware simply does not look like a GPU at a hardware design level (though admittedly, it looks even less like an ordinary CPU); it's more like something that should run dog slow wrt. max design frequency but then more than make up for that with extreme throughput per watt and low energy expense per elementary operation.

The whole sector of "neuromorphic" hardware design has long shown the broad feasibility of this (and TPUs are already a partial step in that direction), so it looks like this should be an obvious response to current trends in power and cooling demands for big AI workloads.

reply
dogma1138
3 hours ago
[-]
Market share can shrink but if the TAM is growing you can still grow.
reply
blackoil
3 hours ago
[-]
But will the whole pie grow or shrink?
reply
kqr
1 hour ago
[-]
Fundamental analysis is great! But I have trouble answering concrete questions of probability with it.

How do you use fundamental analysis to assign a probability to Nvidia closing under $100 this year, and what probability do you assign to that outcome?

I'd love to hear your reasoning around specifics to get better at it.

reply
esafak
1 hour ago
[-]
Don't you need a model for how people will react to the fundamentals? People set the price.
reply
baxtr
3 hours ago
[-]
I'm no AI fanboy at all. I think there won't be AGI anytime soon.

However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.

AI will conquer the world like software or the smartphone did. It’ll get implemented everywhere, more people will use it. We’re super early in the penetration so far.

reply
Ekaros
3 hours ago
[-]
At this point computation is in essence a commodity, and commodities have demand cycles. If other economic factors slow down, or companies go out of business, they stop using compute or start fewer new products that use compute. Thus it is entirely realistic to me that demand for compute might go down, or that we are simply over-provisioning compute in the short or medium term.
reply
galaxyLogic
3 hours ago
[-]
I wonder, is the quality of AI answers going up over time or not? Last weekend I spent a lot of time with Perplexity trying to understand why my SeqTrack device didn't do what I wanted it to do, and it seems Perplexity had a wrong idea of how the buttons on the device are laid out, so it gave me wrong or confusing answers. I spent literally hours trying to feed it different prompts to get an answer that would solve my problem.

If it had given me the right, easy-to-understand answer right away, I would have spent 2 minutes of both MY time and ITS time. My point is that if AI improves, we will need less of it to get our questions answered. Or perhaps AI usage goes up if it improves its answers?

reply
jama211
2 hours ago
[-]
Always worth trying a different model, especially if you're using a free one. I wouldn't take one data point too seriously either.

The data very strongly shows that the quality of AI answers is rapidly improving. If you want a good example, check out the Sixty Symbols video by Brady Haran, where they revisited getting AI to answer a quantum physics exam after trying the same thing 3 years ago. The improvement is IMMENSE and unavoidable.

reply
zozbot234
3 hours ago
[-]
If the AI hasn't specifically learned about SeqTracks as part of its training it's not going to give you useful answers. AI is not a crystal ball.
reply
wordpad
3 hours ago
[-]
So...like Cisco during dot com bust?
reply
Ekaros
3 hours ago
[-]
More so I meant to think of oil, copper, and now silver. Their prices all follow demand, and all have had varying prices at different times. Compute should not really be that different.

But yes, Cisco's value dropped when there wasn't the same amount to spend on networking gear. Nvidia's value will drop when there isn't the same amount of spend on their gear.

Other impacted players in an actual economic downturn could be Amazon with AWS and MS with Azure, and even more so those now betting on AI computing. At least general-purpose computing can run web servers.

reply
Ronsenshi
3 hours ago
[-]
What if its penetration ends up being on the same level as modern crypto? The average person doesn't seem to particularly care about meme coins or bitcoin - it is not being actively used in day-to-day settings, and there are no signs of this status improving.

That doesn't mean crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin, or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the mid-2010s.

So imagine that AI is here to stay, but the absolutely unhinged hype train slows down and we settle into some kind of equilibrium of practical use.

reply
infecto
3 hours ago
[-]
I still haven't seen how folks connect AI to crypto. Crypto never connected with real use cases. There are some edge cases, and people do use it, but there is no core use.

AI is different, and businesses are already using it a lot. Of course there is hype; it's not doing all the things the talking heads said, but that does not mean immense value is not being generated.

reply
Ronsenshi
2 hours ago
[-]
It's an analogy; it doesn't have to map 1:1 to AI. The point is that the current situation around AI looks kind of similar to the situation and level of hype around crypto when it was still growing: all the "ledger" startups, promises of decentralization, NFTs in video games, and so on. We are somewhere around that point when it comes to AI.
reply
marricks
3 hours ago
[-]
> I'm no AI fanboy at all.

While thinking computers will replace human brains soon is rabid fanaticism, this statement...

> AI will conquer the world like software or the smartphone did.

Also displays a healthy amount of fanaticism.

reply
jwoods19
2 hours ago
[-]
Even suggesting that computers will replace human brains brings up a moral and ethical question. If the computer is just as smart as a person, then we need to potentially consider that the computer has rights.

As far as AI conquering the world goes, it needs a "killer app". I don't think we'll really see that until AR glasses that happen to include AI. If it can have context about your day, take action on your behalf, and have the same battery life as a smartphone...

reply
xenospn
2 hours ago
[-]
I don't see this as fanaticism at all. No one could have predicted a billion people mindlessly scrolling TikTok in 2007. This is going to happen again, only 10x faster and more addictive, with content generated on the fly to be so addictive you won't be able to look away.
reply
jwoods19
3 hours ago
[-]
“In a gold rush, sell shovels”… Well, at some point in the gold rush everyone already has their shovels and pickaxes.
reply
krupan
3 hours ago
[-]
Or people start to realize that the expected gold isn't really there and so stop buying shovels
reply
gopher_space
2 hours ago
[-]
The version I heard growing up was "In a gold rush, sell eggs."
reply
FergusArgyll
1 hour ago
[-]
Selling jeans is the one that actually worked
reply
stego-tech
2 hours ago
[-]
Add in the fact that companies seriously invested in AI (and similar workloads typically reliant on GPUs) are also investing more into bespoke accelerators, and the math for nVidia looks particularly grim. Google's TPUs set them apart from the competition, as does Apple's NPU; it's reasonable to assume firms like Anthropic or OpenAI are also investigating or investing in similar hardware accelerators. After all, it's easier to lock in customers if your models cannot run on "standard" kit like GPUs and servers, even if it's also incredibly wasteful.

The math looks bad regardless of which way the industry goes, too. A successful AI industry has a vested interest in bespoke hardware to build better models, faster. A stalled AI industry would want custom hardware to bring down costs and reduce external reliance on competitors. A failed AI industry needs no GPUs at all, and an inference-focused industry definitely wants custom hardware, not general-purpose GPUs.

So nVidia is capitalizing on a bubble, which you could argue is the right move under such market conditions. The problem is that they’re also alienating their core customer base (smaller datacenters, HPC, gaming market) in the present, which will impact future growth. Their GPUs are scarce and overpriced relative to performance, which itself has remained a near-direct function of increased power input rather than efficiency or meaningful improvements. Their software solutions - DLSS frame-generation, ray reconstruction, etc - are locked to their cards, but competitors can and have made equivalent-performing solutions of their own with varying degrees of success. This means it’s no longer necessary to have an nVidia GPU to, say, crunch scientific workloads or render UHD game experiences, which in turn means we can utilize cheaper hardware for similar results. Rubbing salt in the wound, they’re making cards even more expensive by unbundling memory and clamping down on AIB designs. Their competition - Intel and AMD primarily - are happily enjoying the scarcity of nVidia cards and reaping the fiscal rewards, however meager they are compared to AI at present. AMD in particular is sitting pretty, powering four of the five present-gen consoles, the Steam Deck (and copycats), and the Steam Machine, not to mention outfits like Framework; if you need a smol but capable boxen on the (relative) cheap, what used to be nVidia + ARM is now just AMD (and soon, Intel, if they can stick the landing with their new iGPUs).

The business fundamentals paint a picture of cannibalizing one's evergreen customers in favor of repeated fads (crypto and AI), and years of doing so have left those customer markets devastated and bitter at nVidia's antics. Short of a new series of GPUs with immense performance gains at lower price and power points, with availability to meet demand, my personal read is that this is merely Jensen Huang's explosive send-off before handing the bag over to some new sap (and shareholders) once the party inevitably ends, one way or another.

reply
bArray
2 hours ago
[-]
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.

Exactly, it is currently priced as though infinite GPUs are required indefinitely. Eventually most of the data centres and the gamers will have their GPUs, and demand will certainly decrease.

Before that, though, the data centres will likely fail to be built in full. Investors will eventually figure out that LLMs are still not profitable, no matter how many data centres you produce. People are only interested in the derivative products at a lower price than they cost to run. The math ain't mathin'.

The longer it takes to get them all built, the more exposed they all are. Even if it turns out to be profitable, taking three years to build a data centre rather than one year is significant, as profit for these high-tech components falls off over time. And how many AI data centres do we really need?

I would go further and say that these long and complex supply chains are quite brittle. In 2019, a 13-minute power cut caused a loss of 10 weeks of memory stock [1]. Normally, the shops and warehouses act as a capacitor and can absorb small supply chain ripples. But now that these components are being piped straight to data centres, they are far more sensitive to blips. What about a small issue in the silicon that means you damage large amounts of your stock by running it at full power, through something like electromigration [2]? Or a random war...?

> The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years, and we're already seeing this as Google and others are extending their depreciation of GPUs to something like 5-7 years.

Yep. Nothing about this adds up. Existing data centres with proper infrastructure are being forced to extend the use of previously uneconomical hardware because new data centres currently building infrastructure have run the price up so high. If Google really thought this new hardware was going to be so profitable, they would have bought it all up.

[1] https://blocksandfiles.com/2019/06/28/power-cut-flash-chip-p...

[2] https://www.pcworld.com/article/2415697/intels-crashing-13th...

reply
cheschire
2 hours ago
[-]
Well, not to be too egregiously reductive… but when the M2 money supply spiked in the 2020 to 2022 timespan, a lot of new money entered the middle class. That money was then funneled back into the hands of the rich through “inflation”. That left the rich with a lot of spare capital to invest in finding the next boom. Then AI came along.

Once the money dries up, a new bubble will be invented to capture middle-class income, like NFTs and crypto before that, and commissionless stocks, etc.

It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.

reply
reflexe
2 hours ago
[-]
According to Nvidia's 2025 annual report [1], 34% of their sales for 2025 come from just 3 customers.

Additionally, they mentioned that customers can cancel purchases with little to no penalty or notice [2].

This is not unique to hardware companies, but think about it: all it takes is just one company to cut their sales by 12% ($14B).

To cut to the point, my guess is that Nvidia is not sustainable, and at some point one or more of these big customers won't be able to keep up with the big orders, which will cause them to miss their earnings, and then it will burst. But maybe I'm wrong here.

[1] https://s201.q4cdn.com/141608511/files/doc_financials/2025/a..., page 155: > Sales to direct Customers A, B and C represented 12%, 11% and 11% of total revenue, respectively, for fiscal year 2025.

[2] same, page 116: > Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.

reply
smw
1 hour ago
[-]
I have lots of skepticism about everything involved in this, but on this particular point:

It's a bit like TSMC: you couldn't buy space on $latestGen fab because Apple had already bought it all. Many companies would have very much liked to order H200s and weren't able to, as they were all pre-sold to hyperscalers. If one of them stopped buying, it's very likely they could sell to other customers, though there might be more administrative overhead?

Now there are some interesting questions about Nvidia creating demand by investing huge amounts of money in cloud providers that will order nv hardware, but that's a different issue.

reply
NewCzech
4 hours ago
[-]
He doesn't really address his own question.

He's answering the question "How should options be priced?"

Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.

Whether Nvidia specifically takes a big dive depends much more on whether they continue to meet growth estimates than general volatility. If they miss earnings estimates in a meaningful way the market is going to take the stock behind the shed and shoot it. If they continue to exceed estimates the stock will probably go up or at least keep its present valuation.

reply
dsr_
4 hours ago
[-]
> Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.

Other way around: if NVidia sinks, it likely takes a bunch of dependent companies with it, because the likely causes of NVidia sinking all tell us that there was indeed an AI bubble and it is popping.

reply
weslleyskah
3 hours ago
[-]
Indeed, the market as a whole would be affected. But isn't NVIDIA more of a software company than a hardware one? This bugs the shit out of me.

They are maintaining this astronomical growth through data center margins from the design of their chips, and all of that started from graphics for video games.

reply
coffeebeqn
3 hours ago
[-]
> But is not NVIDIA more of a software company than a hardware one?

No? That's why they have almost no competition. Hardware startup costs are astronomical.

reply
weslleyskah
3 hours ago
[-]
But the actual manufacturing foundry is TSMC, no? And they create the whole software environment based on their chips.
reply
immibis
41 minutes ago
[-]
It costs eight figures to create the masks (patterns) to use in the process of creating a modern chip. Just because it doesn't cost the eleven figures of the factory itself doesn't make it cheap.
reply
IceHegel
55 minutes ago
[-]
I'm surprised more people are not talking about the fact that the two best models in the world, Gemini 3 and Claude 4.5 Opus, were both trained on Google TPU clusters.

Presumably, inference can be done on TPUs, Nvidia chips, or, in Anthropic's case, new stuff like Trainium.

reply
originalvichy
1 hour ago
[-]
As others have noted, the article is analysing the actual financial markets angle.

For my two cents on the technical side, it is likely that any Western-origin shakiness will come from Apple and how it manages to land the Gemini deal and Apple Intelligence v2. There is an astounding amount of edge inference sitting in people’s phones and laptops that only slightly got cracked open with Apple Intelligence.

Data centre buildouts will get corrected when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet. This will serve as a guiding principle for any future buildout and heavyweight inference cards that Nvidia is supplying. The 2-5 year moat top providers have with the largest models will get chomped at by the leisure/hobby/educational use cases that lightweight models capably handle. Small language and visual models are already amazing. The next crack will appear when the past gen cards (if they survive the around the clock operation) get bought up by second hand operators that can provide capable inference of even current gen models.

If past knowledge of DC operators holds (e.g. Google and its aging TPUs that still get use), the providers with the resources to buy new space for newer gens will accumulate hardware, while the providers without them will need to continuously absorb the financial hit that comes with using less efficient older cards.

I’m excited to see future blogs about hardware geeks buying used inference stacks and repurposing them for home use :)

reply
notatoad
1 hour ago
[-]
>when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet

is there any reason to expect that this information will ever be known outside of apple?

reply
hagope
18 minutes ago
[-]
NVIDIA Vera Rubin NVL72 unveiled at CES makes any other computer look like a pocket calculator, and that's why I wouldn't want to be bearish on NVDA right now, see https://www.nvidia.com/en-us/data-center/vera-rubin-nvl72
reply
rwmj
4 hours ago
[-]
It goes to nearly zero if China invades Taiwan, and that seems like it has at least a 10% chance of happening in the next year or two.
reply
toephu2
2 hours ago
[-]
It doesn't go to nearly zero. TSMC has a large fab in Arizona and they are continuing to expand it. They also have a fab in Washington, and one in Japan. [1]

[1] https://www.tsmc.com/english/aboutTSMC/TSMC_Fabs

reply
fkarg
4 hours ago
[-]
I agree. It's funny that this is one of the cited reasons for the (relative) value suppression of TSMC, but the same factors should apply to Nvidia too.
reply
eagerpace
4 hours ago
[-]
Going to zero is one potential outcome. Equally plausible is it goes up 10% in a relatively quick battle or diplomatic outcome which ends the geopolitical uncertainty.
reply
rwmj
3 hours ago
[-]
There's approximately 0% chance that China will ship leading edge wafers from captured TSMC to the West.
reply
wordpad
3 hours ago
[-]
Not true, it might be something they compromise on to restore relations
reply
IsTom
2 hours ago
[-]
That's possible only if fabs are operational after the invasion.
reply
eagerpace
3 hours ago
[-]
This is the beauty of Polymarket. Then bet on it. There are so many more outcomes possible to this conflict than what you see reported in the media. Don't be so reductive.
reply
rbtprograms
3 hours ago
[-]
don't be a weird gambling degenerate
reply
u8080
2 hours ago
[-]
Short the stock then, lol.

When people make this kind of prediction, they are often driven by emotional reaction. The best way to switch to an actual evaluation of a hypothesis is to make the risks actually cost something.

reply
eagerpace
3 hours ago
[-]
I'm not recommending it, but the subject here (shorting the most valuable company on earth) is pretty degen to begin with.
reply
alecco
1 hour ago
[-]
I think they are already hedging for Taiwan. 1. They just pseudo-acquired Groq, fully made in USA (GlobalFoundries) and with a diversified supply chain. 2. And they just announced they will be re-introducing RTX 3090 made in Korea (Samsung). 3. And they plan to produce chips in Intel's new US fabs soon.

I think the bigger problems of the AI bubble are energy and that it's gaining a terrible reputation for being the excuse for mass layoffs while suffocating the Internet with slop/brainrot content. All while depending on government funding to grow.

reply
khalic
4 hours ago
[-]
Idk, the pro-China side is getting more and more support; at this rate they'll vote themselves into the mainland.
reply
whatevaa
4 hours ago
[-]
Well, the reality is that most people don't want a bloodbath and it's increasingly looking like external support won't come, so what you gonna do... life is a very complex chess game, gotta play your pieces right.
reply
mikkupikku
3 hours ago
[-]
At this rate, even if they can't get the Taiwanese population to consent, it probably makes more sense to wait anyway to see how low America can sink. The lower America goes, the better their chance for success.
reply
Ekaros
3 hours ago
[-]
China is capable of taking a long-term view, beyond a single election cycle. And currently the USA really seems to be heading down faster and faster.

If something even more drastic happens, China might even attempt unification with some reasoning like protecting Taiwan from the USA or other nations.

reply
blackoil
3 hours ago
[-]
An EU-type agreement will keep peace for some time. Remove all trade barriers between the two countries, have a treaty preventing either side from being used militarily by a third party, no attacking each other, and free movement of all vessels through each other's seas. Maybe a few more.
reply
nebula8804
3 hours ago
[-]
That's just buying China more time until they can get their chip manufacturing to at least a similar ballpark. Then Taiwan has no cards left to play. China can cripple TSMC, depriving the west of chips, while they continue onwards.
reply
blackoil
2 hours ago
[-]
"buying China more time". China has no time-pressure to attack immediately, but all the upside right now of pretending to a stable, sensible world leader. Treaty with Taiwan will keep the ego of One China, prevent it from naval blockade by Taiwanese territories and will remove one of the major territorial issues raised against it.
reply
nebula8804
2 hours ago
[-]
I don't know about that... don't they have massive overcapacity in many of their industries, as well as ~25% youth unemployment? For all the mess the US is going through, at least we are seeing it out in the open. China seems to be going through its own messes right now, but behind the Great Wall. Will a treaty be enough, or will their leaders falter and try to push for more? Guess we will see.
reply
cjbgkagh
4 hours ago
[-]
I think Taiwanese elites can be bought. They say they can't, but I think that's just part of the bargaining for a higher price. The overtures towards a costly and destructive invasion are China's attempt at lowering that price, as is the strategy of building up an indigenous chip manufacturing industry. The aggressive rhetoric from China has the added benefit of keeping the US in a self-sabotaging aggressive posture.
reply
ghosty141
2 hours ago
[-]
I mean, that's obviously the best outcome for the Chinese government. Same thing that happened/is happening to Hong Kong. War is bad for everybody.
reply
utopiah
4 hours ago
[-]
But then again, what won't? Non-tech stocks?
reply
rwmj
4 hours ago
[-]
Yes, lots of other companies would be affected to a greater or lesser extent (even non-tech stocks), but specifically any company that relies on manufacturing all their product in Taiwan will be affected most of all.
reply
zitterbewegung
4 hours ago
[-]
The military-industrial complex and government contractors.
reply
LunaSea
4 hours ago
[-]
Don't they also depend on chips for a lot of components?
reply
mikkupikku
3 hours ago
[-]
Probably a lot more from TI and Intel than Taiwan.
reply
utopiah
2 hours ago
[-]
I'd be curious how many of the design and verification (using computer vision) tools used at TI and Intel rely on farms of stock GPUs, and thus on chips still made in Taiwan. They might have in-house chips just for such parts of their workflows though; any insight appreciated.
reply
utopiah
4 hours ago
[-]
Jets, tanks, drones and data centers for intelligence services, even design, are full of electronics but what's the share of those not made in Taiwan?
reply
throwaway5752
4 hours ago
[-]
Gold stocks, basic materials, MSCI world and emerging market indexes. Look at their prices and see how very smart people are positioning their money.
reply
immibis
4 hours ago
[-]
The whole economy will crash. Probably won't be due to China invading Taiwan though. More likely because the president decided to delete their country's world reserve currency status (which is another word for a trade deficit).
reply
fullshark
3 hours ago
[-]
What does the US gov't do in response? Wouldn't they throw globs of money at Intel and Nvidia?
reply
bob1029
3 hours ago
[-]
They already have.
reply
heathrow83829
3 hours ago
[-]
but they're expected to have 8 or 9 aircraft carriers by 2035, doesn't it make sense to wait until then?
reply
flowerthoughts
3 hours ago
[-]
If the US is fighting with Europe and South America, China might not need that many.
reply
bpodgursky
4 hours ago
[-]
NVIDIA has been producing Blackwell in Arizona since October. Don't be dramatic.

There would be a supply crunch but a lot of dollars will be shuffled VERY fast to ramp up production.

reply
maxglute
3 hours ago
[-]
Arizona fabs don't work without Taiwan's many sole-source suppliers of fab consumables. They'll likely grind to a halt after a few months, when stock runs out. All the dollar shuffling is not going to replace a supply chain that will take (generously) years to build, if ever.
reply
rwmj
4 hours ago
[-]
They definitely made at least one wafer in Arizona in October.
reply
georgeburdell
4 hours ago
[-]
Packaging? Assembling onto boards?
reply
blackoil
3 hours ago
[-]
Can outsource to China. Only partial /s
reply
koolba
4 hours ago
[-]
> One of the questions of the 2026 acx prediction contest is whether Nvidia’s stock price will close below $100 on any day in 2026.

Maybe I’m missing something, but isn’t this just a standard American put option with a strike of $100 and expiry of Dec 31st?

reply
amelius
4 hours ago
[-]
No, because if it goes to $99.99, you don't win much with the put. With a prediction contest, either you win or you lose.
reply
mklyachman
4 hours ago
[-]
Not really. American put options will pay differently for 95 dollars vs 99 dollars, while this contract settles to 1 either way.
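
A minimal sketch of that payoff difference in Python (the $100 strike matches the contest level; the spot prices are illustrative):

    def american_put_payoff(spot, strike=100.0):
        # Vanilla put: exercise value scales with how far spot sits below strike.
        return max(strike - spot, 0.0)

    def prediction_contract_payoff(spot, strike=100.0):
        # Binary-style contract: settles to 1 if the event occurs, else 0.
        return 1.0 if spot < strike else 0.0

    for s in (95.0, 99.0, 99.99, 101.0):
        print(s, american_put_payoff(s), prediction_contract_payoff(s))
    # put pays 5.00 / 1.00 / 0.01 / 0.00; binary pays 1 / 1 / 1 / 0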
reply
r_lee
3 hours ago
[-]
They're enjoying massive demand for GPUs due to AI blowing up, at a time when there isn't much competition, yet the technology is already plateauing, with similar offerings from AMD, not to mention proven training and inference chips from Google and AWS, plus the Chinese national strategy of prioritizing domestic chips.

The only way the stock could remain at its current price or grow (which is why you'd hold it) is if demand just keeps going up (with the same lifecycle as current GPUs) and there is no competition, and the latter, to me, is just never going to be a thing.

Investors are convinced that Nvidia can maintain its lead because they have the "software" side, i.e. CUDA, which to me is so ridiculous; as if, with the kind of capital being deployed into these datacenters, you couldn't fit your models into other software stacks by hiring people....

reply
mythical_39
3 hours ago
[-]
Or you couldn't use an LLM to help port your CUDA code to a "new framework", i.e. software is no longer a lock-in....

That's assuming LLM coding agents are good; but if they aren't any good, then what is the value of the CUDA code?

reply
ironbound
1 hour ago
[-]
It's a problem if you have to keep asking "are we in a bubble?"
reply
dexterlagan
2 hours ago
[-]
There is one thing everybody forgets when making such predictions: companies don't stand still. Nvidia and every other tech business is constantly exploring new options, taking over competitors, buying startups with novel technologies, etc. Nvidia is no slouch in that regard, and their recent quasi-acquisition of Groq is just one example of this. So, when attempting to make predictions, we're looking at a moving target, not a system set in stone. If the people at the helm are smart (and they are), you can expect lots of action and ups and downs - especially in the AI sphere.

My personal opinion, having witnessed first hand nearly 40 years of tech evolution, is that this AI revolution is different. We're at the very beginning of a true paradigm shift: the commoditization of intelligence. If that's not enough to make people think twice before betting against it, I don't know what is. And it's not just computing that is going to change. Everything is about to change, for better or worse.

reply
baal80spam
3 hours ago
[-]
*checks calendar* Ah, the NVIDIA earnings call is close - prepare for the inevitable doomer articles.
reply
kwar13
3 hours ago
[-]
This is more of a derivatives-pricing article and has nothing to do with Nvidia, really.
reply
matt3210
1 hour ago
[-]
New competition is an issue. It wasn't as lucrative to compete with Nvidia in the past.
reply
vatsachak
3 hours ago
[-]
Who said that monads don't have any application?
reply
vatsachak
3 hours ago
[-]
They implement Applicative, so by definition they do
reply
huqedato
1 hour ago
[-]
That's smoke and mirrors. You can't logically predict the market. It has never worked.
reply
javcasas
1 hour ago
[-]
The thing is, in this gold rush, Nvidia is the one selling shovels.
reply
traceroute66
3 hours ago
[-]
The simple answer to the question:

The Nvidia stock crash will happen when the vendor-financing bubble bursts.

They are engaged in a dangerous game of circular financing. So it is a case of when, not if, the chickens come home to roost.

It is simply not sustainable.

reply
PeterStuer
4 hours ago
[-]
How much of their turnover is financed, directly or indirectly, by themselves, then leveraged further by their 'customers' to collateralize further investments?

Are they already "too big to fail"? For better or worse, they are 'all in' on AI.

reply
iwontberude
1 hour ago
[-]
Clickbait for teaching options analysis.
reply
iancmceachern
3 hours ago
[-]
The real question is what else this will cause to fall when it does happen.
reply
dist-epoch
2 hours ago
[-]
I'm calling it - this is a submarine article to prove that Haskell is used in the real world to solve actual problems
reply
weirdmantis69
3 hours ago
[-]
Its forward-looking P/E is 24-26. That doesn't suggest a huge crash is coming. It could come down a bit, but they print money. They also have the potential automotive and robotics markets coming.
reply
bilater
3 hours ago
[-]
Was expecting some actual reasons as to why this would happen; instead got some math.
reply
bigbuppo
4 hours ago
[-]
Since there's such interdependence between Nvidia and the other companies involved in AI, to the point that if one fails they all fail, shouldn't the analysis focus on the weakest link in the AI circle jerk?
reply
tuetuopay
3 hours ago
[-]
Nvidia is the biggest link; however, I'd wager OpenAI and the likes are big enough to make a significant dent in the mammoth. So yeah, this analysis is sort of a spherical-cow-in-a-vacuum situation.

Still, it's interesting that the probability is so high while ignoring real-world factors. I'd expect it to be much higher due to:

- another adjacent company dipping
- some earnings target not being met
- China/Taiwan
- just the AI craze slowing down

reply
mwkaufma
1 hour ago
[-]
"Predictably" prediction markets have opened up space in the void left by journalism for tea-leaf reading with the fig leaf of mathy jargon.
reply
MuffinFlavored
3 hours ago
[-]
Worth noting that the implied volatility extracted here is largely a function of how far OTM the strike is relative to current spot, not some market-specific view on $100. If NVDA were trading at $250 today, the options chain would reprice and you'd extract similar vol for whatever strike was ~45% below. The analysis answers "what's the probability of a near-halving from here" more than "what's special about $100." Still useful for the prediction contest, but the framing makes it sound like the market is specifically opining on that price level.
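
For illustration, a quick moneyness check (assuming a spot near $180, which is roughly what the ~45% figure above implies; both numbers are hypothetical):

    # Moneyness is strike relative to spot: the $100 strike at a $180 spot
    # sits at the same point on the vol surface as a ~$139 strike at $250.
    spot_now, strike = 180.0, 100.0
    moneyness = strike / spot_now            # ~0.556, i.e. ~44% below spot
    spot_alt = 250.0
    print(f"equivalent strike at ${spot_alt:.0f} spot: ${moneyness * spot_alt:.0f}")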
reply
visarga
3 hours ago
[-]
this is gpt, right?
reply
Sohcahtoa82
3 hours ago
[-]
There are grammatical mistakes and abbreviations, big tells that it's NOT ChatGPT.
reply
MuffinFlavored
3 hours ago
[-]
I had a conversation (prompts) with Claude about this article because I didn't feel I could describe my point as succinctly on my own.
reply
fooey
4 hours ago
[-]
reply
syntaxing
3 hours ago
[-]
I'm more curious how these "futures" contracts will work out. Supposedly, a bunch of RAM that isn't even made yet has already been paid for and allocated. If the bubble ever pops, the collateral is going to be on the order of the 2007 subprime mortgage crisis.
reply
10xDev
3 hours ago
[-]
I mean, common-sense reasoning tells me that if OpenAI has decided to turn into an ad business, the actual return expected from investing in compute isn't going to be nearly as great as advertised.
reply
mvdtnz
3 hours ago
[-]
People don't actually believe this type of analysis... do they?
reply
bitshiftfaced
2 hours ago
[-]
You have it turned upside down. The analysis is of people's beliefs. In other words, the underlying data is created from the beliefs of the people who trade it, and the analysis takes those beliefs and applies them to a specific question.
reply
cheald
3 hours ago
[-]
The entire options market is built on this kind of analysis.
reply
incomingpain
3 hours ago
[-]
Nvidia PE ratio: 44

I do hope they crash so that I can buy as much as possible at a discount.

reply
4fterd4rk
3 hours ago
[-]
Them being far above the median PE ratio for the S&P 500 tells you that a future correction would be a discount and you should buy? Please walk me through your logic on this one.
reply
Joel_Mckay
41 minutes ago
[-]
Every gambler thinks they can time the market, and buy the dip.

In general, they often get stung by the dead cat bounce, =3

https://en.wikipedia.org/wiki/Dead_cat_bounce

reply
linkregister
1 hour ago
[-]
This implies you think a crash would be a temporary mispricing of the stock, which will recover in value, correct?
reply
Joel_Mckay
34 minutes ago
[-]
While I am no fan of NVIDIA, they effectively hold a monopoly on CUDA GPUs.

This means that cash revenue will likely remain high long after the LLM hype bubble undergoes correction. The market will eventually saturate as incremental product improvements stall, and demand rolls off rather than implodes. =3

reply
immibis
4 hours ago
[-]
It's easy to predict that a bubble will pop, but there's a variance in the timing of approximately half a human lifetime, and if you don't guess that correctly, you throw away yours.

Everything that can't go on forever will eventually stop. But when?

reply
baal80spam
3 hours ago
[-]
Well put, and it clearly explains why "timing the market" is never a good plan.
reply
zvqcMMV6Zcr
4 hours ago
[-]
Technical analysis is amazing; it is the most refined form of pseudoscience.
reply
cheald
3 hours ago
[-]
This isn't technical analysis; this is an article on how to use the options market's price-discovery mechanism to understand what the discovered price implies about the collective belief about the future price of the underlying.
reply
stonogo
3 hours ago
[-]
That's what "technical analysis" means in the finance world, though... so, am I missing a joke?
reply
cheald
2 hours ago
[-]
Technical analysis is the projection of future price data through analysis of past price data (usually for the purpose of trying to create trendlines or find "patterns"). Options pricing is quite a different beast - it encodes marketwide uncertainty about the future price of the underlying, which has little to do with the past price action of the underlying, and everything to do with all known information about the actual underlying company, including fundamentals analysis, market sentiment, future expectations and risks, etc.

To put it another way, to price an option I need a) the current price of the underlying, b) the time until option expiry, c) the strike price of the option, and d) the collective expectation of how much the underlying's price will vary over the period between now and expiry. This last piece is "volatility", and is the only piece that can't be empirically measured; instead, through price discovery on a sufficiently liquid contract, we can reparameterize the formula to empirically derive the volatility expectation which satisfies that current price (or "implied volatility"). Due to the efficient market hypothesis, we can generally treat this as a best-effort proxy for all public information about the underlying. None of this calculation requires any measurement or analysis of the underlying's past price action, patterns, etc. The options price will necessarily include TA traders' sentiments about the underlying based on their TA (or whatever else), just as it will include fundamentals traders' sentiments (and, if you're quick and savvy enough, insiders' advance knowledge!) The price fundamentally reflects market sentiment about the future, not some projection of trends from the past.
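
A minimal sketch of that last step, assuming plain Black-Scholes for a European put (the real chain is American-style, so treat this as an approximation; all numbers are illustrative, not real quotes):

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def bs_put(S, K, T, r, sigma):
        # Black-Scholes price of a European put: spot, strike, years, rate, vol.
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

    def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0):
        # Bisection: the put price is monotone increasing in sigma, so search
        # for the sigma whose model price matches the observed market price.
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if bs_put(S, K, T, r, mid) < price:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Hypothetical quote: a $100-strike put trading at $3.50, spot $180,
    # one year to expiry, 4% risk-free rate.
    print(f"{implied_vol(3.50, S=180.0, K=100.0, T=1.0, r=0.04):.1%}")

Everything fed into the model except sigma is directly observable; the sigma that makes the model hit the traded price is the "implied volatility" described above.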

reply
t_serpico
3 hours ago
[-]
how so? (i'm not too familiar)
reply