Is there some intuition we can apply to estimate how long it will take for supply to catch up to demand?
A RAM chip takes several months to make, starting from an empty silicon wafer. Each chip takes 8-10 weeks to go through the process of lithography, deposition, etching, cleaning, etc. It then must be tested, which can take another couple of weeks, then packaged, before it can be sold to manufacturers. Thus, even if fab capacity were available today (it isn't), you'd still see a multi-month lag before new supply hit the market.
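As a rough back-of-the-envelope, the stages above add up like this (the test and packaging durations are my assumptions based on the "couple of weeks" mentioned; exact figures vary by fab, node, and product):

```python
# Rough DRAM lead-time estimate from wafer start to sellable chip.
# Stage durations are approximate (min_weeks, max_weeks); test and
# packaging figures are assumptions, not published numbers.
stages_weeks = {
    "wafer processing (litho, deposition, etch, clean)": (8, 10),
    "test": (1, 2),
    "packaging": (1, 2),
}

low = sum(lo for lo, hi in stages_weeks.values())
high = sum(hi for lo, hi in stages_weeks.values())
print(f"Wafer start to sellable chip: ~{low}-{high} weeks "
      f"(~{low / 4.3:.0f}-{high / 4.3:.0f} months)")
```

So even in the best case you're looking at a lag on the order of two to three months from wafer start, before accounting for any time to build or retool capacity.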
(This is an extraordinarily sensitive process, and disrupting it can cause you to lose the entire batch. You might have heard of cases where "wafer starts" had to be discarded due to a tsunami or power disruption - this is why.)
The problem is actually making chips. The machines used to make modern integrated circuits are some of the most precise equipment in the world, manufacturing structures just tens of atoms across.
Getting more factories online might take close to a decade, and that's if anyone wants to pay: the current demand showed up basically overnight when some of the companies (running on investor money with no way to make a profit) started a bidding war. Betting billions of dollars on them still being around in 5-10 years is just not a wise decision.
Historically, dynamic RAM has gone through several boom/bust cycles, oscillating between manufacturers struggling to break even and cutting production, and then, a few years later, not being able to make enough chips. I remember the late 80s being another time when companies were delaying new product launches because they couldn't get DRAM.
In fact, memory prices often swing when new factories come online. Device manufacturers usually stock up on memory when prices are low, so we rarely notice these swings in the prices of finished products.
I work in the semiconductor industry, and I recently asked the same question of some experts around me. They told me there is a lesson from the early days of personal computers: improvements in operating systems reduced the amount of RAM needed, which caused serious problems for the memory industry.
Modern CPUs have built-in memory controllers. Most of them can only talk to a small variety of memory: maybe DDR4 or DDR5, and sometimes an LP variant. If you wanted to make DDR3, you'd also need a special CPU with a DDR3-compatible memory controller to pair it with.
I think once production stops for a given memory type at a specific fab, the lines get rebuilt for a newer type of RAM. It doesn't make sense to keep the old equipment around, because there's been enough DDR3 RAM made for the rest of time.
It's different for other fabs, where old lines can remain valuable for lower cost production.
New production capacity takes years to bring online, and manufacturers are rightly cautious of the current demand bubble bursting, leaving them billions of dollars out of pocket.
Price collusion, and dumping (flooding the market with low prices) if any real competitor shows up.
Someone please correct me if I'm wrong.
Another factor is that the viability of building new fabs rests on the assumption that there is no AI bubble about to burst. Opinions differ on how large that risk is.
See this video by Anastasi In Tech to understand the memory crisis - https://www.youtube.com/watch?v=KghkI5Oh_lY