10 points chatmasta 2 hours ago 17 comments

Is there some bottleneck in the supply chain, like rare earth metals or something, that’s limiting production throughput? Or do we simply have every factory already operating at max capacity, so scaling up supply will require building more of them?

Is there some intuition we can apply to estimate how long it will take for supply to catch up to demand?

layer8 2 hours ago | parent

New fabs are being built, but that takes time. Estimates are that full production will start towards the end of 2027, or in some cases (Samsung, Micron) only in 2028. So expect RAM to remain scarce/expensive for the next two years.

Another factor is that the viability of building new fabs is based on the assumption that there is no AI bubble that will burst. Opinions differ on how large that risk is.

proee 1 hour ago | parent

Outside of fabrication, memory chips also require some very fancy high-speed testers, which rely on specialized ICs that are most likely back-ordered.

bigfatkitten 1 hour ago | parent

The reason we have so few RAM manufacturers in the first place is that it was (until just a few months ago) an extremely low-margin business.

New production capacity takes years to bring online, and manufacturers are rightly cautious of the current demand bubble bursting, leaving them billions of dollars out of pocket.

quadramaDEV 1 hour ago | parent

1. Raw materials (rare earths, lithium, etc.): Yes, there are bottlenecks. Mining and refining take years to scale. A new lithium mine can take 5–10 years from discovery to production. Rare earth refining is also heavily concentrated in China, so geopolitics plays a role.

2. Factory capacity: Also yes. Chip fabs (factories) cost $10–20 billion and take 3–5 years to build. Even then, they run 24/7 and are already at max capacity – you can't just "add another shift" when they're already running flat out.

3. The intuition for timing:

Chips: 3–5 years for new fabs (plus tool lead times)

Batteries: 2–4 years for new gigafactories

Mining: 5–10 years for new mines

So for things like EVs and GPUs, we're looking at 2–5 years before supply really catches up, assuming no new disruptions.
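As a back-of-envelope sketch in Python (the lead-time ranges are just the rough figures from the list above, and the start year is an assumed placeholder, not industry data):

    # Back-of-envelope sketch: when might new capacity come online?
    # Lead-time ranges are the rough figures from the list above;
    # the start year is an assumption, not industry data.

    START_YEAR = 2025  # assumed year the build-out decision is made

    lead_times_years = {
        "chip fabs (incl. tool lead times)": (3, 5),
        "battery gigafactories": (2, 4),
        "new mines": (5, 10),
    }

    for name, (lo, hi) in lead_times_years.items():
        print(f"{name}: online ~{START_YEAR + lo}-{START_YEAR + hi}")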

The good news? Once new capacity comes online, it tends to stay online – so the next shortage won't be as bad.

aappleby 1 hour ago | parent

hello chatgpt

mangogogo 12 minutes ago | parent

it's beyond parody

ternus 1 hour ago | parent

Further to what's listed elsewhere:

A RAM chip takes several months to make, starting from an empty silicon wafer. Each chip takes 8-10 weeks to go through the process of lithography, deposition, etching, cleaning, etc. It then must be tested, which can take another couple of weeks, and packaged before it can be sold to manufacturers. Thus, even if fab capacity were available today (it isn't), you'd still see a multi-month lag before new supply hit the market.

(This is an extraordinarily sensitive process, and disrupting it can cause you to lose the entire batch. You might have heard of cases where "wafer starts" had to be discarded due to a tsunami or power disruption - this is why.)
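A minimal sketch adding up that lag, using the stage durations above (the packaging/delivery figure is an assumed placeholder, not from the comment):

    # Summing the wafer-to-market lag described above. Wafer processing
    # and test figures come from the comment; packaging/delivery is an
    # assumed placeholder.

    stages_weeks = {
        "wafer processing (litho, deposition, etch, clean)": (8, 10),
        "test": (2, 3),
        "packaging + delivery (assumed)": (2, 4),
    }

    lo = sum(a for a, _ in stages_weeks.values())
    hi = sum(b for _, b in stages_weeks.values())
    print(f"total lag: ~{lo}-{hi} weeks (~{lo / 4.3:.0f}-{hi / 4.3:.0f} months)")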

maurycyz 1 hour ago | parent

The raw materials are cheap: it's mostly just quartz, one of the most abundant minerals in the Earth's crust.

The problem is actually making chips. The machines used to make modern integrated circuits are some of the most precise equipment in the world, manufacturing structures just tens of atoms across.
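A quick sanity check on "tens of atoms" (silicon's lattice constant is a textbook value; the ~10 nm feature size is an assumed figure for a modern node):

    # Sanity-checking "tens of atoms across". The silicon lattice
    # constant (~0.543 nm) is a textbook value; the 10 nm feature
    # size is an assumed figure for a modern node.

    SI_LATTICE_NM = 0.543
    feature_nm = 10.0

    print(f"~{feature_nm / SI_LATTICE_NM:.0f} unit cells across")  # ~18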

Getting more factories online might take close to a decade, and that's if anyone wants to pay: the current demand showed up basically overnight as some of the companies (running on investor money with no way to make a profit) started a bidding war. Betting billions of dollars on them still being around in 5-10 years is just not a wise decision.

mrandish 22 minutes ago | parent

> Betting billions of dollars on them still being around in 5-10 years is just not a wise decision.

Historically, dynamic RAM has gone through several boom/bust cycles, oscillating between manufacturers struggling to break even and cutting production, and then a few years later not being able to make enough chips. I remember the late 80s as another time when companies were delaying new product launches because they couldn't get DRAM.

WheelsAtLarge 55 minutes ago | parent

What's the current thinking on used RAM? Is it worth it?

rkagerer 16 minutes ago | parent

Avoid suspiciously low-priced bargains, give it a good several days of solid memtesting (I used to use Memtest86 and Memtest86+, plus Prime95 for good measure), and you should be fine.
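For the curious, here's a toy Python sketch of the write-pattern/read-back idea those tools are built on. It is emphatically not a real memory test: the OS, allocator, and CPU caches sit between user code and the physical DIMMs, which is why tools like Memtest86 run from bare metal.

    # Toy illustration of the write-pattern / read-back idea behind
    # tools like Memtest86. NOT a real memory test: the OS, allocator,
    # and CPU caches sit between this code and the physical DIMMs.

    import array

    WORDS = 1 << 20  # ~8 MB of 64-bit words; real tools walk all of RAM
    PATTERNS = (0x0000000000000000, 0xFFFFFFFFFFFFFFFF,
                0xAAAAAAAAAAAAAAAA, 0x5555555555555555)

    buf = array.array("Q", bytes(8 * WORDS))  # zero-initialized buffer

    for pattern in PATTERNS:
        for i in range(WORDS):
            buf[i] = pattern                       # write the pattern
        bad = sum(1 for w in buf if w != pattern)  # read it back
        print(f"pattern {pattern:#018x}: {bad} mismatched words")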

terrycody 38 minutes ago | parent

It's price gouging, no matter what they say.

manofmanysmiles 30 minutes ago | parent

I read recently that there is an effective cartel of Samsung, SK Hynix and Micron.

Price collusion, and dumping (flooding the market with low prices) if any real competitor shows up.

Someone please correct me if I'm wrong.

cocodill 29 minutes ago | parent

This could prevent people from making a lot of money. Here's why.

loose-cannon 17 minutes ago | parent

I'm curious about manufacturing from like 10-20 years ago. Would it be cheap to make a RAM chip from like 15 years ago? Or would it just not even work with our modern hardware? Like if we want more of something, can we pay with performance and efficiency instead? We seem to have this option with other types of technology.

raw_anon_1111 6 minutes ago | parent

Only TSMC, as far as I know, keeps fully depreciated fab lines online to make older processors for things that don't need the latest tech.

taki4416 3 minutes ago | parent

RAM factories can only produce memory chips, so companies are careful about expanding their production lines. If demand for memory drops later, they could suffer huge losses.

In fact, memory prices often swing when a new factory comes online. Device manufacturers usually stock up on memory when prices are low, so we don't really notice these swings when we buy products.

I work in the semiconductor industry, and I recently asked the same question of some experts around me. They told me there is a lesson from the early days of personal computers: improvements in operating systems reduced the amount of RAM needed, which caused serious problems for the memory industry.