AI’s RAM Hunger Hands Samsung and Memory Rivals Record Profits

January 8, 2026
Close-up of Samsung memory chips on a circuit board

PC builders staring at $340 RAM kits have someone to blame: the AI boom that’s turning memory makers into profit machines.

Samsung, SK Hynix, and Micron are all reporting record-setting quarters as data centers hoard DRAM and high‑bandwidth memory (HBM) for generative AI workloads.

Samsung’s profit whiplash

Samsung Electronics says it expects between 19.9 and 20.1 trillion Korean won in operating profit for Q4 2025. That’s about $13.8 billion and more than triple the 6.49 trillion won it made in Q4 2024.

Samsung is much more than a memory company, but its earnings tend to move with the DRAM and NAND cycles. In 2023, it was on the wrong side of that curve, stuck with a memory oversupply that pushed its memory division into multi‑billion‑dollar losses. Two years later, tight supply and AI demand have flipped the script.

SK Hynix and Micron are cashing in too

Pure‑play memory vendors are even more exposed to the AI wave—and they’re loving it.

  • SK Hynix logged its “highest-ever quarterly performance” in Q3 2025, posting 11.38 trillion won in operating profit (about $7.8 billion), up from 7.03 trillion won in Q3 2024.
  • Its operating margin jumped from 40% to 47% year over year.
  • The company explicitly credits “expanding investments in AI infrastructure” and “surging demand for AI servers” for the jump.

Micron is seeing the same pattern even after it walked away from the consumer RAM and SSD business.

  • Net income climbed from $1.87 billion in Q1 2025 to $5.24 billion in Q1 2026.
  • CEO Sanjay Mehrotra says “total company revenue, DRAM and NAND revenue, as well as HBM and data center revenue and revenue in each of our business units, also reached new records [in fiscal Q1],” and the company generated its “highest ever free cash flow.”

When all three of the world’s major DRAM makers are posting record or near‑record numbers at the same time, you know something structural is going on.

Why your RAM suddenly costs 4x more

If you tried to build or upgrade a PC recently, you’ve felt the other side of this boom.

A 32 GB DDR5‑6000 kit that cost around $80 in August 2025 is closer to $340 by early 2026. That’s more than a 4x jump in under half a year.
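As a quick back-of-the-envelope check on those numbers (the two prices are snapshots of one kit class, not tracked market averages), the jump works out like this:

```python
# Rough arithmetic on the two snapshot prices quoted above (illustrative only)
old_price = 80     # 32 GB DDR5-6000 kit, August 2025 (USD)
new_price = 340    # same class of kit, early 2026 (USD)

multiplier = new_price / old_price
print(f"{multiplier:.2f}x increase")                             # 4.25x
print(f"${old_price / 32:.2f}/GB -> ${new_price / 32:.2f}/GB")   # $2.50/GB -> $10.62/GB
```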

So far, we haven’t seen similarly brutal price hikes in pre‑built laptops, phones, or GPUs, but most manufacturers expect that to change if shortages drag on.

There are multiple drivers, including scalpers hoarding popular kits to flip them at a markup. But two AI‑specific dynamics are doing most of the damage—and they reinforce each other.

1. You’re bidding against OpenAI for DRAM

The first problem is simple demand. The same standard DRAM that ships in consumer PCs and traditional servers is exactly what hyperscalers and AI companies are buying at scale.

By some estimates, OpenAI’s planned “Stargate” data center alone could consume as much as 40% of the world’s DRAM output—based on 2024–2025 production levels, without assuming any factory expansions.

That number may come down as new capacity comes online, but the signal is clear: AI players are no longer just big customers; they’re crowding everyone else out.

2. HBM eats wafer space, starving DDR5

The second problem is on the supply side.

The HBM stacks packaged with Nvidia’s AI data center GPUs take up far more silicon wafer area per bit of capacity than conventional DRAM.

According to the companies, producing a given amount of HBM uses roughly three times as much wafer area as the same amount of standard DDR5.

Every time a manufacturer retools a line from regular DRAM to HBM to feed AI GPUs, it doesn’t just shift one‑for‑one; it disproportionately cuts the potential output of DDR5 and other mainstream memory.
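A toy model makes the asymmetry concrete. The numbers below are illustrative, not company figures; the only input taken from the reporting above is the roughly 3x wafer-area ratio.

```python
# Toy model: what happens to mainstream DRAM output when wafer starts shift to HBM.
# Only figure taken from the article: HBM needs ~3x the wafer area per bit vs. DDR5.

WAFER_AREA_RATIO = 3.0  # wafer area per bit of HBM vs. per bit of DDR5


def relative_output(hbm_wafer_share: float, total_wafers: float = 100.0):
    """Return (ddr5_bits, hbm_bits) in arbitrary units for a given wafer split."""
    ddr5_wafers = total_wafers * (1 - hbm_wafer_share)
    hbm_wafers = total_wafers * hbm_wafer_share
    ddr5_bits = ddr5_wafers                       # 1 bit-unit per DDR5 wafer
    hbm_bits = hbm_wafers / WAFER_AREA_RATIO      # the same wafer yields ~1/3 the bits as HBM
    return ddr5_bits, hbm_bits


for share in (0.0, 0.2, 0.4):
    ddr5, hbm = relative_output(share)
    print(f"{share:.0%} of wafers on HBM: DDR5 output {ddr5:.0f}, HBM output {hbm:.1f}")

# Moving 20% of wafers to HBM cuts DDR5 output by 20% but yields only ~6.7 units of HBM,
# so total memory bits shipped fall even though AI customers get more of what they want.
```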

Either of these trends—AI demand or HBM reallocation—would have nudged RAM prices upward. Hitting both at once has launched them into the stratosphere.

How long does this last?

Right now, there’s little relief in sight if you’re buying memory instead of selling it.

  • Bank of America analysts, cited by SK Hynix, expect average DRAM selling prices to rise by as much as 33% in 2026 (see the rough sketch after this list).
  • They also project that by 2028, the HBM market alone could be larger than the entire RAM market was in 2024.
  • Micron’s Mehrotra expects both strong demand and constrained supply “to persist beyond calendar 2026.”

Put differently: the industry is planning for years, not months, of elevated memory pricing, driven heavily by AI infrastructure.

What could break the cycle

All of these bullish forecasts rest on one key assumption: that AI data centers keep scaling at their current pace.

If there’s an AI bubble and it bursts or gradually deflates, memory makers could end up right back where Samsung was in 2023—sitting on mountains of unsold chips and slashing prices to move inventory.

For now, though, the imbalance is very real. AI giants are paying whatever it takes to lock in DRAM and HBM supply, and the rest of the market—PC enthusiasts included—is footing the bill.
