Brace for a barren landscape of new hardware launches, as AI demand reshapes the world of consumer electronics — trillions in AI investment threaten to derail entire industries
As DRAM demand outstrips supply, the future of consumer electronics looks uncertain.
CES 2026 is a show that usually paints us a picture of what's to come. But this year was different. The show was dominated by AI, with only a handful of product launches to speak of, Intel's Panther Lake and AMD's new AI 400 chips among them, and there wasn't exactly a whole lot to see. When the Consumer Electronics Show starts looking like the Corporate Electronics Show, alarm bells should start going off in your head.
Sure, Nvidia's upcoming Rubin platform looks incredibly impressive, but there were no new consumer GPUs to speak of. There was little to suggest what actual consumers and enthusiasts might look forward to for the rest of the year, and that's illustrated not only by what our staff on the show floor had to say, but also by the companies and supporting industries around it.
Now, we don't have to get into all the reasons why there's an ongoing NAND and DRAM shortage; we've explained it multiple times before. But how far-reaching are the impacts of the current demand for AI chips, and why is it going this far? The picture that current market conditions paint is grim, and it's already too late to prepare for the great consumer chip winter that is upon us.
Everyone is affected
As tech enthusiasts, we all want something new to look forward to. A new chip on the latest leading-edge node, packed to the gills with power to run the most demanding games without breaking a sweat. Breakthroughs in efficiency that bring down power envelopes, and chips that break records. But none of that seems to be happening in 2026 (unless you were one of the lucky few who purchased the $5,000 MSI Lightning Z RTX 5090).
Now, according to a Bloomberg report, the current tightened chip supply might push Sony's next PlayStation console to 2028 or 2029, with Nintendo contemplating raising the price of its Switch 2 consoles. Valve's Steam Frame VR headset and Steam Console still don't have any pricing details announced, as the company struggles to keep its four-year-old Steam Deck in stock. We've seen some impact on DIY hardware, too, with AMD noting an uptick in AM4-based builds over the newer, DDR5-only AM5 platform, likely because of how expensive memory has become for builders.
But it's not just gaming companies feeling the pinch. DRAM and NAND prices soaring to eye-watering levels are having a considerable impact on other areas. Think about every device that might use a RAM IC or house a small bit of flash: entry-level electronics like smartphones, and much more, all rely on memory and flash in some capacity, and all face the same pricing pressure. Even electronics such as routers are vulnerable to these pricing shifts. So, once current stocks run out, manufacturers will have to buy at market rates, which are being pushed significantly higher by AI data center demand.
The auto industry has also been subject to the onslaught of AI demand. EVs and other vehicles use specially qualified ICs (AEC-Q100) rated for extreme temperatures, and once stocks are gone, companies may find themselves scrambling for supply. “It’s not that we can no longer make semiconductors, or we can’t make enough semiconductors. We’re in a situation where the industry is constrained by qualification and requalification,” said Akshay Baliga, director at AlixPartners, in a recent interview with 3DTested Premium. These specialty chips are low-margin compared to AI parts, so memory makers have their eyes on a much bigger prize: the lucrative AI market.
The industry stands on a cliff's edge
New reports suggest that memory makers are set to earn a staggering $551 billion thanks to the AI boom, but it comes at a dear price. Additionally, memory and flash supply contracts are reportedly getting shorter: even if a company had a contract in place to lock in pricing, shorter contracts mean more exposure to market volatility, so end-product pricing and BOM costs could fluctuate. The obvious response from product manufacturers is to raise MSRPs on existing and upcoming retail products. This is likely one of the key reasons companies like Valve haven't announced pricing on upcoming consumer hardware quite yet: no one can predict how high market pricing might get, and that pricing will inevitably flow into consumer and enterprise products.
"The combined effect of these factors [related to increased memory prices] has resulted in a decline in mid to low-end [smartphone processor] orders received by foundries," said Zhao Haijun, co-CEO of SMIC, during an earnings call with financial analysts and investors.
Zhao went on to say that the ongoing cost increases may end in a decline in demand for products, which could have disastrous consequences for some companies that rely on such chips.
Smaller electronics makers may be disproportionately affected
To put the economics into context: if you're a larger customer of DRAM, you can likely secure better pricing terms or contracts; the smaller you are, the less leverage you ultimately have. Combine that with an AI industry that is not only soaking up supply but paying top dollar for chips, and smaller DRAM and NAND customers draw the short straw, and may be hit hardest.
Framework's latest update on the ongoing crisis states that DDR5 memory pricing is now between $12 and $16 per gigabyte, and that its end-product pricing has to rise as a result. "The new system and Mainboard prices are 6-16% higher than before. We anticipate that here as well, costs from our suppliers are going to continue to increase over the next few months," says Nirav Patel, CEO and founder of Framework.
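To put that per-gigabyte figure in perspective, here is a quick back-of-the-envelope sketch using the $12-16/GB range quoted above; the kit capacities are illustrative examples, not Framework's actual SKUs:

```python
# Back-of-the-envelope DRAM cost at the spot-price range Framework cites.
# The kit capacities below are illustrative, not Framework's actual SKUs.
PRICE_PER_GB_LOW = 12   # USD per GB of DDR5 (low end of quoted range)
PRICE_PER_GB_HIGH = 16  # USD per GB of DDR5 (high end of quoted range)

for capacity_gb in (16, 32, 64):
    low = capacity_gb * PRICE_PER_GB_LOW
    high = capacity_gb * PRICE_PER_GB_HIGH
    print(f"{capacity_gb} GB of DDR5: ${low}-${high} in memory cost alone")
```

At these rates, a 64 GB configuration carries up to around $1,000 in memory cost before any other component is counted, which is exactly the kind of BOM pressure that forces MSRP increases.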
So, if costs get higher, and consumer appetite for these products is lower, the future for smaller manufacturers gets called into question: How will they survive if product run-rates get lower, margins get slimmer, and there's seemingly no end in sight? The reality is, it's already too late to prepare for what's to come over the next few years, and the damage to these smaller companies has yet to be fully quantified.
IDC's latest analysis suggests that the PC market alone could shrink by up to 9%, which may not sound like much on paper, but a figure like this might be life-or-death for some businesses. If the chip supply crisis is hitting the PC market this hard alone, what about other industries, where the risk has yet to be factored in?
Why AI has an insatiable appetite for chips
Now that we've laid out the effects of AI demand on the memory and storage industries, it's worth looking at exactly how AI consumes these chips in large-scale deployments across the globe. The Nvidia Rubin NVL72 superchip is equipped with 288 GB of HBM4 memory, which uses vertically stacked memory ICs bonded together to pack more density into the same physical footprint; High Bandwidth Memory requires around three times as many ICs per package as a DDR5 module. That's in addition to 128 GB of GDDR7 VRAM on the Nvidia CPX GPU in any single unit, all bolstered by high-speed data interconnects like Spectrum-X Photonics Ethernet and Quantum-CX-9 Photonics for scale-out (photonics is the next AI bottleneck in line after memory and packaging).
A single Nvidia Rubin NVL144 rack integrates 144 GPUs, equating to a staggering 20,736 GB (roughly 20.7 TB) of HBM4 memory. So, if you're wondering where all the memory is going, look no further. The reason for this massive demand is the scale of modern AI models: as models grow, so do the number of parameters and weights they carry, and those weights need to be streamed from memory at enormous speed, which is why HBM's wide, fast interface is so crucially important. For example, Moonshot AI's Kimi 2.5 offers 1 trillion parameters in its latest Mixture of Experts (MoE) model, and can only be run in full-fat form on data-center-grade hardware.
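The rack-level math is simple enough to sketch. One assumption here, not stated in the text: the rack's 144 GPU dies ship as 72 packages of two dies each, with the 288 GB of HBM4 attached per package:

```python
# Sketch of the rack-level HBM math described above. Assumes the 144 GPU
# dies come as 72 packages (two dies per package), each package carrying
# 288 GB of HBM4 -- an assumption that matches the figures in the text.
HBM_PER_PACKAGE_GB = 288
PACKAGES_PER_RACK = 72

total_gb = HBM_PER_PACKAGE_GB * PACKAGES_PER_RACK
print(f"HBM4 per rack: {total_gb} GB (~{total_gb / 1000:.1f} TB)")
```

That's over 20 TB of the fastest, most IC-dense memory on the market in a single rack, and hyperscalers are deploying these racks by the thousand.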
Quantization is also a huge factor in AI deployment. Effectively, an AI model's weights (or 'experts' in an MoE model) are stored as high-precision values and mapped to lower-precision data types. The result is fewer bits per weight, which directly reduces the amount of VRAM the model occupies; Nvidia's NVFP4 format can offer a substantial reduction in memory usage. But despite efficiency gains from breakthroughs like NVFP4, KV caching, or DeepSeek's Engram, the race toward AGI means frontier model developers want every bit of compute they can get to train, develop, and run the latest and greatest models at scale.
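As a rough illustration of why precision matters, here is the weight-storage footprint of a 1-trillion-parameter model (the Kimi example above) at a few standard precisions. The bytes-per-weight values are the nominal sizes for each format; KV-cache and activation memory are ignored, so these are lower bounds:

```python
# Approximate weight-storage footprint of a 1-trillion-parameter model at
# different precisions. Real deployments add KV-cache and activation memory
# on top of this, so treat these figures as lower bounds.
PARAMS = 1_000_000_000_000  # 1 trillion weights

for fmt, bytes_per_weight in [("FP16", 2.0), ("FP8", 1.0), ("FP4", 0.5)]:
    tb = PARAMS * bytes_per_weight / 1e12  # decimal terabytes
    print(f"{fmt}: ~{tb:.1f} TB just to hold the weights")
```

Even quantized to 4 bits, a frontier MoE model needs around half a terabyte of memory for its weights alone, far beyond any consumer GPU, which is why this class of model lives exclusively in the data center.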
The product winter is only beginning
Spending on AI infrastructure (which includes memory and storage chips at an eye-watering scale) could surpass $3 trillion over the next five years. Tech giants like Meta, Amazon, and Microsoft have also dedicated around $650 billion in CapEx in 2026 alone to facilitate these AI capabilities. The long-term outlook as a result of this level of spending remains to be seen, but one thing is clear: not every company that we know today will survive the deep product winter that we're already in.
"Consumer electronics will see a large number of failures. From the end of this year to 2026, many system vendors will go bankrupt or exit product lines due to a lack of memory." Phison CEO Pua Khein-Seng reportedly said in a recent interview. He reportedly added that the soonest we might see reprieve from the ongoing AI onslaught is by 2030 at the earliest, or another decade.
The last helicopters have already left, and the consumer electronics industry, while remaining clearly profitable for a select few, might be unrecognizable once this is all over. Wrap yourself up warm, and arm yourself with as much compute as you reasonably require; it might be a long wait until a new norm is established.
