Intel and SoftBank team up to develop power-saving alternative to HBM for AI data centers, according to a report

SoftBank aims to dominate the AI chip memory sector by introducing a more power-efficient alternative to HBM.

America's biggest chipmaker, Intel, has joined forces with Japanese tech giant SoftBank to develop a 3D stacked DRAM alternative to High Bandwidth Memory (HBM) aimed at improving power efficiency for AI processors. The joint venture, called Saimemory, intends to deliver a working prototype and assess the feasibility of mass production by 2027, with commercialization targeted before the end of the decade, according to Nikkei Asia.

AI GPUs rely on vast amounts of fast temporary memory, and HBM chips are currently the go-to option for feeding them. However, HBM is difficult to manufacture, expensive, and power-hungry. Saimemory's approach stacks DRAM chips vertically and uses a new wiring scheme to connect them, targeting a stacked DRAM chip that consumes roughly half the power of a comparable HBM chip, a welcome prospect for data centers straining under AI's growing energy demands.

Should the project succeed, SoftBank plans to purchase a large share of the new chips. Today, only three companies dominate the HBM market: Samsung, SK hynix, and Micron. With surging demand for AI chips keeping HBM supplies tight, Saimemory aims to secure a foothold in Japan's AI data center market. It would also mark Japan's first serious attempt to become a major memory chip supplier since the 1990s, when the country controlled around 70% of the global supply before South Korean and Taiwanese competitors forced many local manufacturers out of the market.

Saimemory isn't the first to explore 3D stacked DRAM; Samsung and NEO Semiconductor have their own projects in the works. Samsung has been developing 3D and stacked DRAM concepts since last year, with designs that could expand memory modules to as much as 512GB, while NEO Semiconductor is focusing on 3D X-DRAM for increased capacity. Saimemory diverges from both by prioritizing power efficiency, a crucial concern for data centers grappling with soaring AI power consumption.


