Samsung will supply half of next-gen DRAM chips needed by Nvidia

Jenith

After missing out on most of the lucrative AI memory chip business last year, Samsung has worked hard this year to improve its high-end memory chips and production efficiency. It recently sent samples of its sixth-generation high-bandwidth memory (HBM4) chips to Nvidia for final approval. Now, it is ready to supply another type of memory module needed in AI servers.

It has now been revealed that Samsung will supply half of the Small Outline Compression Attached Memory Module 2 (SOCAMM2) units that Nvidia needs.

Samsung to supply 50% of all SOCAMM2 DRAM modules Nvidia needs in 2026


Hankyung Insight reports that Samsung plans to supply more than half of the SOCAMM2 modules that Nvidia needs in 2026, and Samsung has reportedly confirmed its supply plans to the publication. SOCAMM, often described as a 'second HBM', is used in AI data centres, and Samsung has developed a second-generation SOCAMM module that Nvidia plans to use next year.

https://www.imeisource.com/wp-content/uploads/2025/12/Samsung-SOCAMM2-Memory-Module.jpg

While Micron was the world's first memory chipmaker to supply SOCAMM modules in large quantities and has been the biggest SOCAMM supplier this year, Samsung and SK Hynix appear to have improved production of their own SOCAMM modules. Samsung is said to have secured stable yields and performance for its fifth-generation (1c) DRAM chips, which are the main component of SOCAMM2 memory modules.

Multiple SOCAMM2 modules are placed next to Nvidia's Vera CPU on the company's Vera Rubin board. The Vera CPU controls the Rubin GPU, which ultimately does the heavy lifting in advanced AI processing. Previously, LPDDR DRAM chips were used instead of SOCAMM modules; SOCAMM combines four LPDDR DRAM packages on a single substrate and also makes the memory easier to upgrade.

https://www.imeisource.com/wp-content/uploads/2025/12/Nvidia-Vera-Rubin-NVL144-Tray-AI-Platform.jpg

Nvidia has reportedly asked the memory chip industry to supply up to 20 billion gigabytes (GB) of SOCAMM modules, and a contract is underway under which Samsung will supply half (10 billion GB) of the required volume. Approximately 830 million 24Gb LPDDR5X DRAM chips are required to build 10 billion GB of SOCAMM2 modules. That works out to an estimated 30,000 to 40,000 wafers per month, which is about 5% of Samsung's total monthly DRAM production.
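For context, here is a minimal back-of-envelope sketch in Python of how a wafer figure like that could be derived. The chip count, the 30,000 to 40,000 wafers/month range, and the 5% share come from the report above; the dies-per-wafer value and the assumption that the volume is spread over one year are purely illustrative, not published numbers.

```python
# Rough sanity check of the wafer estimate quoted in the report.
# Only article_chips and share_of_output come from the article;
# dies_per_wafer and the 12-month spread are illustrative assumptions.

article_chips = 830_000_000   # ~830 million 24Gb LPDDR5X chips (from the report)
months = 12                   # assumption: volume spread evenly over one year
dies_per_wafer = 2_000        # assumption: usable 24Gb dies per 300 mm wafer
share_of_output = 0.05        # report: roughly 5% of Samsung's monthly DRAM output

wafers_per_month = article_chips / (months * dies_per_wafer)
implied_total_dram_output = wafers_per_month / share_of_output

print(f"Estimated wafers per month: {wafers_per_month:,.0f}")                        # ~34,600
print(f"Implied total monthly DRAM wafer output: {implied_total_dram_output:,.0f}")  # ~692,000
```

With those assumptions the result lands inside the 30,000 to 40,000 wafers/month range quoted above, which is all the sketch is meant to show.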

Micron and SK Hynix will reportedly supply the remaining modules to Nvidia.

As Samsung gears up to supply HBM4 and SOCAMM2 chips to Nvidia and other AI chipmakers, it stands to earn billions in profit over the next couple of years.

Image Credits: Nvidia YouTube, Hankyung
