Jenith
Well-known Member
After losing ground to Micron and SK Hynix in the high-bandwidth memory segment over the past couple of years, Samsung unveiled its first HBM4 chip at the Semiconductor Exhibition (SEDEX) 2025 expo in South Korea. The chip marks a major step in the company's effort to regain lost ground over the next few years with its most advanced memory product yet.
At the SEDEX 2025 exposition, held at COEX in Seoul, South Korea, from October 22 to 24, 2025, Samsung unveiled (via Asia Business Daily) its first HBM4 chip. HBM4, the sixth generation of high-bandwidth memory, is used in AI accelerators made by companies like AMD and Nvidia. These accelerators power the generative AI workloads of some of the world's largest firms.
The performance of Samsung’s HBM4 chip is crucial. If Nvidia decides to buy Samsung’s HBM4 chips, the South Korean firm could potentially earn billions of dollars in operating profit every quarter for the next few years.
https://www.imeisource.com/wp-content/uploads/2025/10/Samsung-HBM4-Chip-HBM3E-SEDEX-2025.jpg
Image Credits: Asia Business Daily
SK Hynix, Samsung's biggest rival, has also completed development of its HBM4 chip and showcased it alongside Samsung's at the same expo. It is reportedly in advanced talks with Nvidia for a large-scale supply deal. Micron, Samsung, and SK Hynix have all sent their HBM4 chips to Nvidia, which will test them over the next few weeks before deciding which company to award the contract to.
Samsung developed its HBM4 chips using its 10nm-class, sixth-generation (1c) DRAM process. This process is considered more advanced than the 10nm-class, fifth-generation (1b) process SK Hynix uses for its HBM4 chips. While the 1c process theoretically offers higher performance, it will remain unclear whose HBM4 chips perform better until Nvidia starts using them.
Currently, the best AI accelerators use HBM3E chips, and Samsung has been selling its HBM3E chips for about a year. HBM4 will be used in Nvidia's next-generation AI accelerator, Rubin. Samsung lost significant business last year due to issues with its HBM3E chips, and it aims not to repeat those mistakes.
The post Samsung unveils HBM4 chip that could be its biggest money-maker appeared first on imeisource.