The Korean tech giant also said that it has developed the industry’s first and highest-capacity 32 gigabit (Gb) DDR5 DRAM, using the industry’s leading 12-nanometer process technology, the report added.
Samsung will start supplying high bandwidth memory, or HBM3, chips — a newer generation of memory optimized to work with artificial intelligence, or AI, accelerators — to Nvidia as early as October, the report added, citing industry sources.
Nvidia and Samsung have also agreed that the Korean company will supply HBM3 chips next year. Samsung would likely supply about 30% of Nvidia’s HBM3 needs in 2024, the report added.
The company’s shares gained the most since January 2021. Until now, Samsung’s competitor SK Hynix was Nvidia’s only provider of HBM3 chips.
Last month, Samsung provided Nvidia with samples of its fourth-generation high bandwidth memory (HBM3) chips for quality checks on Nvidia’s A100 and H100 Tensor Core graphics processing units, or GPUs, the report added.
SK Hynix also reportedly said last month that it had provided samples of its new high-performance HBM3E chip to Nvidia for evaluation.
Nvidia makes GPUs used in generative AI services such as ChatGPT, developed by Microsoft (MSFT)-backed OpenAI. ChatGPT is known to use about 10,000 units of Nvidia’s A100 chip, and HBM3 DRAM is an important component of the A100, the report added.
Samsung will also reportedly supply its HBM3 chips to Advanced Micro Devices (AMD) following a successful evaluation on AMD’s Instinct MI300X accelerators, according to the report.
With these two supply deals, Samsung’s share of the global HBM chip market is expected to exceed 50% next year.
In addition, Samsung is in discussions to offer chip packaging services for Nvidia’s GPUs and AMD’s central processing units. Nvidia has been heavily dependent on Taiwan Semiconductor Manufacturing (TSM) for its chip packaging. With TSM’s packaging process lines nearly fully booked, Nvidia and other fabless companies are turning to other foundry players, as per the report.