Samsung missed out on Nvidia's most expensive AI card but beats Micron to 36GB HBM3E memory — could this new tech power the B100, the successor of the H200?

Techradar · 1y ago

Samsung has unveiled the industry's first 12-layer (12H) HBM3E DRAM, offering high bandwidth and capacity that could benefit Nvidia's AI cards. The HBM3E 12H delivers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, surpassing Micron Technology's offerings. It uses advanced thermal compression non-conductive film (TC NCF), yielding a 20% improvement in vertical density over Samsung's previous product. Samsung has already begun sampling the memory to customers, with mass production slated for the first half of 2024. Micron, for its part, will ship its 24GB 8H HBM3E in Q2 2024.
