Being the first company to ship HBM3E memory has its perks for Micron: the company has revealed that it has sold out its entire supply of the advanced high-bandwidth memory for 2024, and that most of its 2025 production has been allocated as well. Micron's HBM3E memory (or, as Micron alternatively calls it, HBM3 Gen2) was among the first to be qualified for NVIDIA's updated H200/GH200 accelerators, so it looks like the DRAM maker will be a key supplier to the green company.

"Our HBM is sold out for calendar 2024, and the overwhelming majority of our 2025 supply has already been allocated," said Sanjay Mehrotra, chief executive of Micron, in prepared remarks for the company's earnings call this week. "We continue to expect HBM bit share equivalent to our overall DRAM bit share sometime in calendar 2025."

Micron's first HBM3E product is an 8-Hi 24 GB stack with a 1024-bit interface, 9.2 GT/s data transfer rate, and a total bandwidth of 1.2 TB/s. NVIDIA's H200 accelerator for artificial intelligence and high-performance computing will use six of these cubes, providing a total of 141 GB of accessible high-bandwidth memory.
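As a sanity check, the quoted 1.2 TB/s follows directly from the interface width and per-pin data rate, and the 141 GB figure is slightly below the raw capacity of six 24 GB stacks (the numbers below come from the article; the arithmetic is just a back-of-the-envelope verification):

```python
# Back-of-the-envelope check of the HBM3E figures quoted in the article.
interface_bits = 1024   # interface width per stack, in bits
data_rate_gtps = 9.2    # giga-transfers per second per pin

# Each transfer moves one bit per pin, so bandwidth = width * rate / 8 bytes.
bandwidth_gbps = interface_bits * data_rate_gtps / 8   # GB/s per stack
print(f"Per-stack bandwidth: {bandwidth_gbps:.1f} GB/s")  # 1177.6 GB/s, i.e. ~1.2 TB/s

# Six 24 GB stacks on NVIDIA's H200 give 144 GB raw; 141 GB is accessible.
raw_capacity_gb = 6 * 24
print(f"Raw capacity: {raw_capacity_gb} GB (141 GB accessible)")
```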

"We are on track to generate several hundred million dollars of revenue from HBM in fiscal 2024 and expect HBM revenues to be accretive to our DRAM and overall gross margins starting in the fiscal third quarter," said Mehrotra.

The company has also begun sampling its 12-Hi 36 GB stacks, which offer 50% more capacity. These KGSDs will ramp in 2025 and will be used in next-generation AI products. Meanwhile, it does not look like NVIDIA's B100 and B200 are going to use 36 GB HBM3E stacks, at least initially.

Demand for artificial intelligence servers set records last year, and it looks set to remain high this year as well. Some analysts believe that NVIDIA's A100 and H100 processors (as well as their various derivatives) commanded as much as 80% of the entire AI processor market in 2023. And while NVIDIA will face tougher competition this year from AMD, AWS, D-Matrix, Intel, Tenstorrent, and other companies on the inference front, it looks like NVIDIA's H200 will still be the processor of choice for AI training, especially for big players like Meta and Microsoft, who already run fleets consisting of hundreds of thousands of NVIDIA accelerators. With that in mind, being a primary supplier of HBM3E for NVIDIA's H200 is a big deal for Micron, as it enables the company to finally capture a sizeable chunk of the HBM market, which is currently dominated by SK Hynix and Samsung, and where Micron controlled only about 10% as of last year.

Meanwhile, because every DRAM device inside an HBM stack has a wide interface, it is physically bigger than a regular DDR4 or DDR5 IC. As a result, the company said, the ramp of HBM3E memory will affect Micron's bit supply of commodity DRAM.

"The ramp of HBM production will constrain supply growth in non-HBM products," Mehrotra said. "Industrywide, HBM3E consumes approximately three times the wafer supply as DDR5 to produce a given number of bits in the same technology node."
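To put that 3x figure in perspective, here is a rough sketch of the trade-off it implies. The wafer budget and displaced-bit calculation below are purely illustrative; only the 3x ratio comes from Mehrotra's remarks, and real allocations are not public:

```python
# Illustrative only: applying the ~3x wafer-per-bit figure from Mehrotra's
# remarks. Actual wafer allocations and volumes are not disclosed.
WAFER_COST_RATIO = 3.0  # wafers needed per bit, HBM3E vs DDR5 on the same node

def ddr5_bits_displaced(hbm_bits: float) -> float:
    """DDR5 bits the same wafer supply could have produced instead."""
    return hbm_bits * WAFER_COST_RATIO

# Every gigabit of HBM3E output costs roughly three gigabits of potential DDR5.
print(ddr5_bits_displaced(1.0))  # 3.0
```

In other words, shifting capacity toward HBM3E tightens supply of commodity DRAM three times faster than the HBM bit output alone would suggest.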

Source: Micron

Comments Locked


  • PeachNCream - Friday, March 22, 2024 - link

    Typo error "...high-bandwidth memory or 2024..."

    Perhaps "or" should be "for"
  • Sonicsgp1 - Friday, March 22, 2024 - link

    You stole this. HBM3E. The patents for this belong to Netlist and you don't have a license, which is why your court case starts next week. Can't wait for the injunction. In Europe, courts are already putting together an injunction for a cease and desist. 😂😂😂 Good luck selling stolen products
  • trivik12 - Friday, March 22, 2024 - link

    I think netlist is a patent troll and will win nothing.
  • Samus - Friday, March 22, 2024 - link

    Good time to invest in Micron and nVidia. Only question is when the latter bursts because like Cisco 25 years ago, their products aren't entirely unique - they just have a head start over everyone else.
  • Dante Verizon - Friday, March 22, 2024 - link

    You mean SK Hynix*
  • Samus - Monday, March 25, 2024 - link

    Hynix is years away from the volume Micron has. They're behind on Yongin Semiconductor, Fab 1 won't be online until next year, and who knows when Fab 2-4 will be complete. Nice try tho!
  • Samus - Monday, March 25, 2024 - link

    More specifically (and a Google search will back this up), Micron has 26% of the DRAM market while Hynix has 30%. But within that market, Micron has overtaken Hynix on HBM3 since last year. Exact numbers are unavailable because we are talking about a new product that is barely shipping, but given that Micron has allotted its entire supply of HBM3E and is shipping, while Hynix just started production 6 days ago and won't ship for possibly months, there are two key takeaways here:

    1) Micron has more partner confidence as they were sampling over a year ago
    2) Micron is actively shipping product
  • Dante Verizon - Monday, March 25, 2024 - link

    Micron does not have 1/3 of the HBM production volume that SK Hynix has. You just looked at the specs and thought that translates into market dominance. Both volume and yield are lower.

    SK Hynix has always been the dominant player in the HBM market, no wonder it was chosen to help get the technology off the ground. At the moment it has 80% of the market.
  • Samus - Wednesday, March 27, 2024 - link

    Forgetting the fact a Google search proves you are full of shit, just think logically here: nVidia, the third most valuable private company on the planet, has exclusively partnered with a company which, according to you, has 20% of the market for a component nVidia needs for their SURVIVAL in AI?

    I might be going out on a limb here but I'd bet you don't know a lot about operating a successful business.
  • Dante Verizon - Wednesday, March 27, 2024 - link

    Nvidia and AMD will simply use whatever is available, that doesn't mean anything. I doubt Micron has 20% of the market, if they did it would be a huge victory for them.
