Samsung ramps up HBM4 production for next-gen AI chips

Samsung is betting on the artificial intelligence market and expects to boost profits significantly by ramping up production of specialized HBM4 memory. These sixth-generation high-bandwidth chips are set to become key components in next-generation AI accelerators from Nvidia and AMD. Unlike commodity DDR memory, HBM carries substantially higher margins for manufacturers, which makes this segment particularly attractive.

According to available information, the company aims to boost HBM4 DRAM production to 120,000 wafers per month. This is a major move requiring substantial investment, but Samsung clearly intends to establish itself among the leaders amid explosive growth in demand for AI computing. The AI accelerator market is expanding rapidly, and prices for high-bandwidth memory are rising along with it.

To support this expansion, Samsung is upgrading the P4 line at its Pyeongtaek plant, installing new equipment and refining existing processes. The company seeks not just to increase output but also to improve production efficiency so it can compete on equal footing with SK Hynix and Micron.

Notably, Samsung recently lagged behind its competitors in the HBM3E segment, but the situation is now changing. Preliminary data suggests Samsung's HBM4 may outperform rival offerings thanks to a more advanced manufacturing process. If the company can consistently hit the stated volumes, it could not only catch up with competitors but also set the pace for the entire AI memory market.