
SK Hynix wants to significantly scale up production of AI memory chips to meet rapidly growing demand from data centers worldwide, SK Group chairman Chey Tae-won announced, Bloomberg reports. SK Group is the parent company of the South Korean chip manufacturer. The global wave of investment in AI infrastructure is putting the memory sector under considerable pressure: data centers need ever more specialized memory chips to train and run AI models, with high-bandwidth memory (HBM) playing an increasingly important role.
According to Chey Tae-won, capital investments in 2026 will be significantly higher than in previous years. The additional investment is needed to supply sufficient HBM chips, which are essential for AI accelerators such as those developed by Nvidia. These chips are used to train and run large-scale AI models and are an indispensable part of modern data centers.
SK Hynix will significantly scale up production of high-bandwidth memory (HBM) chips to meet rapidly growing data-center demand. Capital investments in 2026 will be substantially higher than in previous years to supply sufficient HBM chips for AI accelerators such as those developed by Nvidia. High-bandwidth memory is exceptionally profitable and has helped SK Hynix's share price more than quadruple over the past year amid record results. Major US technology companies are allocating about $650 billion for AI-related infrastructure this year, intensifying pressure on global memory supply. The market is concentrated among SK Hynix, Samsung Electronics and Micron Technology, and current demand has produced widespread shortages.
Read at Techzine Global