SOUTH Korea’s SK Hynix said on Thursday (May 2) its high-bandwidth memory (HBM) chips used in artificial intelligence (AI) chipsets were sold out for this year and almost sold out for 2025 as businesses aggressively expand AI services.
The Nvidia supplier and the world's second-largest memory chipmaker will begin sending samples of its latest HBM chip, the 12-layer HBM3E, in May and start mass production in the third quarter.
“The HBM market is expected to continue to grow as data and (AI) model sizes increase,” chief executive officer Kwak Noh-Jung told a news conference. “Annual demand growth is expected to be about 60 per cent in the mid-to-long-term.”
Last month, SK Hynix announced a US$3.87 billion plan to build an advanced chip packaging plant in the US state of Indiana with an HBM chip line and a 5.3 trillion won (S$5.2 billion) investment in a new Dram chip factory at home with a focus on HBMs.
Kwak said investment in HBM differed from past patterns in the chip industry in that SK Hynix had a clearer sense of customer demand, because capacity was being expanded only after consultations with customers.
Last week, SK Hynix said in a post-earnings conference call that there may be a shortage of regular memory chips for smartphones, personal computers and network servers by the year’s end if demand for tech devices exceeds expectations.
By 2028, chips made for AI, such as HBM and high-capacity Dram modules, are expected to account for 61 per cent of all memory volume in terms of value, up from about 5 per cent in 2023, SK Hynix's head of AI infrastructure Justin Kim said. REUTERS