SK Hynix's $75B investment in AI chips shows a growing trend
The South Korea-based chipmaker responds to the growing demand for AI memory chips with a big investment through 2028. Meanwhile, memory chipmakers step up production.
As the demand for AI chips grows with the popularity of large language models, the demand for compute and memory also grows.
In response to that demand, SK Hynix, a South Korea-based company that supplies memory chips to AI vendors such as Nvidia, revealed plans to invest about $75 billion in AI chips through 2028.
The development, announced on June 30, comes after South Korea said last month it will begin awarding eligible chipmakers financial support that will total about $19 billion.
SK Hynix plans to invest about 80% of the $75 billion in high-bandwidth memory chips. HBM chips consume less energy than conventional memory and stack multiple DRAM dies vertically, delivering the high bandwidth that AI accelerators require.
A demand for memory chips
SK Hynix, along with other memory chip providers such as Micron and Samsung, has sold out of its HBM chips for 2024 and 2025.
Demand for AI memory chips remains high and is still climbing, especially as AI technology is integrated into areas such as mobile devices and contact centers.
"Basically, there's going to be this continued growth and demand for this," Futurum Group analyst Dan Newman said. "You're seeing this massive frontload of demand for AI chips."
Despite tech giants such as Google, Meta and Microsoft buying thousands of GPUs from Nvidia, there is still a need for sufficient memory within AI systems, he added.
"Everybody kind of talks about can they get enough Nvidia chips, but they also have to be able to get access to enough memory," Newman said. "If we're going to continue to train bigger models and continue to offer AI in more of our applications, we're going to need to have the associated compute and memory access to do that."
That nearly unquenchable need for compute and memory is driving SK Hynix and competitors such as Samsung and Micron to expand production of their AI memory chips.
New production
For instance, Micron is building HBM testing and mass production lines in Idaho. Micron also revealed in April that it received $6.1 billion from the U.S. CHIPS and Science Act.
Samsung, for its part, has reportedly decided to begin construction of a new semiconductor plant after earlier delays.
"There's a war for memory chips because, ultimately, you need them with AI," Constellation Research founder R "Ray" Wang said. "If you sell more AI, you need more HBM."
The interest in memory chips shows that the rapid development of generative AI technology not only benefits vendors such as Nvidia, Intel and AMD, but also those supplying power and memory for the data center.
"It's a booming opportunity," Newman said. "Right now, memory is one of the biggest booming cycles of all the chips."
Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.