Big tech invests billions in AI data centers globally
AI data centers will play a role in the global AI arms race as companies like Google, Amazon and Microsoft invest billions in building infrastructure to support generative AI. But as more AI data centers are built, companies will face issues such as local community skepticism and massive energy needs.
Global competition for AI infrastructure is heating up as big tech companies continue to announce AI data center builds both in the U.S. and beyond. Google plans to invest 1 billion euros ($1.1 billion) in expanding its data center in Finland to accommodate AI and $2 billion in an AI data center in Malaysia. Meanwhile, Amazon said it plans to spend $11 billion on new data centers in Indiana.
The U.S. government recognizes the economic impact of AI data centers, with President Joe Biden supporting Microsoft's $3.3 billion investment in building an AI data center in Racine, Wis. Biden highlighted how the AI data center will create 2,300 union construction jobs and 2,000 permanent jobs over time. Microsoft also said it will provide upskilling opportunities for Wisconsin workers.
As AI data centers crop up, U.S. government leaders will play a part in education, according to Forrester Research analyst Alvin Nguyen. For now, there is "fear, uncertainty and doubt" as local communities question the value of AI data centers, particularly given concerns that many of the jobs will be automated while the data centers consume significant amounts of local energy resources, he said.
AI data centers differ from traditional data centers in significant ways, Nguyen said. However, they still require a workforce to run them, which is where upskilling and education come into play. Government officials will also need to assess local energy needs moving forward.
"It can be a good business to have," Nguyen said of AI data centers. "But it needs to be done in balance with residents and other businesses."
AI data center energy consumption a concern
A data center is a facility often made up of multiple racks containing computing infrastructure supporting IT systems, such as servers and data storage.
For a traditional data center, Nguyen said a normal workload draws 4 to 10 kilowatts (kW) of electricity per rack. A rack for generative AI in a traditional data center uses more than 200 kW of electricity, he said. An Electric Power Research Institute report estimated that AI queries from models such as OpenAI's ChatGPT require more than 10 times the electricity of a traditional Google search query.
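As a rough illustration of the gap, the per-rack figures quoted above can be compared directly. The numbers below come from the article itself and are illustrative, not measured data:

```python
# Rough comparison of per-rack power draw, using the figures quoted above.
# Values are illustrative, taken from the article, not measured data.

TRADITIONAL_RACK_KW = (4, 10)  # typical range for a conventional workload
GENAI_RACK_KW = 200            # reported draw for a generative AI rack

low_ratio = GENAI_RACK_KW / TRADITIONAL_RACK_KW[1]   # vs. the high end
high_ratio = GENAI_RACK_KW / TRADITIONAL_RACK_KW[0]  # vs. the low end

print(f"A generative AI rack draws roughly {low_ratio:.0f}x to "
      f"{high_ratio:.0f}x the power of a traditional rack.")
```

That 20x to 50x spread is what Nguyen is gesturing at when he describes the jump as approaching two orders of magnitude.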
"You're looking at two orders of magnitude more in power density," Nguyen said. "For a lot of organizations using an older data center, increasing power density by that much is quite a bit. It's hard to retrofit that."
Nguyen said the challenge lies in supporting that higher power density while cooling the data center. Air cooling is common in traditional data centers, he noted, but once a data center exceeds 50 kW per rack, liquid cooling is necessary. Liquid cooling transfers the heat to water, which is then held in cooling ponds.
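The 50 kW threshold Nguyen cites can be sketched as a simple decision rule. The helper function below is hypothetical, written only to illustrate how that cutoff separates the two cooling approaches:

```python
# Hypothetical helper choosing a cooling approach from per-rack power density,
# based on the 50 kW air-cooling threshold cited in the article.

AIR_COOLING_LIMIT_KW = 50  # above this, air cooling alone is insufficient

def cooling_method(rack_kw: float) -> str:
    """Return the cooling approach suggested for a given per-rack draw."""
    return "air" if rack_kw <= AIR_COOLING_LIMIT_KW else "liquid"

print(cooling_method(8))    # a traditional rack stays within air cooling
print(cooling_method(200))  # a generative AI rack requires liquid cooling
```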
Workers accustomed to traditional data centers will also need upskilling as new AI data centers push power densities higher, he said.
"There's the fans, the noise, it's in some cases dangerous to be in there or you can't be in there as long because it's too hot, too noisy," Nguyen said.
Like internet and communications technologies, it will be critical to have AI capabilities spread as widely as possible, he said. AI is already part of the digital divide -- a gap in technology access lawmakers have spent years seeking to address.
"People who are able to leverage AI have a clear advantage over those who can't," Nguyen said. "Competition for resources will be more and more common, but this is where we need education for local officials, and we need new ways to create a toolkit to say how much power is available [and] does having a data center make sense."
U.S. will see more AI data centers
Even if they're not massive facilities like the one Microsoft plans to build in Racine, smaller localized AI data centers will likely start popping up in the future, especially in areas where companies face data regulations, Nguyen said.
One example of the need for smaller and localized AI data centers is the data sovereignty laws in countries such as France and China, which require personal data to be stored within their borders, he said. Another reason for localized data centers is reduced latency.
Indeed, general large language models are expensive to build and not as good at advising on specific issues, said Brian Hopkins, Forrester's vice president of emerging technology. He said organizations will likely need more customized models.
"The general vanilla instance of GPT is not going to be really good at advising my customers on what tensile steel strength to buy for their engineering needs," Hopkins said.
The demand for AI training and inferencing at edge locations will far exceed what AWS, Microsoft and Google can meet, he said. This is why big tech companies are investing billions in AI data centers.
Instead of pulling all applications into existing data centers, the strategy has shifted to distributing AI data centers to meet company-specific data training needs, Hopkins said.
Makenzie Holland is a senior news writer covering big tech and federal regulation. Prior to joining TechTarget Editorial, she was a general assignment reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.