Amazon invests additional $4 billion in Anthropic
The cloud provider will be the primary training partner for the generative AI vendor. The LLM maker will also train all its models on Trainium and Inferentia.
Amazon has invested another $4 billion in foundation model provider Anthropic.
On Friday, Amazon revealed that it is now the primary AI model training partner for Anthropic. Anthropic will also use AWS’ Trainium and Inferentia AI chips to train and deploy its future foundation models.
The new funding brings Amazon's total investment in the AI startup to $8 billion.
Last September, Amazon made its initial $4 billion investment in Anthropic, whose flagship model is Claude.
"AWS doubling down on this relationship hedges against the future in terms of how these tools will develop," said Futurum Group analyst David Nicholson.
A move against Nvidia
However, the tech giant is not only betting on Anthropic but also planning to run all of Anthropic's foundation and large language models (LLMs) on its own chips, Nicholson said.
"This is huge because this will be a demonstration platform to everyone," he said.
If AWS runs Claude and Anthropic's other foundation models on Nvidia's AI chips as well as its own, it will likely be able to compare the two, show other vendors the difference and prove that it doesn't need Nvidia, Nicholson added.
"They've got to walk this fine line because they need Nvidia now," he continued.
Ultimately, this $4 billion investment shows that the tech giant and other hyperscalers can take charge of their own destinies by running on their own chips, Nicholson said. "I would go out on a limb and say that this is an indication that Nvidia margins will be under assault at a faster rate than maybe we anticipated. This is a big shot across the bow against Nvidia."
Some hyperscalers would prefer that customers not know what hardware is behind their technology, he added.
By stepping out on its own, AWS is helping move the market back in that direction.
"They want to sell the outcome to whatever problem you're trying to solve, and they don't want you to care about whether it's AMD or Intel or Nvidia or whatever," Nicholson said. "That gives them the opportunity to manage their costs on the back end and pick the best fit for function and make it easier for customers."
GenAI strategy
The new investment also provides insight into AWS' generative AI strategy, said Constellation Research analyst Andy Thurai.
"AWS is going after model trainers," Thurai said.
He added that most LLM providers, including Anthropic, are losing money training LLMs.
"Staying away from that arms race and becoming a platform for those providers will bring more value to AWS. By demonstrating this, they can also appeal to custom model trainers, small language model trainers, and agentic AI producers, which is a huge market."
Moreover, AWS' recent partnership with Snowflake suggests it is building an ecosystem in which it provides both AI hardware and software, Thurai continued.
Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.