Google and Hugging Face unveil AI partnership
The partnership reflects the cloud provider's support of open source and aims to appeal to enterprises looking to move from ideation to implementation of generative AI workloads.
Google Cloud and open source generative AI platform provider Hugging Face on Thursday revealed that they have partnered to enable developers to use Google Cloud's infrastructure for all Hugging Face services.
The new partnership allows developers to train and serve open models using Google Cloud's compute infrastructure, including tensor processing units (TPUs) and GPUs.
Developers can now train Hugging Face models with Google's Vertex AI from the Hugging Face platform. The partnership also means Hugging Face developers will have access to Cloud TPU v5e AI accelerators and future support for A3 VMs, which are powered by Nvidia H100 Tensor Core GPUs.
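In practice, a developer might use the google-cloud-aiplatform Python SDK to register an open model's weights and serve them from a Vertex AI endpoint. The sketch below is illustrative only, assuming placeholder values for the project ID, Cloud Storage bucket, serving container image and machine shapes; the exact integration flow the partnership ships may differ.

```python
# A minimal sketch of serving an open Hugging Face model on Vertex AI using the
# google-cloud-aiplatform SDK. Project, bucket, container image and machine
# types below are illustrative placeholders, not details from the announcement.
from google.cloud import aiplatform

# Point the SDK at a Google Cloud project and region (placeholders).
aiplatform.init(project="my-gcp-project", location="us-central1")

# Register model artifacts (e.g., weights exported from the Hugging Face Hub
# and copied to Cloud Storage) as a Vertex AI Model resource. The serving
# container is a stand-in; any image that implements Vertex AI's prediction
# contract can be used.
model = aiplatform.Model.upload(
    display_name="llama-2-7b-chat",                  # example open model
    artifact_uri="gs://my-bucket/llama-2-7b-chat/",  # placeholder GCS path
    serving_container_image_uri=(
        "us-docker.pkg.dev/my-gcp-project/serving/text-generation:latest"  # placeholder
    ),
)

# Deploy to a GPU-backed endpoint. The partnership also promises TPU v5e and,
# later, H100-based A3 VMs; the machine and accelerator shown are examples.
endpoint = model.deploy(
    machine_type="g2-standard-12",
    accelerator_type="NVIDIA_L4",
    accelerator_count=1,
)

# Send a test prediction request to the deployed endpoint.
print(endpoint.predict(
    instances=[{"prompt": "Summarize the Google-Hugging Face partnership."}]
))
```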
Google and open source
The partnership comes amid the growing popularity of open source tools and technologies, such as Meta Llama 2, and increasing criticism of models that were once open and are now closed, such as those from OpenAI.
For Google, the partnership is a step forward, Futurum Research analyst Mark Beccue said.
Google has open source projects such as its TensorFlow platform and Kubernetes cluster management system. However, most of its offerings are not open.
"[Google] moving toward open source shows how the market is evolving," Beccue said. "We're seeing this trend where there's private pieces to things and there's open source. It's a combination of these for everybody."
Moreover, it's important for vendors with proprietary models not only to say they support open source models, but also to show that they contribute to them, Forrester Research analyst Mike Gualtieri said.
"This is following a very similar pattern to software itself," he said.
Hugging Face's benefit
For Hugging Face, the partnership advances its mission of collaborating with different organizations to build an open platform for developers, according to head of product Jeff Boudier.
"This collaboration will help us offer the best of open source through the Hugging Face platform," Boudier said.
He added that Google's contributions to open science, its hardware offerings of GPUs and custom TPUs, and its Vertex AI machine learning platform make this a viable partnership.
Hugging Face already has partnerships with other cloud providers, including AWS and Microsoft.
However, what differentiates Google from Microsoft and AWS is that Google employs the Jupiter data center fabric, which enables Google to "create direct connect capability across GPUs," Gartner analyst Chirag Dekate said.
"From a model scalability perspective, you can start from the smallest of models and scale to the largest of models with incredible efficiency and performance," he continued.
The benefit for enterprises
Hugging Face and Google Cloud are not the only players benefiting from the partnership.
For enterprises, partnerships like these can help take them from the ideation stage of generative AI to implementation, Dekate said.
Currently, the three top cloud providers -- AWS, Microsoft and Google -- are competing to offer full-featured AI stacks. These include everything from AI infrastructure to model catalogs to MLOps and generative AI applications.
"As enterprises accelerate the exploration and implementation of generative AI, they are trying to explore which cloud provider ... aligns best with their enterprise workloads," Dekate said.
Enterprises are also trying to find the right-sized model for their data context while introducing more modalities into their architectures, he continued. One way to do this is by choosing a provider that offers an array of different models or is model-agnostic, like Hugging Face.
"Enterprises benefit from not just Google Cloud's AI platform, but also from Hugging Face's models," Dekate added.
Furthermore, many enterprises run hybrid, multi-cloud environments, with applications spread across AWS, Microsoft Azure and Google Cloud Platform. A partnership like this gives them the opportunity to use Hugging Face on any of those platforms.
"I would argue that is an incredible value accelerator for enterprises that are accelerating their implementation vectors of generative AI," Dekate said.
Esther Ajao is a TechTarget Editorial news writer covering artificial intelligence software and systems.