
Google intros Mistral Codestral as a service on Vertex AI

The cloud provider differentiates its GenAI stance by offering the code-generating LLM as a service on the Vertex AI Model Garden, along with Mistral Large 2.

Google is now the first cloud provider to offer France-based startup Mistral AI's open source Codestral model, available as a fully managed service on the Vertex AI Model Garden.

Google revealed the development on Wednesday, the same day Mistral introduced its new model, Large 2.

Large 2 has a 128,000-token context window and supports dozens of languages including French, German, Spanish, Russian and Korean. It also supports more than 80 coding languages including Python, Java and Bash.

Model-as-a-service API

Mistral Large 2 is also available on Amazon Bedrock, Microsoft Azure AI Studio and IBM Watsonx.ai. But Codestral, Mistral's open-weight code generation model, is available only on Google's platform, not on those of its cloud provider rivals.

Google also added Mistral Nemo to the Vertex AI Model Garden, and it is offering Meta's Llama 3.1 405B there as a service.

Both the Codestral and Llama 3.1 model services are available in preview.

"This is our first set of true open source models as a service ... APIs that work with the full Vertex platform," said Jason Gelman, Google's director of product management for Vertex AI.

Google is enabling all Vertex features -- including tools for building and deploying machine learning and AI models, plus an array of third-party large language models -- for all its customers to use with the open source models right away, Gelman added. So, rather than just providing the models, Google will help enterprises manage them.

"You're getting the entire power of the Vertex platform, not just a model behind an API endpoint," Gelman said.


Staking a claim as the only cloud provider to offer Codestral fits Google's overall strategy, according to Futurum Group analyst David Nicholson. With Codestral, Google hopes to show enterprises that it can deliver greater value with a managed service, he said.

"The big value proposition that Google puts forward is the idea that by leveraging all their integrated pieces, you can try things very, very quickly, and fail quickly, if that's going to be the case," Nicholson said. "It's the Vertex part. The part where having something as a managed service so that you can bring your idea for what you want to achieve as a business outcome, and then the technology stack will be ready to go."

Google is also trying to differentiate itself from other cloud providers that boast model catalogs of their own, said Chirag Dekate, an analyst at Gartner.

"This is Google showcasing its GenAI leadership and its flexibility," Dekate said. "Its ability to deliver better innovation faster in the hands of their customers. It is about showcasing their model catalog difference."

The Mistral difference

Codestral is noteworthy because it comes from the France-based upstart Mistral AI, it's a domain-specific model for coding, and it's based on a different architecture than the transformers that underlie nearly all generative AI systems.

Its architecture enables it to match the performance of some transformer-based models while using considerably less compute.

"It lowers the cost of implementation," Dekate said. "It lowers the cost of operations."

The ability to offer better results with less compute is part of Mistral's strategy, according to Nemertes Research analyst John Burke.

Rather than offering models with hundreds of billions of parameters, Mistral claims that its 7 billion-parameter models perform like 21 billion-parameter models.

"I will just applaud Mistral for trying to wring more effective AI out of smaller AI," Burke said. "The trend of just piling on more parameters is, of course, not sustainable economically, not sustainable from an energy or carbon perspective, and probably not sustainable from a technological perspective for much longer. It's great to see somebody prominently bucking the trend."

Moreover, by forging partnerships with all three cloud providers, Mistral has emerged as a popular choice in a diversified generative AI model market, Dekate said.

"They are innovating at the forefront of GenAI modeling ecosystems and are creating products and solutions that are likely going to be adopted quite broadly by clients," he said.

Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.
