Google extends generative AI leadership at Google Cloud Next

Industry analyst Mike Leone unpacks the wave of generative AI announcements at the recent Google Cloud Next conference, including updates to Vertex AI and Duet AI.

In a theme that is sure to permeate the rest of the fall show season, Google Cloud put its generative AI chops front and center with the most packed set of AI-centric announcements we've seen yet in the enterprise technology market.

Google Cloud Next 2023 was loaded across Google's core pillars of data and AI, modern infrastructure, collaboration and workspaces, and security. We heard from Google Cloud leadership, partners and customers. And, of course, we saw some eye-opening demos that highlighted the enterprise readiness that is core to everything Google Cloud is doing with generative AI.

Customer momentum

Google Cloud's generative AI offerings showed traction early on with customers and partners such as Wayfair, Wendy's, Priceline, Orange, Vodafone, Uber and the Mayo Clinic, and that momentum increased sharply at Next 2023.

Success stories from customers highlighted Google Cloud's leadership position in the generative AI market. Estée Lauder Companies, Bayer, Deutsche Bank, HSBC, Shopify, Fox Sports, GE Appliances and GM are just a few of the newly announced customers pursuing truly innovative AI projects in partnership with Google Cloud.

While it's clear that organizations spanning different industries across the globe continue to embrace Google Cloud for generative AI, one data point really stood out to me: 70% of AI unicorns are current Google Cloud customers. These are companies like AI21 Labs, Anthropic, Cohere, Runway, Replit, Typeface and Jasper, with whom many in enterprise tech are looking to partner to complete their own generative AI initiatives. If that isn't a stamp of approval, I'm not sure what is.

Vertex AI

Vertex AI is Google Cloud's comprehensive AI platform, where customers can build, train, deploy, monitor and scale machine learning models. It serves as the foundation for all things generative AI within Google Cloud, providing access to more than 100 foundation models: Google's own models, industry-specific models such as Sec-PaLM 2 and Med-PaLM, open source models and third-party models.

That third-party access expanded at Next with the addition of Meta's Llama 2 and Anthropic's Claude 2 models to the Vertex Model Garden. But the eye-opening stat for me came during a breakout session: The Vertex AI platform saw 15x growth in customer adoption from April to July 2023.
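To make that developer experience concrete, here is a minimal sketch of calling one of Google's first-party text foundation models through the Vertex AI Python SDK. The project ID, region and prompt are hypothetical placeholders, and the snippet assumes the google-cloud-aiplatform package and Application Default Credentials; it illustrates the general pattern rather than code shown at the event.

```python
# A minimal sketch (not from the article) of calling a PaLM 2 text model on Vertex AI.
import vertexai
from vertexai.language_models import TextGenerationModel

# Assumes a project with the Vertex AI API enabled; IDs below are hypothetical.
vertexai.init(project="my-gcp-project", location="us-central1")

# "text-bison" is the PaLM 2 for Text foundation model surfaced in the Model Garden.
model = TextGenerationModel.from_pretrained("text-bison")

response = model.predict(
    "Summarize our Q2 support-ticket themes in three bullet points.",
    temperature=0.2,        # lower temperature for more deterministic output
    max_output_tokens=256,  # cap the length of the generated summary
)
print(response.text)
```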

Other announcements included new versions of Google's foundation models, with quality, performance and tuning improvements to PaLM, Imagen and Codey. Colab Enterprise will also become generally available on Vertex AI, giving organizations a way to combine the simplicity of Colab notebooks with the enterprise-grade security and compliance capabilities of Google Cloud.

The newly announced Vertex AI Extensions aim to let customers connect foundation models to real-time data and external APIs so the models can retrieve fresh information and take action. And for the two use cases seeing the greatest uptick in adoption, there were updates to Vertex AI Search and Vertex AI Conversation.
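For a sense of what building on Vertex AI Search looks like in practice, here is a minimal sketch using the Discovery Engine client library that underpins it. The project ID, data store ID, query and serving config name are hypothetical assumptions, and the exact client surface may vary by SDK version; treat it as an illustration of the query pattern, not a definitive implementation.

```python
# A minimal sketch (not from the article) of querying a Vertex AI Search data store
# via the Discovery Engine client library (google-cloud-discoveryengine).
from google.cloud import discoveryengine_v1beta as discoveryengine

client = discoveryengine.SearchServiceClient()

# Serving config path for an existing search data store; the IDs and the
# "default_config" serving config name are hypothetical placeholders.
serving_config = client.serving_config_path(
    project="my-gcp-project",
    location="global",
    data_store="my-enterprise-docs",
    serving_config="default_config",
)

request = discoveryengine.SearchRequest(
    serving_config=serving_config,
    query="What is our parental leave policy?",
    page_size=5,
)

# The client returns a pager; iterating it yields individual search results.
for result in client.search(request):
    print(result.document.id)
```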

Another set of announcements focused on accelerating the path to production with enterprise-ready AI. How can organizations trust everything associated with generative AI, including the data, models and output? With interest at an all-time high, companies need to establish trust through security, privacy, governance, bias detection and explainability.

Google Cloud continues to spend a lot of time on the topic of responsible AI, and we saw some fantastic announcements at Next. Google Cloud again assured customers that it does not use their data to train its foundation models.

To that end, it introduced new governance and privacy controls to protect customer data when customizing foundation models. VPC Service Controls, customer-managed encryption keys and Access Transparency are all generally available for generative AI on Vertex AI and coming in the next month for Search and Conversation. And new grounding for the PaLM API and Vertex AI Search is in private preview, ensuring that grounded model output is based on an enterprise's own data.

Duet AI expansion to Google Cloud

A natural, easy and fast way to use generative AI in your organization is through a copilot experience: an intelligent chatbot assistant that enables users of all skill levels to ask questions within an application. Google introduced Duet AI in the spring as its always-on AI collaborator, delivering that kind of help wherever users need it.

The initial announcement positioned Duet AI as an extension of Google Workspace. Recognizing the power Duet AI can bring to an organization, Google announced the expansion of Duet AI to most of Google Cloud at Next. This means more stakeholders can now use Duet AI to assist with not only code creation and chat assistance, but also operations, data analysis, data science, database management, security management and interoperability.

It all comes down to improving productivity. With Duet AI trained to deliver smarter, more contextual recommendations, I believe this personalized, intent-driven approach will truly enable teams to build modern, scalable apps faster and more confidently than ever.

A strong and open partner ecosystem of 100,000 companies

One area we continue to hear about from organizations is the need for guidance and hands-on help in ramping up generative AI initiatives. Our research shows that organizations are turning to the partner ecosystem when they need to address skills gaps or need help selecting, implementing or managing infrastructure to support AI initiatives. That means third parties like management consultancies; systems integrators; value-added resellers; and infrastructure, platform and application providers.

At Next, Google Cloud spent a lot of time sharing its partner strategy because the organization recognizes the importance of providing flexibility to its customers based on maturity, use case and domain. The open partner ecosystem isn't anything new for Google Cloud, but as it relates to generative AI, it is the most robust partner story I've seen so far.

There are the usual tiers in Google's partner ecosystem, such as service providers, SaaS and application partners, and technology and platform partners. But there's also a large group of open source software and partner foundation model providers, as well as a growing list of supporting data providers.

Any partner strategy is ultimately about reaching a wider audience, providing deeper expertise, reducing costs and risks, and accelerating innovation. Google Cloud's open ecosystem of AI partners is setting the standard for the industry.

Cost is the gating factor

Google Cloud did an excellent job creating a sense of "FOMO" at Next. Google showed just how easy it can be to adopt generative AI technology and quickly infuse it into your business to improve efficiency, productivity, collaboration and innovation.

The on-ramp is easier than ever. Bringing your own data is simple. Maintaining privacy and security, while still a concern for many, is being addressed in several ways. Clearly defined and working use cases are readily available. So what did Google miss?

In general, cost transparency continues to be a challenge. Organizations are left with the question of how much all this costs. Of course, it depends on several factors, but I didn't see any cost guidance whatsoever. That's a problem. Imagine a world where you're an existing Google Cloud customer, you've identified your first use case and you have clean data ready to go -- but you're shell-shocked by the initial price tag.

I recognize the idea of pricing to value and pricing competitively. I understand the underlying infrastructure needs to be powerful. But there is a growing need for greater cost transparency to let customers know what they're getting into. What does using generative AI for enterprise search cost for a company with 5 PB of data just to get started? What are the expected recurring costs?

Think back to some of the horror stories about the surprise cloud bills organizations experienced when they first migrated to the cloud. Generative AI costs have the potential to dwarf those bills. More pricing transparency is imperative for broader adoption -- and, I should note, this isn't just a Google Cloud problem but an industry problem.

Maintaining a generative AI leadership position

Google Cloud continues to make significant investments in generative AI, and it's paying off. Google has launched new infrastructure services that are intended to be more scalable, resilient and efficient. It's growing its partner ecosystem to offer greater flexibility to customers. It's simplifying the application development lifecycle. It's enabling customers to collect, manage and analyze data more rapidly. It's committed to security and sustainability. It continues to set responsible AI standards that provide organizations with trust, resiliency and confidence while reducing risk. And customers are throwing money Google's way because of it.

Enterprise Strategy Group is a division of TechTarget. Its analysts have business relationships with technology vendors.
