
Snowflake boosting its commitment to AI, including GenAI

Recent moves, including the appointment of a new CEO and the formation of a new partnership, reflect the vendor's heightened focus on artificial intelligence.

Like a pitcher who wants to announce their presence with authority by blowing a first-pitch fastball past an opposing batter, Snowflake of late has made major moves that demonstrate its heightened emphasis on AI.

AI, including generative AI, has been a significant focus for many data management and analytics vendors over the past year.

The data pipelines that feed traditional machine learning models as well as generative AI large language models (LLMs) have become more sophisticated. That has enabled more accurate predictive analytics using traditional AI as well as true natural language processing and the efficiency gains afforded by generative AI.

Snowflake, however, was perceived as slower than some of its competitors to make AI a focal point.

OpenAI's release of ChatGPT in November 2022 represented a significant improvement in LLM capabilities and set off the ongoing heightened focus on AI. Microsoft, Google and Databricks all quickly demonstrated a newfound emphasis on the technology.

Databricks and Google each developed their own LLMs within the first few months of 2023. Meanwhile, Microsoft deepened its partnership with OpenAI, including a reported $10 billion investment in the AI developer in January 2023.

Snowflake continued to advance its platform with the introduction of tools such as industry-specific versions of its data platform. But aside from its May 2023 acquisition of search engine vendor Neeva, its AI-related product innovations -- some unveiled in June 2023 and others in November 2023 -- have remained in preview.

Now that is changing. Most dramatically, Snowflake has a new CEO under whom the vendor is expected to emphasize AI.

Frank Slootman, known for taking Data Domain and ServiceNow public, stepped down on Feb. 28 after guiding the vendor through a record-setting IPO in September 2020. Former Neeva co-founder Sridhar Ramaswamy, who has a reputation as a technological innovator, was named Slootman's replacement.

Six days later, Snowflake unveiled its first partnership with a generative AI developer, revealing an integration with Mistral AI. In addition, Snowflake moved Cortex, a suite of tools aimed at enabling customers to develop traditional AI and generative AI models, to public preview on March 5 after it was introduced in November 2023.

With Snowflake now showing its commitment to AI, Baris Gultekin, the vendor's head of product for AI, recently took time to discuss Snowflake's AI strategy.

In addition to reviewing the decision to partner with Mistral AI and the development process for Cortex, he spoke about Snowflake's AI strategy under Ramaswamy and the perception that Snowflake has been slower to embrace AI than its closest competitors.

Snowflake just unveiled a partnership and integration with Mistral AI, which is Snowflake's first with a proprietary AI vendor. What made Mistral a good partner for Snowflake compared with other AI vendors?


Baris Gultekin: We had an integration with [Meta's open source] Llama 2 before Mistral AI. But Mistral Large is the first proprietary model that we're making available to our customers.

The reason we chose Mistral is that we were impressed by their speed and innovation. Within just 10 months, they were able to build a model that competes with the state of the art. Right when we announced the partnership, Mistral Large outperformed other language models in benchmark testing. Being [a top performer] is a hard feat, and they achieved it with a small team. We were excited to align with them and partner with them, and I expect us to do even more things together in the future.

Going forward, are there plans to add integrations and partnerships with other AI vendors?

Gultekin: Yes, more are coming. We're excited about what is in the works, and we'll be announcing them soon. What we aspire to do is to provide the most capable models to our customers. We want to give them a choice.

At the core, our customers want these integrations because they want [AI models] running inside the Snowflake security perimeter. The goal is to bring the compute to where their data is, because that makes it easier for customers to use these models, makes it more secure to use the models and makes it easier to govern the use of the models.

Looking more broadly, how would you describe Snowflake's AI strategy and commitment to AI under Frank Slootman's leadership?

Gultekin: The AI strategy revolves around the data strategy. The Data Cloud is our strategy, which means we want to make it easy for our customers to bring all their data into one place to manage, govern and secure it. Our customers trust us with this valuable and sensitive asset. When we talk to our customers, everyone wants to take advantage of these AI models that are coming out, and they want to do that while preserving their investment in having their data in one place.

Our strategy is figuring out how to enable our customers to make the most out of these capabilities that are coming out without compromising on security and governance while making it as simple as possible to use AI.

Obviously, there was a recent CEO change. Under Sridhar Ramaswamy, how will Snowflake's AI strategy and commitment to AI change?

Gultekin: Sridhar is a technologist and an innovator, and I've been fortunate to work with him. I came to Snowflake shortly after him, and he's crafted our AI strategy. With him now the CEO, our strategy hasn't changed because he's been at the core of setting it up. We are investing in AI, and I expect we will continue to invest in AI.

It's an important part of our strategy.

Where does generative AI fit into Snowflake's overall AI strategy?


Gultekin: The way we think about this is that generative AI is the new, hot, emerging opportunity, especially as it relates to unstructured data. It enables many new capabilities that were impossible before. When we talk to our customers, all of them have a spreadsheet of 500 use cases that they'd like to try. Everyone is transitioning from these ideas to proofs of concept to production. We're right at the beginning of the transformation. So when I think of the opportunity with generative AI, I think now is the perfect time to be investing and working with customers.

If you zoom out, machine learning has been around for a long time. Meanwhile, distilled data -- structured data -- is the core of business decision-making. So we're making a ton of investments there as well. We're trying to make it easy for customers to bring their ML models inside Snowflake's security perimeter and govern those models. That's a natural fit for us, and so is generative AI. We're investing in both [traditional AI and generative AI].

Getting into some specific plans, Databricks and Google have each developed their own LLMs. Does Snowflake have plans to do so as well?

Gultekin: Currently our focus is to bring the best that's out there to our customers. We don't have an announcement on this. We're, of course, evaluating the best scenarios -- what the right thing is for our customers. But currently the focus is offering the full spectrum of AI models for our customers, which includes proprietary models as well as open source models.

How does Cortex work?

Gultekin: I think of Cortex as both an umbrella of products and, at its core, an engine that runs LLMs and search. The engine allows us to run open source and proprietary LLMs with high efficiency by providing serverless access to them. The Cortex functions we've already announced allow direct access to the LLMs, or customers can access them through applications we've built on top of Cortex, such as Document AI, which processes PDFs and extracts information directly from text. With Document AI, you can process millions of documents and extract structured data. We also have a copilot product, which is a text-to-SQL assistant, and a universal search product that lets customers search their metadata in natural language.

All those products are built on top of the core Cortex engine.
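
To make the idea of direct access to the LLMs more concrete, here is a minimal sketch of what calling a Cortex LLM function from Python might look like, based on Snowflake's publicly documented SNOWFLAKE.CORTEX.COMPLETE SQL function. The connection details and the support_tickets table with its ticket_text column are hypothetical placeholders, and model names and availability may differ from what was in preview at the time of this interview.

```python
# Minimal sketch: calling a Snowflake Cortex LLM function from Python.
# Assumes the snowflake-connector-python package and a role/warehouse with
# access to Cortex. The table, column and connection values are placeholders;
# the 'mistral-large' identifier follows Snowflake's public documentation
# but may change over time.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical connection details
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
)

try:
    cur = conn.cursor()
    # Serverless LLM inference runs next to the data: the prompt is built
    # from a table column and the query executes inside Snowflake.
    cur.execute(
        """
        SELECT SNOWFLAKE.CORTEX.COMPLETE(
            'mistral-large',
            CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
        ) AS summary
        FROM support_tickets
        LIMIT 5
        """
    )
    for (summary,) in cur.fetchall():
        print(summary)
finally:
    conn.close()
```

Because the model is invoked as a SQL function, the prompt and the underlying table data stay within the platform's governance and security boundary, which is the "bring the compute to the data" point Gultekin makes above.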

When will Cortex be generally available?

Gultekin: We're moving fast, and now it's available to all our customers. We'd like to make it generally available as soon as possible, and we'll announce that soon. We're working hard toward getting that ready.

After the Mistral AI partnership was revealed, analysts I spoke with expressed the view that Snowflake is behind competitors on AI, both in terms of developing AI capabilities and providing tools that enable customers to build AI applications. What's your response to that?

Gultekin: I will zoom back and talk about ML. It is indeed true that Databricks started with ML, while Snowflake started with more of a focus on analytics. When it comes to AI, I would say that everyone is at the beginning of the journey, and we are moving incredibly fast. I would not say we are behind at all. When you look at our offering, you see that what we put out there in November [with Cortex] essentially provides all the building blocks that our customers have been asking for. And they are now available in public preview.

Overall, the speed of development -- the pace of development -- is incredible, and the products resonate with customers.

Going back to the perception of the analysts, whether accurate or not, how do you go about changing that perception?

Gultekin: Ultimately, we need to deliver. We need to change the perception by showing. We are just starting. The Mistral AI announcement is a good example. It's not just an announcement -- we announced it, and it's live. Mistral launched their model a few days before [the partnership was revealed], and we are one of the first companies to have it live, running and available for customers. And we'll continue doing that. We will show by executing.

Looking broadly, what do you view as critical AI and generative AI capabilities that any data cloud vendor needs to provide at this point?

Gultekin: I believe that customers must have access to the most capable models from multiple places. From their perspective, the way they differentiate [from their own competitors] is by using their own data. If we can make it easy for customers to take these models and enable them to use the models with their own data, then we will make them successful.

Looking ahead, what capabilities do you think will represent the next generation of AI and generative AI capabilities?

Gultekin: There are two trends I'm seeing. They go in opposite directions, but I think both will play out.

I'm seeing an opportunity to fine-tune LLMs -- to create smaller but equally performant models. Customers will take capable general-purpose models and fine-tune them so they are more efficient, faster, cheaper and equally high quality. That trend will continue, and I'm super excited about that. That's organizations taking their data and getting something that is focused on their use cases.

The opposing trend is bigger and more capable models. Every single developer is building bigger and bigger models that are amazing at complex tasks such as reasoning. They are essentially becoming orchestrators that can call functions and eventually turn into core enterprise operating systems that can pull in data from multiple places and then help create outputs. I'm excited about that direction as well.

Lastly, again, looking broadly, how would you characterize the current state of AI development?

Gultekin: The industry is moving incredibly fast, in a way that I have never seen in my 20-something-year career. I've never seen this pace, this excitement, this energy. All of us are moving at a pace that I haven't seen before. That makes me excited about the future. And I'm excited about Snowflake's role in this future because we can come at it from a unique angle. There is something here in bringing data and compute to the same place.

Editor's note: This interview has been edited for clarity and conciseness.

Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
