What vendors must know about the AI assistant craze
More vendors are introducing products to help enterprises and consumers complete mundane tasks. But vendors need to be strategic and transparent with these products.
This year may be the year of generative AI, but it's also a year of AI assistants.
From online retailers like Shopify to cloud providers like Google, more vendors are releasing AI assistants, and some are even spicing them up with generative AI.
Most recently, plans surfaced for Google to revamp its AI assistant and infuse it with generative AI.
The market
The market for AI assistants spans different industries. Assistants are pervasive in e-commerce and customer service, and the market is also full of AI assistants that help craft content.
For example, last month, Google introduced NotebookLM, an experimental offering that the tech giant said is an AI research assistant for summarizing facts, explaining ideas and brainstorming new ideas.
Also in July, e-commerce provider Shopify released Sidekick, an AI-enabled commerce assistant for business owners. Sidekick answers business owners' questions as they start and scale their businesses. The tool responds to prompts such as "set up a discount for my holiday sale" or questions like "what are my best-selling products?"
While AI assistants such as Siri and Alexa and transcription assistants like Firefly and Otter AI have been firmly embedded in consumer and business markets for some time, the arena keeps expanding with the added excitement -- whether overblown or not -- about generative AI.
A need for strategy
However, with more AI assistants entering the market, vendors can fail to evaluate their core strategy and the reason they're putting an assistant out into the market, said Liz Miller, an analyst at Constellation Research.
"The market seems to believe that digital assistants ... [don't] take strategy -- a content strategy and intention to have those stood up and deployed," she said, adding that vendors should not only think through about what the AI assistant is going to do but also where the data the assistant will use is coming from and how that data will get updated.
"There are a lot of strategic questions and a lot of sausage making that goes into having one of these virtual assistants be up and running and ready to go," Miller continued.
Enterprises interested in AI assistants might also consider a strategy that leading providers of AI assistants are using called retrieval-augmented generation, Gartner analyst Bern Elliot said.
This approach goes beyond providing a chatbot application, he said. Instead, retrieval-augmented generation generates responses by combining pre-trained LLMs with company-specific datasets.
In this way, it retrieves information relevant to what the application user is looking for and turns it into a conversational response to the user's question.
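At a high level, that loop looks like this: embed the user's question, fetch the most relevant company documents, then hand both to a pre-trained model to produce a grounded answer. The Python sketch below is only an illustration of the pattern, not any vendor's implementation; the bag-of-words "embedding," in-memory document list and placeholder prompt are stand-ins for the embedding model, vector store and hosted LLM a production assistant would use.

# Minimal sketch of retrieval-augmented generation (RAG).
# Hypothetical stand-ins: a toy word-count "embedding," an in-memory
# document list and a placeholder prompt instead of a real LLM call.
from collections import Counter
from math import sqrt

# Company-specific documents the assistant is allowed to draw on.
DOCS = [
    "Holiday sale discounts can be set up from the Discounts page.",
    "Best-selling products are listed in the Analytics dashboard.",
    "Refunds are processed within five business days.",
]

def embed(text: str) -> Counter:
    """Toy 'embedding': word counts. Real systems use a trained embedding model."""
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 1) -> list[str]:
    """Pull the k documents most relevant to the user's question."""
    q = embed(question)
    ranked = sorted(DOCS, key=lambda d: similarity(q, embed(d)), reverse=True)
    return ranked[:k]

def answer(question: str) -> str:
    """Ground the prompt in retrieved context, then hand it to the model."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # Placeholder: a production assistant would send this prompt to a pre-trained LLM.
    return prompt

print(answer("What are my best-selling products?"))

The point of the sketch is the shape of the pipeline: the model never answers from its training data alone; it is handed the retrieved, company-specific context every time it responds.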
For enterprises that want to take the retrieval-augmented generation route when building AI assistants, it's important to first speak with the vendor about its large language model.
"If you don't get that kind of information from them, then you don't know what they're doing," Elliot said.
Getting this information helps enterprises and anyone else thinking of creating AI assistants learn how the vendor maintains the model the assistant is built on and what governance is needed to make sure information that shouldn't go out doesn't.
"There are a lot of considerations," Elliot said. "The vendors have to up their game in terms of what advanced function they have and how they're accomplishing them."
Transparency and hype cycle
Enterprises are not the only ones that need to practice transparency with AI assistants, Miller said. Those who consume and use these assistants will also want transparency, especially in e-commerce and customer service spheres.
"[The AI assistants] that are successful are transparent," Miller said. "People who pretend their virtual assistants are like a real-life human being, those will always fail. Humans will figure out if they are actually speaking with a human or if they are speaking with a machine or a computer."
While the AI assistant market is still in its early stages, marketing buzz that seems to hype nearly everything as an assistant, even if it's not, is hurting the people it's supposed to help, Miller said.
"People -- consumers, employees, people -- are already burning out from it," she said. "There will be resistance, and there will be a backlash to it because people are already saying, 'Okay, that's great, but how much time does it really save?'"
Enterprises and consumers could be past the puffery. They are now asking hard questions of vendors, such as where the data comes from, whether the vendor has permission to include the data in the model and whether the data is needed in the model the assistant uses.
The problem is that vendors also appear entranced by the way LLMs and diffusion models (models that can generate data similar to the data they are trained on) are unlocking data and capabilities that help take repeatable, tedious tasks off people's hands, Miller continued.
"But if we don't stop the hype cycle sooner rather than later, no one's going to … trust it," she said.