Dataiku intros GenAI cost monitoring tool: LLM Cost Guard
The tool lets customers stay provider-agnostic, integrating with models from providers such as OpenAI, AWS and Anthropic, while flagging potential cost overruns and breaking down where spending goes.
Dataiku on Wednesday introduced a cost monitoring product for generative AI.
LLM Cost Guard is a new component of LLM Mesh, the platform the independent AI vendor introduced in September to help IT teams build enterprise-ready applications.
With LLM Cost Guard, enterprises can trace and monitor their usage of large language model (LLM) applications to better anticipate the cost of generative AI, according to Dataiku.
The tool keeps customers agnostic about LLM providers, with integrations for models from OpenAI, Microsoft, AWS, Anthropic, AI21 Labs and Cohere.
Enterprises can identify potential cost overruns and gain insight into their LLM usage.
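Conceptually, provider-agnostic cost monitoring of this kind comes down to logging token usage per request and applying each provider's rates. The sketch below is a hypothetical Python illustration of that idea, not Dataiku's implementation; the price table, model names and budget threshold are assumptions made for the example.

```python
from collections import defaultdict

# Hypothetical per-1,000-token prices in USD; real provider rates vary and change.
PRICE_PER_1K = {
    ("openai", "gpt-4"): {"input": 0.03, "output": 0.06},
    ("anthropic", "claude-2"): {"input": 0.008, "output": 0.024},
}

class CostTracker:
    """Aggregates estimated LLM spend per application, across providers."""

    def __init__(self, budget_usd):
        self.budget_usd = budget_usd
        self.spend = defaultdict(float)  # application -> estimated USD

    def record(self, application, provider, model, input_tokens, output_tokens):
        # Estimate the cost of one call from its token counts and the rate table.
        rates = PRICE_PER_1K[(provider, model)]
        cost = (input_tokens / 1000) * rates["input"] + (output_tokens / 1000) * rates["output"]
        self.spend[application] += cost
        return cost

    def over_budget(self):
        """Return applications whose estimated spend exceeds the budget."""
        return {app: usd for app, usd in self.spend.items() if usd > self.budget_usd}

# Example: two calls to different providers, tracked under one application.
tracker = CostTracker(budget_usd=100.0)
tracker.record("support-chatbot", "openai", "gpt-4", input_tokens=1200, output_tokens=400)
tracker.record("support-chatbot", "anthropic", "claude-2", input_tokens=900, output_tokens=300)
print(tracker.spend["support-chatbot"])  # estimated USD so far
print(tracker.over_budget())             # empty until the budget is exceeded
```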
The cost of GenAI applications
The Dataiku offering comes as the generative AI market grows and evolves and pricing varies widely. Vendors are developing the pricing models they think make sense, but those models are likely to keep changing, and enterprises are trying to adapt to the price of generative AI applications.
Many providers bundle their generative AI applications into other products or charge a perceived-value fee: the price customers are willing to pay.
However, enterprises often are not getting the full picture of the cost of generative AI applications beyond the initial price tag, said Futurum Research analyst Keith Kirkpatrick.
"There's not a lot of visibility in terms of the actual cost," he said. "The use of generative AI is only going to increase. With any kind of enterprise standard, you want to have as much visibility as possible so you can really calculate ROI."
Thus, LLM Cost Guard and similar tools, such as Aporia's LLM cost tracking, aim to make it easier for enterprises to know the full cost of using generative AI applications.
Such tools are also advantageous because the vendors providing them are not tied to one LLM, Kirkpatrick said.
For example, a vendor such as Anthropic can publish pricing for its model and claim it is lower than competitors', but it remains a biased source.
"Having a third party to compare apples to apples is a good resource for organizations that are looking to deploy generative AI," Kirkpatrick added.
Scaling AI and planning for overruns
A tool like LLM Cost Guard is also beneficial for organizations looking to scale their AI projects, said IDC analyst Ritu Jyoti.
"Cost is the No. 1 inhibitor for scaling AI initiatives," she said. Moreover, for enterprises that prefer to use LLMs from different vendors for different applications, this will give them global visibility over their projects, she added.
Enterprises can also use tools like these to plan for overruns, such as a need for added talent or possible litigation arising from the use of a public-facing LLM, said Michael Leone, an analyst at TechTarget's Enterprise Strategy Group.
"Having a plan in place to address some of those components is definitely something to think about," he said.
However, Dataiku still needs to clarify what is automated and what will require human involvement, Leone added.
"There are components like tagging and tracking expenditures and how organizations can kind of assign costs to specific projects," he said. "Understanding how much is automated and where humans are required to go in and make adjustments of any areas, that's going to be really important."
Dataiku also needs to be able to ensure data security, privacy, accuracy and transparency with the use of disparate LLMs, Jyoti said.
Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.