
Predicting 2025's top analytics, AI trends in healthcare

Identifying the value proposition of analytics technologies and establishing AI governance frameworks will be key for healthcare organizations in 2025.

In 2024, healthcare organizations increasingly explored potential use cases for advanced AI and analytics tools to improve care, streamline administrative workflows and increase efficiency. Generative AI, or GenAI, dominated much of the health IT conversation throughout the year, as it had in 2023.

But looking ahead to 2025, analytics and AI trends in healthcare are shifting toward a focus on creating governance frameworks and prioritizing the technologies with the most significant value propositions, according to analysts from Gartner, IDC and KPMG.

'Pilot fatigue' and ROI

Many organizations have made meaningful progress in AI adoption, particularly for administrative use cases like revenue cycle management, billing and EHR workflow optimization. While excitement around AI technologies -- including GenAI -- has grown significantly in recent years, health systems have begun to face "pilot fatigue" in the last year, according to Vince Vickers, a KPMG U.S. healthcare technology leader.

"Everybody wanted to jump in [to the AI space] because they saw the promise, and they wondered, 'How do we apply that in healthcare?'" he explained. "And there are some amazing thoughts that are out there, but we have a large component of the healthcare market that just hasn't had the governance in place to get the most out of AI."

The hype around these tools has put pressure on stakeholders to adopt AI while ensuring a clear return on investment (ROI), a balance that is particularly challenging in healthcare.

"In terms of AI, healthcare organizations are seeking the safe path to value now, and every part of that statement is important," noted Jeff Cribbs, vice president, distinguished analyst and key initiative leader at Gartner. "They're getting a ton of pressure from their business and clinical partners to do something with AI. It's unprecedented. Never have business and clinical leadership exerted so much pressure for organizations to be working on AI. But leaders are also asking -- all the time -- 'What is the value of AI?'"

He indicated that many early AI implementations have yet to deliver the level of transformational impact anticipated by health system leadership. Identifying the path to value for the adoption of any tool is crucial, but the rapidly evolving landscape of AI makes this difficult in the healthcare industry, leading many organizations to hesitate.

"Most health systems don't want to be the first to do this kind of work. They really want examples of organizations that have made it to value, that have demonstrated value, that later on, they can point to. They're looking for somebody that's established the path," Cribbs continued.

Some healthcare organizations are working to establish this path, a trend that is likely to continue in 2025, according to Lynne A. Dunbrack, group vice president of public sector at IDC.

"Healthcare organizations are moving from proof-of-concept projects to in-production use cases," she stated. "We have seen a surge in using AI and analytics for personalized medicine, predictive analytics for early disease detection and increased adoption of AI-powered clinical decision support tools."

Dunbrack further noted that IDC's August 2024 "Industry Tech Path" survey found that 26.1% of healthcare respondents have a proof-of-concept project for GenAI-enabled clinical decision support systems, and 40.6% report that this use case is in production.

Despite the slow progress of some healthcare AI deployments, Vickers expressed optimism about these technologies' potential to disrupt the EHR and precision medicine markets in 2025.

He pointed out that while EHRs have digitized patient records, the tools haven't necessarily made providers' lives easier. AI could improve the functionality and usability of EHRs by enabling more integrated, cloud-based systems that offer predictive insights and bolster patient care.

Vickers added that these technologies could also boost the patient and caregiver experience, noting that AI-powered multiagent systems can help streamline the patient journey. Further, modalities like ambient listening can reduce the time providers spend on administrative tasks, allowing them to focus more on direct care.

Vickers also suggested that breakthroughs in AI-driven research, drug discovery and genomics could lead to significant advancements in precision medicine in 2025.

However, having a robust governance strategy for adopting and evaluating AI tools is critical to the success of these efforts.

Governance and scalability

Governance plays a key role in ensuring that AI applications are used effectively and safely in healthcare, especially as the tools continue to exist in a regulatory "gray area."

As of December 2024, the U.S. FDA has authorized nearly 1,000 AI- and machine learning (ML)-enabled medical devices. The agency also finalized new, nonbinding guidance to streamline the process for approving modifications to these devices.

But some in the industry have lamented that the regulatory environment is struggling to keep pace with innovation in the AI space, leading to concerns about the technology's deployment.

As is typical following a presidential election, some policy shifts are likely in the most tightly regulated industries -- including healthcare -- but there are many unknowns about how the second Trump administration will approach healthcare AI regulation.

Executive Order 13960 -- signed by Trump during his first administration -- focused on promoting the use of trustworthy AI within the federal government, including a requirement for the U.S. Department of Health and Human Services to regularly publish an inventory of planned and current AI use cases.

The Biden administration furthered these efforts with the October 2022 release of its Blueprint for an AI Bill of Rights and the Executive Order on Safe, Secure and Trustworthy AI signed a year later.

These frameworks established guardrails to promote safety and protect Americans' privacy within AI applications across industries; however, they are nonbinding, like the FDA's recent guidelines, spurring some healthcare stakeholders to criticize them as insufficient.

Vickers noted that the slow pace of regulatory efforts has led many healthcare organizations to prioritize AI governance and infrastructure in-house, with many creating new roles like chief analytics officer or chief AI officer to ensure that AI applications are adopted responsibly.

Some have even opted to join collaborative entities like the Coalition for Health AI (CHAI), which aims to bring together public and private partners to advance responsible AI use in the industry. However, some of CHAI's work -- which has recently focused on establishing a network of nationwide health AI assurance labs to evaluate these tools -- has come under scrutiny from Republican lawmakers who are concerned about conflicts of interest, as CHAI was founded by health systems, such as Mayo Clinic, alongside companies, such as Google and Microsoft.

Cribbs said that predicting the potential regulatory environment heading into 2025 is challenging, but highlighted that regulation is just one factor in the conversation that healthcare stakeholders are having when navigating the AI landscape.

The bigger question revolves around doing the work to establish norms and best practices for building AI governance structures for healthcare entities. He noted that creating this infrastructure and designing oversight frameworks to monitor these technologies will be crucial in the event of any regulatory loosening that might occur across industries.

Vickers cautioned that though such regulatory relaxation could encourage innovation, it could also pose risks if new AI tools are adopted too quickly as a result.

Dunbrack predicted that health systems will likely prioritize the development of fair, unbiased and transparent AI algorithms as they navigate potential shifts in regulation.

"Healthcare organizations will increase their focus on improving data quality and implementing responsible and ethical AI solutions, with strong governance frameworks as the foundation to ensure data accuracy, completeness and privacy compliance," she said, citing the IDC survey finding that 41.2% of organizations reported that training for healthcare professionals is a top guardrail necessary to protect against the pitfalls of GenAI deployment.

Scaling AI tools across different parts of healthcare workflows presents another potential obstacle: without it, healthcare stakeholders cannot maximize the value these solutions could provide, much of which centers on giving care teams more time for meaningful interactions with patients.

Vickers stressed that a clear governance structure is key to successful scaling. Much as the internet became ubiquitous once users saw its potential across broad applications, he noted, AI tools are likely to be embedded into nearly every aspect of healthcare in the next three to five years if organizations can scale them effectively.

"I'm starting to see some pieces of that," he said. "I think the fear of AI technology is starting to diminish. People see the power of it, and -- as long as it has that governance and some guardrails around it so that it doesn't negatively impact care -- I think we'll see some breakthroughs this year."

Preparing for the challenges ahead

All three analysts agreed that AI tools -- particularly GenAI -- are going to remain a major focus in 2025. But as these technologies continue to advance, healthcare stakeholders will need to address the obstacles that accompany those innovations.

Vickers underscored concerns about growing cybersecurity threats in the industry, noting that increased uptake of emerging technologies is likely to create potential vulnerabilities for cybercriminals to exploit. To combat this, he suggested that healthcare organizations pay close attention to any new threats that might surface during technology deployments and anticipate how to prevent data breaches.

Cribbs emphasized that hurdles like AI bias, data drift and the need for ongoing model monitoring are already constraining the deployment of these technologies across the industry. Similarly, Dunbrack highlighted that data access and quality have been, and will continue to be, pain points, with many healthcare organizations reporting that data bias, trustworthiness and risk management are among their top concerns when implementing GenAI.

Issues like cost, a lack of resources and workforce considerations are also top-of-mind for stakeholders.

Vickers emphasized that the aging healthcare workforce -- particularly those who had negative experiences during the widespread integration of EHR systems -- might feel resistant to adopting new technologies or struggle to embrace ongoing technological shifts. Professionals who are newer to the industry might be more receptive to AI initially, he noted, but that depends heavily on making sure that AI-enabled workflows do not lead to unanticipated bottlenecks or inefficiencies.

However, some health systems are already anticipating these AI adoption challenges and working to tackle them early on.

In a recent episode of Healthcare Strategies, OSF Healthcare leadership shared how the organization is pursuing mandatory ongoing education around GenAI to ensure its workforce is properly prepared for the technology's proliferation.

Despite initiatives like these, Cribbs called attention to the fact that, currently, many smaller or rural healthcare organizations might not be able to take advantage of emerging technologies, creating the potential for innovation inequity.

"There is a risk that we're going to create all kinds of innovative solutions that are really only available for sophisticated health systems," he explained. But he added that GenAI has seen uptake in healthcare that other tools -- like blockchain -- haven't, with health systems and vendors working quickly to integrate AI into their product offerings and workflows, making the technology more accessible than many other innovations.

Alongside novel tools like GenAI, Cribbs pointed out that some healthcare organizations are also beginning to explore advanced computing methods, including edge computing, neuromorphic computing and quantum clusters -- although the cost and complexity of deploying these approaches are expected to limit widespread adoption in the near term.

Much as they have in the past few years, analytics and AI trends in healthcare are primed to shake up the industry in 2025. By prioritizing data quality, governance and high-value use cases, health systems can more effectively navigate the ever-shifting digital health landscape.

Shania Kennedy has been covering news related to health IT and analytics since 2022.
