Can AI-driven care coordination software improve workflows?
By integrating an AI platform across dozens of hospitals and outpatient settings, University Hospitals aims to streamline care coordination and improve outcomes.
Care coordination -- the organization of patient care across different healthcare providers -- is key to ensuring that treatment is safe, appropriate and effective.
Facilitating information-sharing between care teams can reduce confusion for patients and streamline care management by cutting down on repeat lab tests, unnecessary trips to a provider's office and medication issues, according to the U.S. Centers for Medicare & Medicaid Services (CMS).
CMS further emphasizes that ineffective care coordination can lead to a host of adverse consequences, including poorer health outcomes, medical errors, increased use of emergency care, poor care transitions and higher out-of-pocket costs for patients.
The Agency for Healthcare Research and Quality (AHRQ) indicates that achieving coordinated care requires healthcare organizations to adopt a blend of broad approaches commonly deployed to improve care delivery -- like the use of health IT -- and care-coordination-specific activities, including creating proactive care plans in line with patients' needs and goals.
Tools like EHRs have a major role to play in these efforts, as they can bolster information-sharing within existing clinical workflows. However, emerging technologies capable of analyzing vast amounts of healthcare data, such as AI, are being touted as the next step in improving care coordination.
In a recent interview, Donna Plecha, chair of radiology at University Hospitals, detailed how the organization's adoption of AI-enabled care coordination software is helping to improve patient outcomes without burdening clinicians.
Improving care coordination workflows
Plecha underscored that University Hospitals is an early adopter of AI technology, with years of researching and testing various tools already under the organization's belt. She indicated that the health system has already used AI-driven portable chest X-ray tools within its intensive care units to screen for collapsed lungs and to locate devices like central venous lines, nasogastric tubes and endotracheal tubes.
This year, however, University Hospitals has begun working with healthcare AI company Aidoc to deploy its AI platform -- aiOS -- across 13 hospitals and dozens of outpatient locations.
In this first phase of the partnership, the health system is piloting the company's suite of FDA-cleared AI algorithms for radiology, triage and care coordination. Plecha explained that in the next phase, University Hospitals will review the data generated from those pilots and use those insights to inform additional AI integrations in clinical workflows.
She explained that in pursuing this partnership, the health system seeks to overcome some of the common challenges associated with care coordination workflows, such as high patient volumes and clinician burnout rates.
"Radiologists are in short supply with the amount of volumes that we're dealing with, and we're dealing with burnout," Plecha stated. "So, if these tools -- these algorithms -- can make us more efficient and decrease the load for certain tasks that we have during our day, then that will help us be better at what we're doing."
The use of these tools could also help clinicians prioritize patients with urgent lab findings or screening results, she continued.
"The radiologist makes the final read on all of the studies, so there's nothing being done without a radiologist, but it might help the radiologist be more efficient," she said, emphasizing that some research comparing radiologist and AI tool performance alone to the performance of a radiologist using AI has shown promising results in favor of AI-assisted clinical decision support systems.
While University Hospitals is not currently utilizing all of the algorithms that Aidoc offers, Plecha noted that the organization is using one designed to screen for pulmonary embolism, which can be life-threatening if not diagnosed early. She explained that a small blood clot in the lungs can be subtle and difficult to catch, but that finding these clots is key to improving patient outcomes.
She stated that the use of AI to assist with these screenings has significant potential value in terms of care coordination, but suggested that stakeholders should implement tools cautiously.
"There are some false positives and negatives when you work with any kind of AI tool," she explained. "So, you have to go in with realistic expectations -- and not change your diagnostic acumen --to be able to use the [AI] information, but not have it be the final say."
Identifying appropriate AI use cases
To that end, Plecha indicated that homing in on organization-specific AI use cases is crucial for successfully adopting the technology for care coordination improvements. She noted that this is an opportunity to talk to other healthcare providers about the tools they are having success with, which is useful for identifying high-value AI applications and assessing how they might fit in with existing workflows or tools.
She underscored that evaluating and minimizing possible redundancies when selecting AI tools is important, as choosing a solution that does part or all of what an existing tool does would not be a wise use of resources and investment dollars.
At University Hospitals, part of this audit involved noting that the organization already uses AI to screen for issues like pneumothorax and intracranial hemorrhage, eliminating these as possible use cases for consideration.
The assessment did, however, highlight a gap in pulmonary embolism screening, allowing the health system to capitalize on AI's potential value-add in that area.
"We don't want competing AI tools, that would just make things more confusing," Plecha said. "So, we just have to be cognizant on a system-wide level to make sure that all the pieces of the puzzle fit together."
But adopting any AI tool also requires some heavy lifting from an organization's IT and health informatics groups, who conduct what Plecha calls an "architectural review" of any tool being brought in to determine how the solution will integrate with current workflows.
"That takes a long time to figure out -- it can take a while to test to see if [the AI] is ready to go and see if it actually will work," she stated, indicating that University Hospitals utilizes a workflow driven by a picture archiving and communication system (PACS) to help model a tool's functionality within clinical workflows.
"We want to interact with specific radiologists who have more knowledge in the AI world to understand how the interface is going to happen, what the end result for the end user will look like and if it's easy to use," Plecha continued.
Doing so can help prevent the integration of a tool that might hamper clinician efficiency or create more workflow-related burdens.
After conducting the architectural review, the health system can deploy and monitor the tool. Following a tool's pilot, the organization can then pull data about the integration to assess the ROI and make any changes necessary to improve workflow efficiency.
Measuring ROI for analytics tools
Measuring ROI and workflow efficiency will differ somewhat depending on the type of AI tool and the corresponding workflow, Plecha emphasized. Since the Aidoc partnership is still in its early stages, the data necessary to assess its ROI is limited for the time being, but there are some key metrics that apply to the care coordination use cases the health system is pursuing.
"We could look at the number of found positive findings that are true positives, how many true negatives, how many false positives, how many false negatives, and just overall, how many more of these, let's say lung clots, blood clots, are we finding?" she noted. "How many [pulmonary embolisms] are we finding compared to what we did in the last six months before we implemented AI?"
She explained that when using AI to screen for pneumothorax, looking at measures like time to treatment is particularly valuable.
"If you have a patient in the intensive care unit who has a collapsed lung, and they're on a list -- and they're number 30 on that list -- it might take hours to get to that result and to let the caretakers know," Plecha stated. "But if this case is flagged and automatically moved up to the top of your list [by AI], that patient could get a chest tube placed within 40 minutes."
She highlighted that a metric like time to treatment can also be applied to AI tools for pulmonary embolism and other conditions in the context of care coordination. Further, she indicated that many AI vendors have standardized metrics that they use to gauge tool efficacy, which can be useful for health systems to consider.
However, one of the major performance indicators that healthcare organizations can't overlook when exploring AI adoption is clinician buy-in and satisfaction.
"It's a learning process, and we're having ongoing conversations with our clinicians," she said, noting that feedback from care teams and shared learnings across the enterprise is key. "We learn as we have more experience with the tools."
This experience is particularly valuable as AI-enabled care coordination software becomes more prevalent.
"For radiologists that are worried about being replaced: I think the radiologist that would be replaceable is the one who isn't interacting with AI, and the one that isn't replaceable is the one who is interacting with AI. You have to embrace the technology, and work with it, and figure out the best way to incorporate it into your workflow," Plecha concluded. "With the shortage of radiologists, if we can lighten their load in some way or another, I think we have to move forward with that."
Shania Kennedy has been covering news related to health IT and analytics since 2022.