
The promise and perils of GenAI in clinical documentation

Generative AI in clinical documentation can ease EHR burdens and enhance patient communication, but issues with accuracy necessitate careful provider review.

While EHRs have enhanced data exchange for care coordination, the technology has increased clinical documentation burden.

Research from 2023 suggests that providers might spend more time on the EHR than they do directly caring for patients. On average, providers spend over 36 minutes on the EHR for every 30-minute patient visit, according to the data.

However, generative AI could change that. As defined by the Government Accountability Office, GenAI is "a technology that can create content, including text, images, audio, or video, when prompted by a user."

Spurred by the availability of chatbot interfaces like ChatGPT, health IT vendors and health systems are piloting generative AI tools to streamline clinical documentation.

While the technology has shown potential to alleviate documentation burden associated with clinician burnout, several challenges hinder its widespread adoption.

Ambient clinical intelligence

Ambient clinical intelligence uses smartphone microphones and generative AI to transcribe patient encounters in real time, providing clinicians with draft clinical documentation for review in seconds.
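Under the hood, these tools typically run a two-step pipeline: a speech-to-text model transcribes the encounter audio, and a large language model turns the transcript into a draft note for the clinician to review. The minimal Python sketch below illustrates that flow; the function names and placeholder outputs are hypothetical and do not represent any specific vendor's product or API.

```python
# Minimal, illustrative sketch of an ambient documentation pipeline.
# The transcription and drafting steps are stand-ins, not a real vendor API.

from dataclasses import dataclass

@dataclass
class DraftNote:
    text: str
    needs_clinician_review: bool = True  # A draft is never the final record.

def transcribe_encounter(audio_path: str) -> str:
    """Stand-in for a speech-to-text service transcribing the visit audio."""
    return "Patient reports three days of cough. No fever. Lungs clear."

def draft_note(transcript: str) -> DraftNote:
    """Stand-in for an LLM that summarizes the transcript into a draft note."""
    summary = f"HPI: {transcript}\nAssessment/Plan: pending clinician review."
    return DraftNote(text=summary)

def ambient_documentation(audio_path: str) -> DraftNote:
    transcript = transcribe_encounter(audio_path)
    note = draft_note(transcript)
    # The reviewing clinician, not the model, signs off on the final note.
    assert note.needs_clinician_review
    return note

if __name__ == "__main__":
    print(ambient_documentation("encounter_audio.wav").text)
```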

A 2024 study examined the adoption of ambient AI scribes across 10,000 physicians and staff within The Permanente Medical Group.

Physicians who used the ambient AI service gave positive feedback, citing the technology's ability to facilitate more impactful patient conversations. They also reported reduced after-hours EHR documentation.

While the promise of ambient AI for EHR documentation is substantial, ensuring a provider reviews all clinical documentation drafts for accuracy is critical for patient safety.

A 2023 study found that the technology might fall short in documenting non-lexical conversational sounds (NLCSes) -- like mm-hm and uh-uh -- that patients and providers use to convey information.

For instance, a patient might say, "Mm-hm," to indicate yes in response to the question, "Do you have any allergies to antibiotics?"

The researchers evaluated the performance of two clinical ambient AI tools across 36 primary care encounters.

While the tools had a word error rate of about 12% across all words, the error rate for NLCSes ranged from 40% to 57%. For NLCSes that conveyed clinically relevant information, the error rates were even higher: 94.7% and 98.7%.

"Some of these NLCSes were used to communicate clinically relevant information that, if not properly captured, could result in inaccuracies in clinical documentation and possibly adverse patient safety events," the researchers emphasized.

Patient communication

Patient portals have long been a key channel for patient-provider communication, and since the COVID-19 pandemic, portal messaging has skyrocketed. A 2023 report in JAMIA suggests that patient portal message volume has surged 157% from pre-pandemic levels.

To help providers manage their inboxes, health IT vendors and health systems are piloting generative AI for patient communication.

The technology produces draft responses to patient portal messages in seconds, which clinicians review and edit before sending them to patients. As with ambient clinical documentation, early use of GenAI for patient communication has shown both successes and challenges.
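Conceptually, this is a human-in-the-loop gate: the model drafts a reply, and nothing reaches the patient until a clinician has reviewed and approved the text. The minimal Python sketch below illustrates the idea; the function names and draft text are hypothetical and do not reflect any specific vendor's workflow or API.

```python
# Illustrative sketch: an AI-drafted portal reply is held until a clinician
# reviews, edits and approves it. All names here are hypothetical.

def generate_draft_reply(patient_message: str) -> str:
    """Stand-in for the LLM call that drafts a response to the patient."""
    return "Thanks for reaching out. Based on what you describe, ..."

def send_reply(patient_message: str, clinician_review) -> None:
    draft = generate_draft_reply(patient_message)
    final_text, approved = clinician_review(draft)  # Clinician edits and approves.
    if not approved:
        raise RuntimeError("A reply cannot be sent without clinician approval.")
    print(f"Sending to patient:\n{final_text}")

# Example: the clinician appends advice and approves the edited draft.
send_reply(
    "Is it safe to take ibuprofen with my blood pressure medication?",
    clinician_review=lambda draft: (draft + " Please call us if symptoms worsen.", True),
)
```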

In a 2024 study, primary care providers (PCPs) rated generative AI responses to patient messages higher for communication style and empathy than provider-authored responses.

However, generative AI responses were longer, more linguistically complex and less readable, posing challenges for patients with lower health or English literacy.

Using generative AI to draft patient portal messages could also unknowingly impact clinical decision-making, according to a 2024 simulation study.

Researchers found that the content of replies to patient messages changed when physicians used large language model assistance, suggesting an automation bias that could impact downstream patient outcomes.

The study findings also underscored the importance of physician review of AI draft messages.

While most drafts were acceptable and posed minimal patient safety risk, a small share, if sent unedited, could have led to severe harm or death because they misjudged the acuity of the patient's situation.

Further, while generative AI can help decrease EHR tasks, it might not lead to a net decrease in time spent on the EHR.

A recent study found that providers who used AI draft replies to patient portal messages reported decreased task load and emotional exhaustion. However, there were no changes in overall message reply time, read time or write time when comparing pre-pilot and pilot periods.

The study authors suggested that switching from writing to editing message replies might be less mentally burdensome despite taking the same amount of time.

Discharge summaries

Generative AI has also shown potential to make patient discharge summaries more patient-friendly.

In a study published in JAMA Network Open, researchers found that generative AI could effectively lower discharge summaries from an eleventh-grade reading level to a sixth-grade reading level.

Most experts agree that patient education materials should be written at a sixth-grade reading level to accommodate varying levels of health literacy.
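Reading level in studies like this one is typically estimated with a readability formula such as the Flesch-Kincaid grade level, which combines average sentence length with average syllables per word. The sketch below is a rough illustration of that formula; its syllable counter is a crude heuristic, not the validated instrument a study would use.

```python
# Rough sketch of the Flesch-Kincaid grade level formula:
#   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
# The syllable counter is a simple vowel-group heuristic for illustration only.

import re

def count_syllables(word: str) -> int:
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

clinical = ("The patient was admitted with community-acquired pneumonia and "
            "treated with intravenous antibiotics; oxygen saturation normalized "
            "prior to discharge.")
plain = ("You came in with a lung infection. We gave you antibiotics. "
         "Your oxygen levels got back to normal before you went home.")
print(flesch_kincaid_grade(clinical) > flesch_kincaid_grade(plain))  # True
```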

Two physicians reviewed each patient-friendly discharge summary for accuracy on a 6-point scale, with 54 of 100 reviews giving the best possible rating of six. Summaries were rated entirely complete in 56 of 100 reviews. However, 18 reviews noted safety concerns involving omissions and inaccuracies.

The findings suggest that generative AI can help translate discharge summaries into patient-friendly language to improve readability. However, the authors emphasized the need for physician review.

"Implementation will require improvements in accuracy, completeness and safety," the authors wrote. "Given the safety concerns, initial implementation will require physician review."

Generative AI shows promise in reducing clinical documentation burden and improving patient communications. However, challenges like accurately capturing non-verbal cues and maintaining document accuracy highlight the need for careful provider review.

As generative AI continues to evolve, balancing its benefits with the need for provider oversight will be essential for safe and effective implementation in healthcare.

Hannah Nelson has been covering news related to health information technology and health data interoperability since 2020.
