
Building a generative AI-ready healthcare workforce

Generative AI is taking various U.S. industries by storm, but how can healthcare organizations ensure that their workers are properly prepared for the technology's proliferation?

AI is transforming the way that industries do business, and healthcare is no exception. But with the rise of generative AI, enterprises are forced to navigate yet another technology hype cycle, in which selecting practical use cases for these tools and deploying them effectively is a major priority.

Healthcare isn't immune to the GenAI hype, with studies and pilot projects demonstrating the technology's potential value in the realms of clinical documentation, revenue cycle management and EHR workflow improvement.

But, as with any novel technology, adopting generative AI in healthcare requires stakeholders to not only identify use cases and deployment strategies, but also address hurdles along the way, from building patient trust to tackling integration concerns across healthcare sectors.

As industry leaders continue to prioritize AI in their push to bolster digital transformation, communicating with their workforce about the pros and cons of AI use in healthcare is paramount.

However, this presents a unique challenge in the context of emerging technologies like GenAI, as the relative newness of these tools necessitates educating the healthcare workforce about the technology's basics -- including what it is, how it works and best practices -- alongside its potential benefits in clinical and administrative settings.

In this episode of Healthcare Strategies, Melissa Knuth, vice president of planning at OSF HealthCare, describes how the health system overcame those challenges for its workforce by creating mandatory ongoing education around generative AI.

Shania Kennedy has been covering news related to health IT and analytics since 2022.

Transcript - Building a generative AI-ready healthcare workforce

Melissa Knuth: We were given six weeks to develop generative AI education for 24,000 employees. Not to stifle the use of those tools, but to give some guardrails for how they would use those tools internally.

Shania Kennedy: Hello, and welcome to Healthcare Strategies. I'm Shania Kennedy, assistant editor of Healthtech Analytics. Today, we're going to chat about generative AI.

If you spend any time in the analytics side of healthcare, you already know that AI tools are being touted as potential game changers in clinical decision support, medical imaging, EHR workflows and more. But when we're talking about deploying these tools, it's important to focus on how healthcare organizations are preparing their workforce to deal with the technology influx.

OSF HealthCare recently rose to this challenge by developing mandatory ongoing education for its employees in order to help them learn more about the benefits of using generative AI. To tell us more, we have Melissa Knuth, vice president of planning at OSF HealthCare, on the show today. Melissa, thanks for coming on to Healthcare Strategies.

Knuth: Thanks for having us, Shania. We're really proud of the work we've been doing here, so I'm grateful for the opportunity to share it.

Kennedy: Absolutely. When I heard about this, I was excited, and I was interested, just because when we're talking about tools like generative AI, the hype is sort of around, 'How can we improve patient experience?' and 'How can we empower our healthcare workforce?' -- especially our clinicians who are already dealing with burnout and pajama time and all of these other sort of issues related to documentation and workflows. This initiative is aiming for that improvement in the patient experience and empowering the healthcare workforce. I was wondering if we could start by having you walk us through how you came to the realization that you needed to create these educational materials around AI for your workforce, and what did that look like?

Knuth: To begin with, so people have context, we are a Catholic healthcare system headquartered in central Illinois. We were founded by the Sisters of the Third Order of St. Francis, so we are a nonprofit healthcare system. To give you an idea of [the] size of our healthcare system, we have 16 hospitals, about 24,000 employees -- we call them mission partners -- and 600 physicians. So, it just gives people a little bit of context and magnitude of the challenge. The other thing I like to share with people is, I consider OSF HealthCare to be one of the most progressive Catholic healthcare systems in the country. We are innovators. We just celebrated 10 years of OSF Innovation and our innovation center. So, we have actively been participating in the transformation of healthcare for many years. So, when generative AI came about, it was a natural extension of what we were doing.

We are facing the same challenges that almost every healthcare system across the country is, especially nonprofit healthcare. The pace of change in healthcare has greatly accelerated, and so, keeping up with the pace of change is always a challenge. We run on thin operating margins, especially in nonprofit healthcare, and so, we have to be as productive and efficient as possible.

There's a lot of talk about the promise of generative AI and helping healthcare systems like ours transform, so we knew this is just a natural extension of the work we were already doing and sort of bought in -- pretty quickly actually -- when the generative AI tools came on the scene.

Kennedy: When I was reading the press release, it said you did this in six weeks, which is quite an undertaking -- especially, like you said, for a health system of this size and magnitude. And I'm sure, alongside the challenges that you were already facing, doing that in that amount of time [was] a tall order.

Knuth: Yeah, we did the actual development of the education in a six-week timeframe. So, there were a couple of activities we had done prior, though, in fairness. We had brought in an industry expert, who was also an expert in generative AI and in the healthcare field, and had them talk to over 200 of our leaders internally about generative AI and help them understand an example portfolio of use cases where generative AI can be really beneficial for a healthcare system like ours. We had done that.

We did some alignment with our leaders, our board and with the C-suite of the organization, and then we had put together a generative AI work group. So, we had committed from the top down that we were going to be putting together our own portfolio of use cases for generative AI. And one of the things I think we did that was really smart, honestly, is we put two physician innovators in charge of that work group.

We had Dr. Jonathan Handler, who is an innovation senior fellow with us, with years of experience, and then, also, Dr. Tyler Fitch, who is our CMIO -- very heavy in clinical informatics. He's a hospitalist, but also very technically savvy.

So, they put together a generative AI work group, and then I was asked to lead the education work stream underneath that work group. You introduced me as the vice president of planning -- that is my role, [and] this is one of those other duties that we get in an organization, and we were given six weeks to develop education for 24,000 employees, our mission partners.

The reason behind that was that these tools were proliferating. They were becoming more and more publicly available, and so we wanted to make sure not to stifle the use of those tools because we want everyone to really understand how it can augment their role, but to help give some guidelines and guardrails for how they would use those tools internally.

The other challenge we had was that with 24,000 mission partners, we had to raise the level of education, or level set the education, for everybody -- from our patient transporters to surgeons. So we had a lot of conversations about 'What do we do about that? Do we create different levels of education and multiple packets of curriculum?' And what we had decided was that generative AI was new enough on the scene that we were going to treat everyone the same and use this as an opportunity to level set on AI literacy, but also raise all boats so that they had this foundational understanding of generative AI that we could build on going forward.

Kennedy: That was something, when I was reading about this, that I was wondering about in the back of my mind, because of course, when you're trying to educate everyone from the CEO to the cafeteria workers, that's got to be challenging. With 24,000 people, even just on scale alone, making that generalizable and high value is difficult -- and obviously they have enough on their plates already. You really want something that packs a punch. How do you sort of overcome those challenges? How do you make it generalizable? Of course, I guess there are certain best practices like you mentioned that you can focus on, but it just seems like a huge undertaking no matter what.

Knuth: It definitely was a huge undertaking. We did a couple of things. One, we had to think about the context, the environment to which we were introducing this education. We started on this a year ago. Honestly, we weren't that far out of the pandemic at that time. So, we still had clinicians who were facing burnout, especially due to the administrative burden of clinical documentation. So, we knew it was really important that we pay attention to this, focus on this and get people using these tools in the right way. One of the other things that really drove [this] is we have a staffing shortage in healthcare that is expected to persist for at least another decade. We're challenged by that every single day. This is a way to augment people's roles, so we took the AI literacy approach.

We did start with the basics, [like] 'What is generative AI, and how is that different than AI?' -- which we have been talking about and doing inside our organization for many years -- so they could understand the difference and that the promise of generative AI is really so much greater than what we were looking at with just AI alone, but we're using both. We also wanted to focus on 'What are the limitations of generative AI?' There's bias in the results, and you, as the person using it, have to recognize that you could potentially get a biased result. Always keeping that human in the loop and helping people understand it's sort of a 'trust, but verify' strategy -- these tools can help accelerate progress on whatever you're working on, but you really are accountable for validating the results you receive and making sure they're appropriate for the context.

You want to use that information, but because we're a healthcare system, we have to protect patient health information at all costs.

We wanted to make sure people knew you cannot copy and paste patient health information into these tools unless the tool has been reviewed and approved for that purpose by OSF. So, we gave them some of those core guardrails, but we didn't stop with the restrictions and the guidelines -- we tried to make it fun for them at the same time and give them real use cases, potential use cases, things they could get excited about.

Being able to diminish the documentation burden is probably one of the biggest ones for clinicians, but there are so many other use cases. So, we gave them examples of those things. The other thing we tried to do -- because it is daunting to try and get to 24,000 mission partners, and we are a society that has sort of lost attention span over time -- is take some lessons from social media and really think about how we could do this work TikTok style, where we do small vignettes of education no longer than one to two minutes and make it mixed media.

So, it's not just all videos or all text; it's some interactive visuals that can help test comprehension and some videos that give them the core messages we needed them to understand from the learning. And honestly, it worked really well.

We did a very brief survey at the end of the education, asked them just a couple of questions, and we did have 80% of the organization complete this mandatory education. So, about 19,000 of the 24,000 mission partners completed it across all levels. And one of the things we asked them was, 'Did this help enhance your knowledge of the subject matter?' And 75% of those who took it said it did. We also asked them, 'Was the information we provided relevant to your role?' And 65% of them said it was.

We consider that a huge success when you're trying to raise all boats for 24,000 people on a new technology subject they're really not that familiar with yet. It was fun and challenging because we did have to do that in six weeks because they felt it was important -- people were using these tools, and we needed to get these messages out, but I feel like it was very successful.

Kennedy: An 80% response rate, completion rate -- anybody who's in statistics or does surveys knows that's really good, especially rolling out something like this. [Generative AI] is new, and this is a gray area for a lot of health systems, which is why I think the approach that you've taken, taking lessons from social media and other places, is really innovative. But I think this just goes to show that there are things that you can do depending on what resources you have. And, as you mentioned, putting those guardrails in place is really important. You guys are working with partners to help you do that. That's an important piece of this conversation as well, because if you don't have expertise on your own, you don't have an informatics team, you might look for a third party or outside input to help you with that. I was wondering if you could talk a little bit about how you forged those partnerships, and how they've helped bolster the work that OSF was already doing around AI.

Knuth: We are a member of AVIA Health. That is one of our partners. They're an organization that helps healthcare systems with digital transformation, which we've been working on for a number of years. So again, this kind of fit in really well in that strategy.

One of the things they really focus on is bringing together, in a collaborative manner, healthcare systems that are trying to address the same types of problems. We ended up finding out that we really were kind of ahead of the other organizations that were part of this collaborative in thinking about educating our workforce. So, we collaborated with over 30 other healthcare systems through this partnership, and ended up really being a key presenter for them on how we approached our workforce education around generative AI, and we have done several calls -- like, consult calls -- with some of these healthcare systems to help teach them what we did so that they can put in place some of those same approaches at their organization.

Generative AI can be scary to people, and one of the things I didn't mention that I think was really critical in the way we approached this is, we decided in the beginning to use generative AI tools to create the education. It was the only way I was going to get this done in six weeks. So, every visual that was included in our education and all of the videos were done with generative AI tools, and we told people that when they were taking the education. At the end of each lesson, it would say, 'All of the visuals and the videos that you just reviewed were created with generative AI tools,' so that they are starting to get an understanding of the power of what generative AI can do.

Kennedy: And of course, there's a transparency aspect as well. Anytime anyone's using a generative AI tool, in the healthcare context, specifically -- because some electronic health record vendors are integrating it into their platforms and their workflows -- there's that question of, 'Okay, how do we be transparent? How do we inform people that they're engaging with this technology?' And this is really interesting because you're building that into the system. You're building that transparency. You're saying, 'Hey, we used this technology to create these materials.' So, that creates an expectation to prioritize that transparency going forward. Of course, that also makes people more aware of how they engage with the technology, even just personally -- thinking about, 'Okay, where is this pulling information from? How am I using it?' And it prompts those questions, which I think is a really important part of any educational tool: Does it encourage you to ask your own questions? I'm looking forward to seeing, in the future, how it goes for you guys and how it changes and evolves, because of course, AI is going to change very quickly.

Knuth: There are a couple of other things that were interesting in our approach too. So we had done this generative AI 101, the AI literacy approach, first, and then we decided as our next step, we wanted to meet people where they were at in the tool sets that they were using.

And so, we partnered with Microsoft on that because I wanted to teach people how to do prompt engineering, how to write a good prompt, which is at the core of all of this, right? If you can write a good prompt, you can get out of the system what you need. So, that was the next thing we did. We taught them to use Microsoft Copilot, and we did that by creating about 35 examples of good prompts that were tried and practiced, and then we developed an approach using Microsoft Power Apps to do crowdsourcing with that inside the organization.

So, we pushed out those example prompts for all of the Microsoft product suite so they could see something that worked. They could copy and paste it into Microsoft Copilot, see what it did, then change it for how they needed it to work and be able to learn through that process. And the crowdsourcing part is open now -- these Power Apps let other people, as they write good prompts, submit them to share across the organization as well. So, that was kind of the next place we went with that. And then, one of the things you had mentioned earlier, as well, the next place we're expecting to go is to really put together a visual. We share all of our strategy with the organization using visual art that has been created with generative AI tools now.

And so, giving them an artistic visual of our healthcare system, where they can go into hotspots on that and see -- How are we using generative AI in our hospitals? How are we using it inside digital health? How are we using it in the medical group? -- and be able to give them real examples so that they understand how this is being used, where it's being used and the power behind it.

We're trying to always get people excited, because it can be scary, and sometimes, people feel like, 'Oh, I should be afraid of this because it's going to replace me, potentially.' So, we did make a concerted effort to make sure there were messages in that first set of education that said, 'We are not using this tool set in an attempt to replace your position, but to augment your position and help you really be able to focus on those more value-added tasks that you're needed for.'

Kennedy: That's something that I hear about all the time, people talking about not only the concern of being replaced, but also the concern about how to reassure people that they won't be, because of course these AI tools are wonderful, but they cannot replace the human aspect of healthcare, because the human aspect is the most important one. But of course, with anything new and with anything as big as AI, there are always going to be those questions, so it's always good to know that the people who are implementing it and sort of handing it down are prioritizing that and saying, 'Hey, we want to use this to help you. We don't want to use it to replace you.'

That's all the time we have for today. Thank you again, Melissa, for chatting with us. I think this topic is really interesting, and I'm excited to see how it goes and how other health systems approach it, because I think it's a really important part of how we proceed with AI.

Knuth: Yeah, it's exciting times.

Kelsey Waddill: And thank you, listener, for tuning in. If you liked what you heard, head over to Spotify or Apple and drop us a review. We'll be choosing some of our reviews to be read on the show in appreciation. So, keep listening through to the end because you might get name-dropped. See you next time.

Music by Vice President of Editorial, Kyle Murphy, and production by me, Kelsey Waddill. This is a TechTarget production.


