Early lessons on SDOH data integration across health plans
A new NCQA study found that while SDOH data collection is feasible, health plans face various challenges ranging from siloed data systems to inconsistent coding.
The pursuit of health equity relies on social determinants of health (SDOH) data, with research suggesting social factors may account for up to 50% of county-level variation in health outcomes.
Several federal programs have introduced quality measures to capture SDOH data and address social needs. However, effectively aligning with these requirements remains a work in progress across the care continuum.
The National Committee for Quality Assurance (NCQA) has been at the forefront of SDOH quality measures, introducing the Social Need Screening and Intervention (SNS-E) HEDIS measure to address unmet needs in areas like food, housing and transportation. Recently, NCQA expanded the measure's scope to focus on two emerging domains: utility insecurity and social connection.
A recent NCQA study published in Health Affairs Scholar assesses the feasibility of collecting and reporting on these new SDOH domains. Through interviews with eight health plans, the researchers found that while collecting SDOH data is feasible, interoperability barriers, such as siloed data systems, inconsistent coding and issues with data formatting, storage and mapping, persist.
In this episode of Healthcare Strategies, Rachel Harrington, PhD, assistant vice president of Health Equity Sciences at NCQA, and Adrianna Nava, PhD, MPA, an applied research scientist at NCQA, dive deeper into the study’s findings. They discuss current approaches to social needs screening, the complexities of SDOH data collection and the importance of collaboration in creating SDOH data standards.
Hannah Nelson has been covering news related to health information technology and health data interoperability since 2020.
Rachel Harrington: There's a little bit of the sense of the field trying to build the plane while they're flying it, in terms of what data exchange looks like, what tools are being used, how it's being stored.
Adrianna Nava: We're excited to brainstorm and share new ways to integrate those codes and make sure we capture all the work that clinicians, healthcare delivery systems and health plans are doing to address social needs.
Hannah Nelson: Hello and welcome to Healthcare Strategies. My name is Hannah Nelson, assistant editor of Health IT and EHR. Research shows that social determinants of health can drive as much as 80% of health outcomes. However, standardized documentation of social needs is lacking, limiting efforts to improve population health. Joining us today are Rachel Harrington, assistant vice president of Health Equity Sciences at NCQA, and Adrianna Nava, applied research scientist, Health Equity Sciences at NCQA, to discuss a new study published in Health Affairs Scholar that explores early lessons from integrating social determinants of health data into practice.
Rachel and Adrianna, thank you for joining us.
Adrianna Nava: Nice to be here. Thank you.
Rachel Harrington: Thank you.
Hannah Nelson: Now just diving right into it, what motivated the study that you guys have recently published?
Adrianna Nava: For those who are not aware, in HEDIS, which is our quality improvement tool that health plans use to measure performance, we recently developed our Social Need Screening and Intervention measure. In that measure, we assess for unmet food, housing and transportation needs, and whether an intervention was given within 30 days to someone who screened positive.
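To make that 30-day intervention window concrete, here is a minimal sketch of the measure logic in Python. The record fields and simplified data are assumptions for illustration only; the actual HEDIS SNS-E specification defines eligible populations, value sets and exclusions in much more detail.

```python
from datetime import date, timedelta

# Illustrative records only; the real HEDIS SNS-E specification defines
# eligible populations, value sets and exclusions far more precisely.
screenings = [
    {"member_id": "M001", "domain": "food",    "positive": True,  "date": date(2024, 3, 1)},
    {"member_id": "M002", "domain": "housing", "positive": False, "date": date(2024, 3, 5)},
]
interventions = [
    {"member_id": "M001", "domain": "food", "date": date(2024, 3, 20)},
]

def met_intervention_window(screen, interventions, window_days=30):
    """Return True if a positive screen was followed by a same-domain
    intervention within the window (a simplification of the measure logic)."""
    if not screen["positive"]:
        return None  # negative screens are not in the intervention indicator
    deadline = screen["date"] + timedelta(days=window_days)
    return any(
        i["member_id"] == screen["member_id"]
        and i["domain"] == screen["domain"]
        and screen["date"] <= i["date"] <= deadline
        for i in interventions
    )

for s in screenings:
    print(s["member_id"], s["domain"], met_intervention_window(s, interventions))
```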
In developing this measure and having it included in HEDIS, we noticed that there were other measures in the field with additional domains that ours didn't take into account at the time. For example, utility insecurity was one of those domains. There has also been a lot of talk about the importance of social connection, which encompasses loneliness, social isolation and inadequate social support, and about what that would look like as a measure.
What we've done at NCQA is some pilot testing, which you can read about in our study, looking at the feasibility of collecting data on the new domains of utility insecurity and social connection within our current measure space for social determinants of health.
One major finding was the lack of standardization in the way that information was collected. The first piece of that is in our first indicator, which looks at social needs screening. We've noticed that there's a variety of ways that individuals, plans or providers have been collecting that information. NCQA recommends using evidence-based screening tools aligned with Gravity Project terminology. But we did notice that, even though this is recommended, some delivery systems are creating their own screening tools. We've also noticed that some states are developing their own tools or using tools that are not on our list of validated tools, or that lack the associated terminology needed to collect that information.
There is a great opportunity for the larger SDOH field to home in on which tools we want to include in measurement, while also making sure they have the terminology to capture that information so it can flow across EHRs and case management files as digital information that can be transferred.
Rachel Harrington: Yeah. What I think people don't always realize about measure development is that measurement is seen as this very quantitative science. But half of the work that goes into it is the qualitative work of trying to understand why things are happening, and how things are happening, and how they're not happening. This paper, I think, really highlights that qualitative background that shapes so much of what we do and what the measure is trying to achieve.
Hannah Nelson: Great. Based on that, what did you find in the study? Passing it off to you, Adrianna, what were the study's main findings?
Adrianna Nava: One of the things that we thought was really interesting, looking a little more broadly than quality measurement and which data elements are out there, was the role that policy plays in helping to build that standardization process. It really relies on policy, whether state-level or national, to help build the infrastructure. As we continue on this digital journey, we need to stay in sync with what data elements are currently available and what type of infrastructure is needed in the future to make sure that data transfer is seamless.
Hannah Nelson: With all these different tools and standards being used, based on your findings, what are the implications for social determinants of health data collection from here?
Rachel Harrington: I don't know, Adrianna, if you'd agree. There's a little bit of the sense of the field trying to build the plane while they're flying it, in terms of what data exchange looks like, what tools are being used, how it's being stored. To the point that you just made, there are, I don't know how to continue the analogy here, there are blueprints out there. The Gravity Project is a blueprint for how to do this. Trying to figure out how people are navigating this in such a dynamic environment was one of the things that stood out to me about this work.
Hannah Nelson: Were there any findings from the study that stood out to you, or something that was particularly surprising or unexpected?
Adrianna Nava: I think what was surprising was the high variability in the number of plans that were able to map to the current codes we were requesting in our measure versus those that couldn't. There's definitely an opportunity to learn what the barriers are beyond the data challenges, such as whether there are challenges with screening members, maybe members declining the screening, which our measure at this point doesn't capture. There's a lot of opportunity to see what other barriers or challenges are prohibiting health plans, in particular, from collecting this necessary data.
It was good to see that there are health plans that have been able to map to LOINC codes or pull LOINC codes from EHRs. That was really great to see as well.
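As a rough illustration of what pulling LOINC-coded screenings from an EHR can look like, the sketch below queries a hypothetical FHIR R4 server for SDOH screening observations. The server URL and the specific LOINC codes are assumptions for illustration; a real implementation would follow the plan's own data pipeline and the Gravity Project value sets.

```python
import requests

# Hypothetical FHIR R4 endpoint; replace with the actual server in practice.
FHIR_BASE = "https://example-fhir-server.org/fhir"

# Illustrative LOINC codes for food-insecurity screening items
# (verify against Gravity Project / HEDIS value sets before use).
FOOD_INSECURITY_LOINC = ["88122-7", "88123-5"]

def fetch_sdoh_screenings(patient_id: str) -> list[dict]:
    """Search Observation resources for SDOH screening results coded in LOINC."""
    params = {
        "patient": patient_id,
        "code": ",".join(f"http://loinc.org|{code}" for code in FOOD_INSECURITY_LOINC),
        "category": "survey",
    }
    resp = requests.get(f"{FHIR_BASE}/Observation", params=params, timeout=30)
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for obs in fetch_sdoh_screenings("example-patient-id"):
        code = obs["code"]["coding"][0]["code"]
        value = obs.get("valueCodeableConcept", {}).get("text", "n/a")
        print(code, value)
```

In practice, plans may receive this information through case management files or other feeds rather than live FHIR queries, which is part of the formatting and mapping challenge the study describes.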
Rachel Harrington: For me, I'm going to look on the positive side here. It was exciting to see the engagement from plans across all product lines. Typically, for Medicaid and Medicare plans, there are policy levers in place that help those plans focus on this. But there were commercial plans that were part of this study too, and they were able to really engage and discuss their work here.
When you look at the findings (there's a table in the paper that summarizes the different plans, their ability to capture screenings and interventions, and how they're coding them), the screenings were challenging. But most, if not all, of the plans did have some sort of systematic capture for interventions. And while there are challenges to focus on here, there are also elements where you think, "Oh, there's progress." Because I don't think if you had done this four or five years ago, you would have seen this level of broad engagement across plan types.
Hannah Nelson: That's great. I love hearing that, looking on the bright side, there has been progress in this field. Really good to focus on that at the end of the day.
Now not to bring us back to the challenges, but in terms of the barriers that exist, the multiple systems and different standards and formatting challenges, how do these challenges impact health plans' ability to collect and utilize this kind of social determinants of health data effectively?
Adrianna Nava: Yeah. It was really interesting to hear about health plans having to navigate among different data systems and knowing where to find specific codes, especially since there are newer codes that we're introducing to the field. Being able to navigate, "Where do I find the LOINC code? Or where do I find the SNOMED or the CPT? Oh, I have these Z codes that you're not asking for, but they're being documented."
Rachel Harrington: An entire alphabet of codes.
Adrianna Nava: Yeah. "What do we do with those? Those are important as well." It was just really interesting to hear. And honestly, it was positive. The health plans that were part of the study were excited to be doing this work and to be a part of it. We're hopeful that this will help shine a light on the work that health plans are doing to try to collect this information on their members.
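To show why navigating that alphabet of codes gets complicated, here is a toy crosswalk for a single domain, keyed by code system. The specific codes are illustrative and should be verified against the Gravity Project value sets and current code releases before any real use.

```python
# Toy crosswalk for the food-insecurity domain, keyed by code system.
# Codes are illustrative placeholders; confirm against Gravity Project
# value sets and current code releases before using them for measurement.
FOOD_INSECURITY_CROSSWALK = {
    "LOINC":     {"88124-3": "Food insecurity risk (screening result)"},
    "ICD-10-CM": {"Z59.41": "Food insecurity (Z code documented on a claim)"},
    "SNOMED CT": {"733423003": "Food insecurity (finding)"},
}

def systems_documenting(domain_crosswalk: dict, observed_codes: set[str]) -> list[str]:
    """Return which code systems in the crosswalk appear in a member's data."""
    return [
        system
        for system, codes in domain_crosswalk.items()
        if observed_codes & set(codes)
    ]

# A member whose record mixes an EHR observation and a claims Z code.
observed = {"88124-3", "Z59.41"}
print(systems_documenting(FOOD_INSECURITY_CROSSWALK, observed))
# ['LOINC', 'ICD-10-CM']
```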
Hannah Nelson: That's wonderful. Yeah, knowing that there's that general excitement and commitment to collecting this data, and that awareness of how important it is, is definitely something to highlight. That's great.
Rachel Harrington: It's interesting too, because I think a lot of the work done in the space so far has been done on the clinician, provider, health system side. This is, as far as I am aware, one of the first looks at the health plan side of things. It lets you put the picture together. There's almost an empathy in this, because if you imagine that health plans are dealing with all of these different approaches, and then they're trying to work with all of their contracted providers and systems who themselves are contracted with multiple plans, you can see where this all gets so confusing and challenging in practice.
Hannah Nelson: Totally. Yeah, that's so true. I guess based on the research that you guys have done, what are the next steps for health plans and policymakers, in terms of making social determinants of health data collection more feasible and more usable in practice?
Adrianna Nava: One of the things that we've been working on at NCQA is looking at new codes or terminology available to capture SDOH screening, but also potentially some interventions. That comes specifically from feedback we've gotten from our field-testing sites about the Z codes and how we might introduce those into our measure, for example. We also know that with the 2024 physician fee schedule, additional HCPCS or G codes were introduced. Going forward, we're excited to brainstorm and share new ways to integrate those codes and make sure we capture all the work that clinicians, healthcare delivery systems and health plans are doing to address social needs.
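As a loose sketch of what integrating those newer codes could look like on the health plan side, the function below buckets codes found on a member's record into screening-related and need-related evidence. The code lists are assumptions for illustration, not the measure's actual value sets.

```python
# Illustrative code lists only; the measure's real value sets are maintained
# by NCQA and should be taken from the published specifications.
SCREENING_CODES = {
    "G0136",    # HCPCS SDOH risk assessment code from the CY2024 physician fee schedule
    "88124-3",  # example LOINC screening-result code
}
NEED_CODES = {
    "Z59.41",   # example ICD-10-CM Z code documenting an identified social need
}

def classify_evidence(member_codes: set[str]) -> dict[str, bool]:
    """Flag whether a member's documented codes show screening activity
    and/or a documented need (a simplified illustration, not HEDIS logic)."""
    return {
        "screening_documented": bool(member_codes & SCREENING_CODES),
        "need_documented": bool(member_codes & NEED_CODES),
    }

print(classify_evidence({"G0136", "Z59.41"}))
# {'screening_documented': True, 'need_documented': True}
```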
Hannah Nelson: Yeah. Now, jumping off of that, what kind of future research or pilot projects would either of you or both of you like to see in this space?
Rachel Harrington: I think one of the things that stands out for me is the environment, the networked environment here, with all of the different contributors. We've talked about the clinicians, the systems and the plans, which were the focus of this study. You also have the community-based organizations, many of whom are being tasked either to do the screenings or to provide the interventions that these sorts of measures and all of these codes we're talking about are documenting. I think for us, it's really leaning in on trying to map all of those contributors and understand how the barriers and facilitators that Adrianna and the team found in this study compare between them.
Ideally, you find a common barrier that you can solve that makes everybody's life easier. But chances are, there are unique elements there. How do we make sure that, when we're solving the problems for the health plan, we're not making more problems for the community partner? That is something we'll definitely be hoping to look into more as we go forward with this work.
Adrianna Nava: I'd just like to add that, as we continue to standardize the way we collect this data, we're hopeful that when we look at social needs, we'll be able to tie them to specific health outcomes and take these measures even further, and see how they're contributing to maternal health or other areas of interest at NCQA. That's why it's so important right now, with this research study, to work on collecting this information in a standardized way, while also making sure that information is digital and can be transmitted across the healthcare system.
I wanted to highlight this as well. We noticed that health information exchanges are not used as much for the transfer of information. There's a great opportunity there to work with states and public health infrastructure to be able to see how we can move the needle forward in collecting this essential information across populations.
Hannah Nelson: Definitely. Yeah, thank you so much for adding that, Adrianna. That's very important and definitely something that we need to keep in mind.
Looking towards the future again, how soon do you think health plans will feasibly be able to implement these measures to really improve population health?
Rachel Harrington: A mixed bag.
Hannah Nelson: Yeah.
Rachel Harrington: One of the things I think we found in this study is that many of them are already implementing this work. It's just the documentation that's lagging behind. But there's a reason we measure and we ask for the documentation: without that documentation, that transparency, you don't know if the interventions and screenings are making it to all the populations who need them. And to Adrianna's point, you don't know if they're working. Are they impacting outcomes? It's core quality measurement 101: you can't act on what you can't see.
It's really about taking the work that's already being done, getting it streamlined in a way that can be documented so that it can be evaluated, and seeing where it goes from there. But plans are already doing it. Medicare already has measures that are required at various levels of care for work like this. You're seeing CMMI models that have been doing this work; the Accountable Health Communities Model recently had some evaluations come out that showed really positive impacts. It's, I think, just helping different parts of the field catch up with each other. I don't know, Adrianna, your sense, but hopefully we're not too far away.
Adrianna Nava: Yeah, I hope not. Like you said, the work is being done, so when we're looking for field-testing partners, there's a lot of enthusiasm for participating and sharing the great work they've been able to do, and for being partners in identifying solutions to some of the challenges that we mentioned. That's really exciting to be a part of.
Rachel Harrington: And frankly, it makes business sense. A lot of these organizations are doing this work or started doing this work before there was ever any measurement or requirement to do it. Again, it goes back to quality 101. If you want to get the most out of your interventions, you need to make sure that you're addressing all of the root causes, the unmet needs. If you invest $1 million in diabetes and you're not paying attention to food, housing or transportation, you're not going to get $1 million of outcomes. I think there's an internal motivation here, too.
Hannah Nelson: That definitely makes sense. Wow. Thank you guys so much for coming on the podcast and joining me today. This has been a great discussion.
Rachel Harrington: Yeah.
Adrianna Nava: No, thank you. It was a pleasure.
Rachel Harrington: Thank you.
Hannah Nelson: To all of our listeners, thanks for joining us. If you're interested in learning more about this topic, check out our site dedicated to health IT at techtarget.com/searchhealthit. If you have thoughts on this topic or if you have any healthcare-related stories that you'd like us to consider for coverage, you can reach out to me at [email protected]. That's [email protected]. Follow us on Spotify to get more of these conversations, and let us know what you think by rating and reviewing the show. See you next time!
Kelsey Waddill: Music by Kyle Murphy, and production by me, Kelsey Waddill.
This is a TechTarget production.