Getting Smart on SDOH: A Q&A with Dr. John Showalter, Chief Product Officer at Jvion

This Q&A is based on a recent webinar co-hosted by Dr. Showalter and Amy Fellows, MPH, a Senior Advisor at Jvion partner Pivot Point Consulting. Watch the full webinar here, and be sure to read Amy’s guest blog post on how social determinants of health (SDOH) drive inequities in healthcare outcomes.

Q: How can artificial intelligence (AI) help health systems take action on health inequities?

Dr. Showalter: AI represents a possible breakthrough in managing SDOH, and by extension, addressing health inequities. Machine learning makes it possible to correlate petabytes of data on SDOH with historical health outcomes, and from there, predict an individual patient’s risk for adverse health outcomes based on their own SDOH data.
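To make this concrete, here is a minimal, hypothetical sketch of the general approach Dr. Showalter describes: training a model on SDOH and demographic features so that each patient gets a predicted risk of an adverse outcome. The feature names, synthetic data, and scikit-learn model below are illustrative assumptions for this post, not a description of Jvion's actual features, data, or pipeline.

```python
# Illustrative sketch only: synthetic data, hypothetical feature names.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic SDOH and demographic features (all names are hypothetical).
X = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "housing_instability": rng.integers(0, 2, n),
    "food_insecurity": rng.integers(0, 2, n),
    "transportation_barrier": rng.integers(0, 2, n),
    "neighborhood_deprivation_index": rng.random(n),
})

# Synthetic label: 1 = adverse utilization event (e.g., an avoidable ED visit).
logit = (
    0.03 * (X["age"] - 50)
    + 1.0 * X["housing_instability"]
    + 0.8 * X["food_insecurity"]
    + 0.6 * X["transportation_barrier"]
    + 1.2 * X["neighborhood_deprivation_index"]
    - 1.5
)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Train a simple gradient-boosted classifier on the SDOH features.
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

# Each patient's predicted probability serves as a risk score that can be
# used to stratify a registry and prioritize outreach.
risk_scores = model.predict_proba(X_test)[:, 1]
print("Hold-out AUC:", round(roc_auc_score(y_test, risk_scores), 3))
```

In practice, the resulting risk scores would be used to stratify a patient registry and prioritize outreach, which is the use case discussed in the rest of this Q&A.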

Q: How far along is AI technology in addressing SDOH?

Dr. Showalter: This may seem like an optimistic vision of the future, but it's already a reality. Last year, we published a study in the American Journal of Managed Care showing that AI could accurately stratify patients' risk of healthcare utilization using only SDOH data combined with basic demographic information on patients' age, gender, race, and address.

Q: How does this use of AI differ from past approaches to SDOH?

Dr. Showalter: This is a significant development. In the past, case managers had to contact patients individually to learn about their SDOH, a time- and resource-intensive process, or make assumptions about their risk based on demographics alone. Now, with only the basic data in the patient registry, they can see which patients are at greatest risk, which factors are driving that risk, and what steps can be taken to reduce it.

For the first time, providers can gain insight into the unseen social and environmental factors that affect their patients' health outcomes and adjust care plans accordingly. Clinicians and case managers can have more productive conversations with patients and be more empathetic to their social and environmental needs, promoting trust and engagement. Most importantly, patients can be matched to the community benefit programs and social workers best positioned to address their SDOH needs.

Q: Clinical AI has been found to be racially biased in the past. How can we ensure AI works to reduce bias, rather than perpetuate it?

Dr. Showalter: Preventing bias must be a priority for any developer of clinical AI tools. It cannot be an afterthought: the possibility of bias must be actively considered and tested for throughout the product's development and use. Data used to train machine learning models should be diverse and representative of the population the model will ultimately be used on. By taking a bias-conscious approach, we can make sure AI works to reduce disparities rather than amplify them. (One simple form of this kind of subgroup testing is sketched after this answer.)

Health equity requires valuing all individuals and populations equally. This is only possible if we take action to address the inequities in social determinants that underlie the inequities in health outcomes. Used correctly, AI can be a powerful tool to help us get there.
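To make the bias-testing point above concrete, here is a minimal, hypothetical sketch of one common check: comparing a risk model's performance (AUC and recall) across demographic subgroups on held-out data before deployment. The data, group labels, model, and 0.5 flagging threshold are all synthetic and illustrative; they do not describe Jvion's actual bias-testing process.

```python
# Illustrative subgroup audit on synthetic data; not a real clinical model.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 6000

# Synthetic cohort with a demographic group label and two illustrative features.
df = pd.DataFrame({
    "group": rng.choice(["A", "B", "C"], size=n),
    "sdoh_index": rng.random(n),          # hypothetical composite SDOH score
    "prior_visits": rng.poisson(2, n),
})

# Synthetic outcome label (1 = adverse event).
p = 1 / (1 + np.exp(-(2.5 * df["sdoh_index"] + 0.3 * df["prior_visits"] - 2.5)))
df["adverse_event"] = (rng.random(n) < p).astype(int)

features = ["sdoh_index", "prior_visits"]
train, test = train_test_split(
    df, test_size=0.3, random_state=1, stratify=df["adverse_event"]
)

model = LogisticRegression().fit(train[features], train["adverse_event"])

test = test.copy()
test["risk"] = model.predict_proba(test[features])[:, 1]
test["flagged"] = (test["risk"] >= 0.5).astype(int)

# Audit: does the model identify truly high-risk patients equally well
# in every demographic group? Large gaps between groups are a red flag.
for group, sub in test.groupby("group"):
    auc = roc_auc_score(sub["adverse_event"], sub["risk"])
    recall = recall_score(sub["adverse_event"], sub["flagged"])
    print(f"group {group}: AUC = {auc:.3f}, recall = {recall:.3f}")
```

Large gaps between groups in a check like this would be a signal to revisit the training data or the model before it is used on patients, in line with the bias-conscious approach described above.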
