Racial and ethnic bias pervasive in EHR notes, UChicago study shows

A study published in Health Affairs this week used machine learning to investigate whether providers’ negative patient descriptors in electronic health records varied by race or ethnicity.  

Compared with white patients, Black patients had more than 2.5 times the odds of having at least one negative descriptor in their history and physical notes.  

“This difference may indicate implicit racial bias not only among individual providers but also among the broader beliefs and attitudes maintained by the healthcare system,” wrote the researchers from the University of Chicago.  

“Our findings raise concerns about stigmatizing language in the EHR and its potential to exacerbate racial and ethnic healthcare disparities,” they observed.  


Researchers in this study examined the use of potentially stigmatizing language in the EHRs of 18,459 patients at a Chicago-based academic medical center.

As the study authors note, decades of evidence point to the unequal treatment patients of color face in the U.S. healthcare system.  

Studies have also pointed to evidence of the impact implicit bias has on patient-provider relationships.  

For this study, researchers developed a model to analyze the clinical notes data set, training it to categorize sentences containing 15 descriptors such as “(non-)compliant,” “combative,” “aggressive” and “unpleasant” as negative, positive or out of context.  
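The study's actual model was trained on clinical notes, but the first step it describes can be sketched as flagging candidate sentences that contain any of the target descriptors before classifying them. The following is a minimal illustration of that flagging step, not the researchers' implementation; the descriptor list, function name, and sample note are hypothetical, and the real model's negative/positive/out-of-context labeling is omitted.

```python
import re

# Subset of the 15 descriptors named in the study (illustrative only)
DESCRIPTORS = ["non-compliant", "compliant", "combative", "aggressive", "unpleasant"]

def flag_candidate_sentences(note: str) -> list[str]:
    """Return the sentences of a note that contain any target descriptor.

    A real pipeline would pass these candidates to a trained classifier
    to decide whether each use is negative, positive, or out of context.
    """
    # Naive sentence split on terminal punctuation followed by whitespace
    sentences = re.split(r"(?<=[.!?])\s+", note)
    pattern = re.compile("|".join(re.escape(d) for d in DESCRIPTORS), re.IGNORECASE)
    return [s for s in sentences if pattern.search(s)]

# Hypothetical note text for demonstration
note = ("Patient was combative on arrival. Vitals stable. "
        "Patient described as non-compliant with medications.")
print(flag_candidate_sentences(note))
```

Keyword matching alone cannot distinguish, say, "aggressive treatment plan" from an "aggressive" patient, which is why the researchers trained a model to make that distinction.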

In models adjusted for socio-demographic and health characteristics, Black patients had 2.54 times the adjusted odds of having one or more negative descriptors in the EHR compared with white patients.   
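An "adjusted odds ratio" like the 2.54 reported here is conventionally obtained by exponentiating the fitted logistic-regression coefficient on the indicator of interest, with the socio-demographic and health covariates held in the model. The coefficient value below is hypothetical, chosen only to illustrate the arithmetic, and is not taken from the study.

```python
import math

# Hypothetical fitted coefficient on the Black-patient indicator from a
# logistic regression adjusted for other covariates (illustrative value)
beta = 0.932

# The adjusted odds ratio is the exponentiated coefficient
odds_ratio = math.exp(beta)
print(round(odds_ratio, 2))  # approximately 2.54
```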

The likelihood was similar when patients with ICD-10 codes related to delirium, substance use or other mental and behavioral diagnoses were excluded from the sample.  

Medicaid users and unmarried patients were also more likely to have negative descriptors in the EHR.

By contrast, notes written after March 1, 2020, and those written in an outpatient setting were less likely to contain a negative descriptor.  

The researchers raised concerns about the enduring ramifications of such records, although they also noted future inquiries in that regard are necessary.  

“We theorize that negative descriptors in a patient’s EHR may assign negative intrinsic value to patients. Subsequent providers may read, be affected by, and perpetuate the negative descriptors, reinforcing stigma to other healthcare teams,” they wrote.  

“It is also plausible that if a provider with implicit biases were to document a patient encounter with stigmatizing language, the note may influence the perceptions and decisions of other members of the care team, irrespective of the other team members’ biases or lack thereof,” they continued.  

The researchers acknowledged several limitations with the study, including the potential for COVID-19 to affect clinicians’ behavior.  

“We recognize that the use of negative descriptors might not necessarily reflect bias among individual providers; rather, it may reflect a broader systemic acceptability of using negative patient descriptors as a surrogate for identifying structural barriers,” they added.  


Researchers have sought to examine the role health IT can play in reproducing bias.  

For instance, a study from 2020 found that artificial intelligence can make COVID-19 health disparities worse for people of color.  

And experts have noted that even innocuous-seeming data may have unforeseen consequences.  


“The goal of addressing implicit bias is to address the underlying mechanisms that prompt the use of negative descriptors to describe patients,” wrote researchers in the Health Affairs study.  

“This includes preventing the introduction of biased language by providers, preventing the perpetuation of biased language by members of the health care team, and increasing awareness of the effects of providers’ language on the patient relationship.  

“Interventions may include provider bias training and addressing healthcare system factors that may predispose providers toward expressions of bias,” they added.

Kat Jercich is senior editor of Healthcare IT News.
Twitter: @kjercich
Email: [email protected]
Healthcare IT News is a HIMSS Media publication.
