A coalition of medical experts from New York City hospitals has pledged to tackle racially biased computer algorithms that are used to diagnose diseases or decide routes of patient care. It aims to do away with at least one algorithm that uses race as a factor in determining patient treatment within the next two years.
This announcement to launch the Coalition to End Racism in Clinical Algorithms (CERCA) comes after the city's health department declared in October that racism is a public health crisis. The 12-member group includes leaders and medical experts from public and private hospitals across the city. They will collaborate with health providers in raising awareness.
“It starts with consciousness, understanding the problem,” said Dr. Michelle Morse, chief medical officer at the city’s Department of Health and Mental Hygiene. “We have to agree the practice is harmful and then do the actual work with the laboratory.”
Despite their shortcomings, clinical algorithms are vital tools that help doctors make complicated, critical decisions about the best treatment option for an individual patient. Factors such as age and medical history feed into the calculation of care.
Many of these algorithms also consider race, producing different, and often worse, outcomes based on a factor, skin color, that is unrelated to the actual ailment.
The coalition vows to do more than sound the alarm. They are reevaluating all algorithms that include race, and there are already two on CERCA’s chopping block.
The first is the VBAC Calculator, which obstetricians use to determine the safest birthing method for an expectant mother who has had a previous cesarean delivery. A 2019 analysis showed that, because of its race-based factor, the calculator was nearly 20% less likely to recommend vaginal birth for Black or Hispanic mothers than for white mothers with the same health profiles. This bias leads doctors to steer their patients of color toward another cesarean, a procedure that is riskier and has a longer recovery time.
These decisions are a matter of life and death. In New York City, Black women are eight times more likely than white women to die of childbirth complications, a disparity higher than the national average.
But being Black is not the real risk factor, Morse said. It’s poverty, marital status and whether a patient has health insurance. In medical algorithms, these factors are sometimes substituted with race because, statistically, more Black Americans live in poverty and lack health insurance. Other experts agree.
“We should just leave race out of it,” said Dr. Kathie-Ann Joseph, a professor of surgery and population health at NYU Langone. “Because what happens is people are trying to use race as a substitute for these genetic variations.”
Earlier this year, research from the National Institute of Child Health and Human Development published in the American Journal of Obstetrics and Gynecology showed a new VBAC calculator without a race factor could more accurately calculate the probability of a successful vaginal birth. It relied on concrete medical factors like a patient’s history of hypertension.
Another example of a biased medical algorithm is the one used to determine a metric called eGFR, which estimates how well the kidneys filter waste from the body. The calculation relies heavily on creatinine levels; high creatinine signals poorer kidney function. The algorithm applies a race-based correction that automatically increases the eGFR measured in Black patients, leading doctors to believe their kidneys are healthier. This often delays referrals to a specialist or a kidney transplant until a patient’s condition has degraded further.
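To see how a single race multiplier changes the result, here is a simplified sketch of the 2009 CKD-EPI creatinine equation, the widely used formula behind the race-adjusted eGFR described above. The coefficients are from the published 2009 equation; this is a teaching illustration, not a clinical tool, and newer race-free versions of the equation now exist.

```python
def egfr_ckd_epi_2009(creatinine_mg_dl, age, female, black):
    """Estimated glomerular filtration rate (mL/min per 1.73 m^2),
    per the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(creatinine_mg_dl / kappa, 1) ** alpha
            * max(creatinine_mg_dl / kappa, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race-based correction that inflates the score
    return egfr

# Same patient, same lab result; the race multiplier alone raises the
# estimate by about 16%, which can push a borderline score above a
# specialist-referral or transplant-eligibility threshold.
same_labs = dict(creatinine_mg_dl=1.4, age=55, female=False)
print(egfr_ckd_epi_2009(black=False, **same_labs))
print(egfr_ckd_epi_2009(black=True, **same_labs))
```

Because the correction is a flat multiplier applied after every other factor, two patients with identical kidneys receive different scores based on race alone.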
A study published this month by The Lancet found that if the race variable was removed, about 10% more Black Americans would have been eligible for referral to a specialist. Black Americans are nearly four times as likely as white patients to suffer from kidney failure, according to the National Institutes of Health.
Morse said this mission feels personal because her father suffered from chronic kidney disease and from diabetes so severe that his leg was amputated. He is, she said, the kind of patient who would have been affected by the skewed kidney disease algorithm.
“A Black person’s kidney, biologically, at birth is certainly not different from a white person’s kidney,” said Morse. “However, what is different is the experience – Black people are more likely to live in substandard housing and live below the poverty line and less likely to have access to health care as a result of structural racism.”
While more health professionals are becoming aware of systemic racism in medicine, Morse said there are many algorithms to fix before their work is done. Changing universally accepted practices can be difficult because it requires convincing doctors that the way they have always diagnosed patients must change.
“A lot of the biggest obstacles are starting to lessen because research is showing that this practice is horrible and needs to change,” said Morse. “But some obstacles remain because providers do not know the history of how white supremacy and racism shaped the very way that we practice medicine.”
Editor's note: This story was updated to correct that Dr. Morse's father's leg was amputated as a result of diabetes.