Researchers at Mass General Brigham, looking to use AI to address the undertreatment of pain in certain patient groups, tested whether large language models could help reduce racial disparities in pain assessment and medication prescribing.
The LLMs did not show any bias based on race or gender in testing and could prove to be helpful tools in pain management, ensuring equitable treatment across patient groups, MGB researchers said in a statement Monday.
“We believe our study provides key data showing how AI has the ability to reduce bias and improve equity in healthcare,” said Dr. Marc Succi, strategic innovation leader at Mass General Brigham Innovation and corresponding author of the study, in a statement.
WHY IS THIS IMPORTANT
Health system researchers tasked OpenAI’s GPT-4 and Google’s Gemini with performing subjective pain assessments and making comprehensive pain treatment recommendations for 480 representative pain cases they had prepared.
To generate the dataset, researchers started with 40 de-identified cases reporting various types of pain, such as back pain, abdominal pain, and headaches, with race and gender identifiers removed. They then generated every unique combination of gender (male or female) and race across six U.S. Centers for Disease Control and Prevention racial categories (American Indian or Alaska Native, Asian, Black, Latino or Hispanic, Native Hawaiian or other Pacific Islander, and White), yielding the 480 cases (40 cases x 6 races x 2 genders).
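In concrete terms, that expansion is a simple cross product of the 40 base vignettes with the six race categories and two genders. The Python sketch below illustrates one way such a dataset could be assembled; the case structure, prompt wording, and function names are illustrative assumptions, not taken from the study's actual code.

```python
# Minimal sketch of the case-expansion step, assuming each of the 40
# de-identified vignettes is a plain text string. Names and structure
# here are illustrative, not the study's real pipeline.
from itertools import product

RACES = [
    "American Indian or Alaska Native", "Asian", "Black",
    "Latino or Hispanic", "Native Hawaiian or other Pacific Islander", "White",
]
GENDERS = ["male", "female"]

def expand_cases(base_cases):
    """Cross each de-identified case with every race/gender combination."""
    expanded = []
    for case_id, vignette in enumerate(base_cases):
        for race, gender in product(RACES, GENDERS):
            expanded.append({
                "case_id": case_id,
                "race": race,
                "gender": gender,
                # Prepend demographic labels to an otherwise identical vignette.
                "prompt": f"The patient is a {race} {gender}. {vignette}",
            })
    return expanded

# 40 base cases x 6 races x 2 genders = 480 variants
assert len(expand_cases(["placeholder vignette"] * 40)) == 480
```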
For each patient case included in the dataset, the models assessed and assigned subjective pain scores before making pain management recommendations, which included pharmacological and nonpharmacological interventions.
The researchers performed univariate analyses to assess the association between race/ethnicity or gender and the specific outcome measures suggested by the LLMs, including subjective pain rating and opioid name, order, and dosing recommendations, MGB reported.
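A univariate association check of the kind described could look roughly like the sketch below, assuming the LLM outputs have been tabulated into a pandas DataFrame. The column names and the choice of a chi-square test of independence for the categorical pain rating are assumptions made for illustration, not the study's published methods.

```python
# Rough sketch of a univariate association test between patient race and the
# LLM's categorical pain rating, using a chi-square test of independence.
import pandas as pd
from scipy.stats import chi2_contingency

def pain_rating_vs_race(df: pd.DataFrame):
    """df has one row per case variant, with 'race' and 'llm_pain_rating' columns."""
    contingency = pd.crosstab(df["race"], df["llm_pain_rating"])
    chi2, p_value, dof, _expected = chi2_contingency(contingency)
    return chi2, p_value, dof

# Toy example; a non-significant p-value (e.g. p > 0.05) would be consistent
# with the study's finding that ratings did not differ by race.
toy = pd.DataFrame({
    "race": ["Black", "White", "Asian", "Black"],
    "llm_pain_rating": ["severe", "severe", "moderate", "moderate"],
})
print(pain_rating_vs_race(toy))
```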
GPT-4 most often rated pain as “severe,” while Gemini most often rated it as “moderate,” according to the results published September 6 in PAIN, the journal of the International Association for the Study of Pain.
Of note, Gemini recommended opioids more often, suggesting that GPT-4 is more cautious when making recommendations regarding opioid prescribing.
The researchers said that while additional analysis of both AI models could help determine which is more closely aligned with clinical expectations, the study showed that the LLMs were able to assess patients’ pain without being swayed by race.
“These results are reassuring because patient race, ethnicity, and gender did not influence recommendations, indicating that these LLM programs have the potential to help address existing biases in healthcare,” Cameron Young and Ellie Einchen, co-authors from Harvard Medical School, said in a statement.
“I think AI algorithms in the short term will be complementary tools that can essentially serve as a second pair of eyes that work alongside healthcare professionals,” added Succi, who is also associate chair of innovation and commercialization for enterprise radiology and executive director of Mass General Brigham’s Medically Engineered Solutions in Healthcare (MESH) Incubator.
The researchers said future studies should consider the impact of race on LLM treatment recommendations in other fields of medicine, as well as assess variables related to nonbinary gender.
BIGGER TREND
Just as biased algorithms have exacerbated the disproportionate impact of COVID-19 on people of color, research has shown that healthcare providers are more likely to underestimate and undertreat pain in Black and other minority patients.
While AI has been found to exacerbate racial bias in many areas of medicine and healthcare, LLMs could also help mitigate clinician bias and support equitable pain management.
After opioid prescriptions surged in the 1990s and 2000s on the strength of what proved to be false promises of safety, the truth about addiction and dependency came to light when hundreds of local governments filed lawsuits against Purdue Pharma, the manufacturer of OxyContin, in 2017.
Health systems began to recognize surgery as a major factor in opioid initiation for patients who developed opioid dependence. Intermountain Health and other providers then focused on reducing opioid prescriptions, educating caregivers, standardizing pain management techniques, and using AI-powered analytics to sustain practice changes and improve patient safety.
Tech developers have also used analytics in mobile care management tools to help doctors make sure the right amount of pain medication is being administered and that patients are following their treatment plans.
Although AI does not directly advise patients, Steven Walther of Continuous Precision Medicine said in July that data-driven technologies could help both doctors and patients reduce their dependence on opioids and other pain medications.
In a fully randomized study, patients using the company’s mobile app “were 92% more likely to adhere to their medication recommendations,” Walther said.
ON THE RECORD
“There are a lot of things we need to consider when integrating AI into treatment plans, like the risk of overprescribing or underprescribing pain medication or whether patients are willing to accept AI-based treatment plans,” Succi said. “These are all questions we’re considering.”
