A recent Gartner study found that most customers are “AI-shy”: 64% of respondents said they would prefer that companies not implement AI in the customer experience. Customers were also concerned about AI and misinformation (42%), data security (34%) and bias/inequity (25%).
Ethical AI can help organizations create innovative, trustworthy user experiences – protecting brands, enabling them to maintain a competitive advantage and nurturing better customer relationships. And ethical AI is part of WellPower’s story.
PROBLEM
In the mental health field, there aren’t enough therapists to help everyone who’s struggling. Community mental health centers like WellPower in Colorado serve some of the most vulnerable populations in need of help.
Because of the complex needs of the individuals they serve, WellPower clinicians must navigate more demanding documentation rules than therapists in private practice. These additional rules create an administrative burden that takes away time that could otherwise be spent on clinical care.
WellPower explored how technology could serve as a driver for employee engagement in mental health issues.
The provider organization turned to AI company Iliff Innovation Lab to explore how health informatics could make it easier for patients to access care, such as through telemedicine; how patients could move through treatment more quickly by facilitating evidence-based practices and remote treatment monitoring; and how WellPower could reduce administrative burden by making it easier for providers to create high-quality, accurate records while allowing them to focus on delivering care.
“When used correctly, clinical documentation is a particularly promising area for AI implementation, especially in behavioral health,” said Wes Williams, CIO and vice president of WellPower. “Large language models have proven particularly effective at summarizing immense amounts of information.
“During a typical 45-minute psychotherapy session, there is a lot of information to summarize to document the service,” he continued. “Staff often spend 10 or more minutes completing the paperwork for each service, which adds hours that could otherwise be spent providing clinical care.”
APPLICATION
WellPower’s commitment to healthcare equity informs how the company approaches technology adoption, and partnering with Iliff is vital to continuing that mission, Williams said.
“AI tools are often black boxes that hide how decisions are made and can perpetuate biases that have led to the health inequities faced by the people we serve,” he explained. “This puts us in a hard position because not using these new tools would deny their benefits to the people who need them most, but adopting them without assessing for bias could amplify inequities if the AI system had historical biases in health care built into it.
“We found a system that used AI as a passive listening tool that could join therapy sessions (both telemedicine and in-person) and serve as a kind of digital scribe, generating working notes for our clinicians to review and approve,” he added. “But we had to make sure the digital scribe could be trusted to generate summaries of therapy sessions that were accurate, useful and unbiased.”
Behavioral health data is among the most sensitive from a privacy and security perspective; these safeguards are needed to ensure people feel comfortable seeking the care they need, he continued. For that reason, WellPower must carefully vet any new system, especially one based on artificial intelligence, he said.
RESULTS
To implement an AI-powered digital scribe, WellPower needed to be certain it would not compromise the privacy or security of the people it serves.
“Many therapists were initially hesitant to try the new system, citing these legitimate concerns,” said Alires Almon, Chief Innovation Officer at WellPower. “We worked with the Iliff team to ensure the digital scribe was built ethically with privacy at the forefront.
“For example, the system doesn’t record a therapy session, but encodes the conversation on the fly,” she continued. “That means that at the end of the session, the only thing that’s stored is metadata about what topics were discussed during the session. With the insights from the Iliff team, we were able to protect the privacy of our patients while opening up more time for care.”
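The privacy pattern Almon describes – processing the conversation transiently and persisting only topic metadata – can be illustrated with a minimal sketch. This is a hypothetical toy, not the actual Eleos system: the `TOPIC_KEYWORDS` table and keyword matching stand in for whatever model performs the real encoding; the point is that raw utterances are handled in memory and never stored.

```python
# Hypothetical sketch of the "digital scribe" privacy pattern described above.
# Raw conversation text is processed transiently; only topic metadata survives.
# Keyword matching is a stand-in for the real encoding model.

TOPIC_KEYWORDS = {
    "sleep": {"sleep", "insomnia"},
    "anxiety": {"anxious", "worry", "panic"},
    "medication": {"medication", "dose", "prescription"},
}

def tag_topics(utterance: str) -> set[str]:
    """Return the topic labels whose keywords appear in one utterance."""
    words = set(utterance.lower().split())
    return {topic for topic, kws in TOPIC_KEYWORDS.items() if words & kws}

def summarize_session(utterances) -> list[str]:
    """Stream utterances, keeping only aggregate topic metadata."""
    topics: set[str] = set()
    for u in utterances:          # each chunk is handled in memory...
        topics |= tag_topics(u)   # ...and the raw text is never persisted
    return sorted(topics)         # the session record is metadata only

session = [
    "I haven't been able to sleep this week.",
    "The new medication dose isn't helping.",
]
print(summarize_session(session))  # → ['medication', 'sleep']
```

The design choice mirrored here is that the stored artifact (the sorted topic list) cannot be used to reconstruct what was actually said, which is what makes the approach more privacy-preserving than recording and transcribing sessions.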
She added that the use of an AI-assisted platform to support transcription and drafting of progress notes has significantly improved the quality of therapy for both staff and the people WellPower helps.
“Since adopting the Eleos system, WellPower has seen significant improvement in staff’s ability to complete progress notes,” Almon reports. “Three out of four outpatient therapists use the system.
“For this group, the average time to complete documentation improved by 75%, and the total documentation time decreased by 60% (reducing note-taking time from 10 minutes to 4 minutes),” she said. “Our therapists were so excited about working with Eleos that some said they would think twice about leaving WellPower because of their experience with Eleos.”
ADVICE FOR OTHERS
As Almon noted, AI is a new and exciting endeavor in health informatics, but it comes with its own unique baggage, influenced by science fiction, media hype and the real possibilities of the field.
“It’s important for your organization to educate and define AI for your staff,” she advised. “Explain how it will be used and what processes and policies will be in place to protect them and their customers. AI is not perfect and will evolve.
“If possible, before you start implementing AI-enabled tools, take a pulse to assess the level of understanding of AI and what people think about AI,” she continued. “Partnering with a program like Iliff’s Trust AI Framework not only helps you choose ethical technology to use, but also communicates that your organization has considered the harms that could occur from AI-enabled platforms.”
This is more important than the results themselves, she added.
“Finally, reassure staff that AI cannot replace them,” she concluded. “Human relationships are the most important relationships in treating individuals. AI is there to help people in their roles; it is an assistive technology. AI can support and assist, but it will never replace the therapeutic connection.”
