AI therapists match human care in groundbreaking mental health trials

According to the first clinical trial of its kind, people struggling with depression saw a 51% reduction in symptoms after using the AI therapy app for just four weeks, a result on par with what human therapists typically achieve.
The study, published on March 27 in the New England Journal of Medicine, provides compelling evidence that AI could help address the United States' severe shortage of mental health providers while delivering clinically significant results.
Researchers at Dartmouth College developed and tested Therabot, a smartphone-based chatbot designed to provide evidence-based therapeutic support through natural text conversations. The system is trained on current best practices in psychotherapy and cognitive behavioral therapy.
“The improvement in symptoms we observed is comparable to what is reported for traditional outpatient treatment, suggesting that this AI-assisted approach may offer clinically meaningful benefits,” said Nicholas Jacobson, senior author of the study and associate professor of biomedical data science and psychiatry at Dartmouth’s Geisel School of Medicine.
The trial recruited 106 participants diagnosed with major depressive disorder, generalized anxiety disorder, or an eating disorder. They interacted with Therabot through a smartphone app, responding to prompts or starting a conversation whenever they needed support. Another 104 participants with similar diagnoses formed a control group.
The results showed that participants with generalized anxiety disorder had an average 31% reduction in symptoms, with many moving from a moderate to a mild anxiety classification. Even participants with eating disorders, which are traditionally more challenging to treat, showed a 19% reduction in concerns about body image and weight, significantly exceeding the control group.
“Our results are comparable to what we would see for people with access to gold-standard cognitive behavioral therapy through outpatient providers,” Jacobson explained. “We are talking about potentially giving people the equivalent of the best treatment available in the care system over a shorter period of time.”
The access gap in mental health care remains alarming: for every available provider in the United States, there are approximately 1,600 patients with depression or anxiety alone.
“There is no substitute for in-person care, but there are nowhere near enough providers to go around,” Jacobson noted. “We would like to see generative AI help provide mental health support to the huge number of people outside of in-person care systems.”
Perhaps most surprising was how readily participants formed emotional connections with the digital therapist. Users reported levels of “therapeutic alliance,” the key bond of trust between patient and therapist, comparable to what patients typically experience with human providers.
“We didn’t expect people to treat the software almost like a friend,” Jacobson said. “My sense is that people also felt comfortable talking to a bot because it won’t judge them.”
Many participants initiated conversations during vulnerable moments, such as late at night, when human therapists were not available.
Study participants engaged with Therabot for an average of six hours over the course of the trial, the equivalent of about eight therapy sessions. Nearly 75% of them were not receiving any other form of treatment during this period.
Despite the encouraging results, the researchers stress that AI therapy still requires substantial human oversight. The research team closely monitored the interactions and was ready to intervene if concerning content emerged.
“While these results are very promising, no generative AI agent is ready to operate fully autonomously in mental health,” warned Michael Heinz, the study’s first author and assistant professor of psychiatry at Dartmouth.
If Therabot detects potential self-harm content, it is programmed to provide immediate crisis resources, including prompts to call 911 or a suicide prevention hotline.
The Dartmouth team has been developing Therabot since 2019, well before the recent explosion of AI tools like ChatGPT. Their cautious, methodical approach stands in contrast to the many mental health apps that now incorporate AI without comparable clinical validation.
“Since ChatGPT’s release, a lot of people have rushed into this space, and it’s easy to produce an impressive proof of concept at first glance, but the safety and efficacy aren’t well established,” Jacobson warned. “This is a case that really requires diligent oversight, and providing that is what sets us apart in this space.”
As the research continues, the team envisions AI therapy not as a replacement for human providers, but as a supplemental tool that could greatly expand access to mental health care for people who currently cannot obtain it.