
Augmenting Mental Healthcare With AI: Benefits and Concerns

Artificial Intelligence (AI) has become so widespread and such a topic of conversation that some have begun to call it “the new electricity.” The more the technology develops, the more uses for it we seem to find. But with those uses and many benefits, there are lingering concerns — especially with something as sensitive as mental healthcare. 

In a recent clinical education meeting, Dr. Giovanna Franklin, PMHNP, gave a presentation on how AI is currently used to augment mental healthcare, its potential moving forward, and some of the concerns it raises. Here is what our clinicians learned.

What Is AI?

John McCarthy, one of the founders of artificial intelligence, defined it as “the science and engineering of making intelligent machines.” It is “artificial” because the simulation of intelligence comes from a computer rather than a human. AI systems are designed to think and learn from experience, taking cues from their environment to solve problems, assess risks, and make predictions.

There are many current applications of AI that we interact with every day, such as:

  • Face recognition on smartphones
  • Social media algorithms
  • Streaming service recommendations

The reason that AI is so successful is that it can provide rapid analysis of large data sets that would take humans hours or days to sort through. This makes AI a great tool for augmenting many complex industries, including healthcare.

Uses of AI in Healthcare

In the healthcare world, AI is used for things like early diagnosis and the understanding of disease progression. AI analyzes large sets of medical records in order to search for patterns and commonalities. This allows providers to get a sense of how their cases might compare. It can also help providers optimize patient treatment plans based on what has and hasn’t worked for patients with similar surrounding factors.

Currently, AI is used in ophthalmology to analyze retina scans, as well as in oncology and radiology for cancer detection. It can also be used in mental health.

AI In Mental Healthcare

Mental healthcare differs from physical healthcare in that it relies more heavily on soft skills such as relationship building with patients. However, AI can be used to enhance the provider’s understanding of the patient. It can rapidly synthesize information from a number of sources, and it isn’t limited by lived experience as providers might be.

How is AI able to do this? Through machine learning. Machine learning is the process by which AI systems learn from examples in training data, then adapt and improve on their own. There are two types of machine learning: supervised and unsupervised.

Supervised Machine Learning

In supervised machine learning, data is pre-labeled for the machine. For instance, there might be labels such as “depressive” or “non-depressive.” The algorithm learns the difference between these two. “The labels act as a teacher,” Franklin explained. 

Then the machine is tested on unlabeled data. It searches for patterns and similarities and sorts data into the preset labels based on what it has learned.
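As a rough illustration of the supervised approach, here is a toy word-count classifier in Python. The training sentences, labels, and scoring rule are all made up for this sketch and are far simpler than anything a real clinical system would use:

```python
# Toy sketch of supervised learning: pre-labeled text "teaches" the model,
# which then sorts unlabeled text into the preset labels.
from collections import Counter

def train(labeled_examples):
    """Count word frequencies per label from pre-labeled training text."""
    counts = {}
    for text, label in labeled_examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Score unlabeled text against each label's learned word frequencies."""
    def score(label):
        total = sum(counts[label].values())
        return sum(counts[label][w] / total for w in text.lower().split())
    return max(counts, key=score)

# Invented examples, standing in for labeled clinical training data.
training_data = [
    ("i feel hopeless and tired all the time", "depressive"),
    ("nothing brings me joy anymore", "depressive"),
    ("i had a great week of sleep", "non-depressive"),
    ("feeling energetic and optimistic lately", "non-depressive"),
]
model = train(training_data)
print(classify(model, "tired and hopeless lately"))  # depressive
```

In practice the labeled examples would number in the thousands and the model would be far more sophisticated, but the principle is the same: the labels act as the teacher.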

Unsupervised Machine Learning

In unsupervised machine learning, the algorithm recognizes patterns in the input data and sorts and labels the data on its own. Clinicians can then review the results for accuracy. Unsupervised machine learning can reveal the underlying structures of the data with less chance of human bias. For instance, it could be used to identify and categorize subtypes of illnesses.
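The unsupervised approach can be sketched with a toy k-means clustering pass, here over made-up one-dimensional symptom-severity scores. Unlike the supervised case, no labels are given; the algorithm discovers the two groups on its own:

```python
# Toy sketch of unsupervised learning: k-means clustering of unlabeled
# scores. The data and the number of clusters (k) are invented for
# illustration.
import random

def kmeans(points, k, iterations=20, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iterations):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for x in points:
            nearest = min(range(k), key=lambda i: (x - centers[i]) ** 2)
            clusters[nearest].append(x)
        # Move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Unlabeled severity scores with two natural groups.
scores = [1.0, 1.2, 0.8, 5.9, 6.1, 6.3]
centers, clusters = kmeans(scores, k=2)
print(sorted(round(c, 1) for c in centers))  # roughly [1.0, 6.1]
```

The algorithm recovers the two underlying groups (around 1.0 and 6.1) without ever being told they exist, which is how unsupervised methods can surface subtypes a human labeler might not have anticipated.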

Machine learning can be used to analyze an entire health record, from genetic and social factors to previous health conditions and other data. It can then predict diagnosis and even the most effective treatment.

Benefits of AI in Mental Healthcare

There are a number of benefits AI can offer mental healthcare, both now and as the technology progresses. Some of these benefits include:

  • Improved detection and diagnosis through machine learning
  • Improved patient monitoring, such as predicting worsening conditions and adherence to treatment
  • Treatment recommendations based on patient data
  • More efficient workflow for providers, especially by using natural language processing to listen in on sessions

Natural language processing (NLP) is a type of AI used in mental healthcare for session monitoring. The goal is to learn, understand, and then interpret language. It learns from unstructured text and generates meaning with language translation, semantic understanding, and information extraction. For instance, clinician notes or session records can be analyzed by NLP.

This allows clinicians to focus on the human aspect of their job, building their relationship with the client, while also reducing documentation time and allowing for more informed training and safety monitoring.
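As a toy stand-in for what an NLP pipeline does with unstructured clinician text, here is a rule-based extraction sketch in Python. The symptom terms, dosage pattern, and example note are invented for illustration and bear no relation to how Lyssn or any real clinical system works:

```python
# Toy sketch of information extraction from an unstructured clinician
# note. Real NLP systems use learned models, not hand-written rules.
import re

# Hypothetical symptom vocabulary for this sketch.
SYMPTOM_TERMS = ["insomnia", "low mood", "anxiety", "fatigue"]

def extract(note):
    """Pull symptom mentions and medication dosages out of free text."""
    text = note.lower()
    symptoms = [t for t in SYMPTOM_TERMS if t in text]
    # Match patterns like "sertraline 50 mg".
    meds = re.findall(r"([a-z]+)\s+(\d+)\s*mg", text)
    return {"symptoms": symptoms, "medications": meds}

note = "Patient reports insomnia and low mood. Continue sertraline 50 mg."
print(extract(note))
# {'symptoms': ['insomnia', 'low mood'], 'medications': [('sertraline', '50')]}
```

Turning free text into structured fields like these is the kind of step that lets a system summarize a session or flag safety concerns without the clinician doing the data entry by hand.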

AI in Action: Lyssn

Lyssn AI1 is software designed to improve the quality of mental healthcare through better training of mental health clinicians and more successful patient outcomes. In 2021, over 1 in 5 Americans experienced a mental illness requiring mental health treatment.2 However, half of those who begin therapy drop out, and many who remain in therapy do not get better.3 This often comes down to the quality of training among providers.

Lyssn uses NLP to monitor the quality of treatment directly after the session. It can help providers identify steps to improve and provide performance-based reviews. It has been adopted by universities, telehealth providers, and addiction rehabilitation centers. This is one major example of how AI can be used in mental healthcare.

How Accurate Is AI?

Of course, we’ve all seen some of the foibles that can come from bad AI. That raises the question: when the stakes are as high as a patient’s life, how accurate is the AI we use?

There have been a few studies to this effect, particularly when it comes to mental healthcare. In 2016, a study entitled “Automatic Detection of ADHD and ASD from Expressive Behaviour in RGBD Data” found that AI could distinguish between patients who had ADHD or autism spectrum disorder and those who did not.4 In 2018, a study that used speech analysis to identify at-risk patients and predict psychosis found the AI software used to be 83% accurate.5

In one other study, patients were given a smartphone app that collected data such as voice recordings, text information, and audio diaries. An AI software then analyzed all this data and tracked changes and symptoms in real time.6 This predicted depression and PTSD in many patients and gave them something to work on for their next therapy session.

Questions, Concerns, and Limitations

Of course, for as many people as are enthusiastic about AI, there are still those who have reservations, and those reservations are not unfounded. AI still has limitations at this stage. For instance, if the humans training machine learning systems are unaware of their own biases, the AI software will simply replicate those biases. There are also concerns about safety, privacy, and patient trust.

What the Clinicians Have To Say

Clinician Maura Dentino shared some of these concerns. “I’m curious, thinking about patients being uncomfortable disclosing those things or being monitored, if there’s a lot of paranoia or psychiatric type of thinking, what they might think of AI,” she said. “It could be very distressing to them.” 

Dr. Janine Inez expressed mixed feelings. “The tech nerd in me thinks this is pretty cool. It’s highlighting a need: that we need better training for therapists,” she said. “And it would be cool if there were a system in place to ensure that.”

“But what kind of therapy is being evaluated?” she asked with reference to Lyssn’s example of evaluating therapy sessions. “There are so many kinds of therapies. What if a client doesn’t respond well to empathy, what if they respond well to direction? There are a lot of ways of helping someone within a therapeutic space. It’s hard to say whether it works or not because there are so many ways to go about it.”

At the end of the day, Inez felt that AI was great for tasks that didn’t require critical thinking skills, such as filling out prior authorizations. But for more complex tasks, “What is the goal of this, ultimately? Do we want to improve patient outcomes? Reduce the workload for providers? Is it just something cool that we can do and so we should?”

Dr. Franklin responded that, “The goal would be to have a combination. How can this benefit the patients and optimize the workload for us as well? But coming up with the perfect combination of both would be challenging.”

Dr. Kimberly Mangla, a clinical consultant with Rivia Mind, seemed to view the possibilities of AI hopefully. “Areas that I think would be helpful include diagnostics: the subtleties of language or affects when trying to diagnose something like schizophrenia or autism spectrum or ADHD. I also think it could be helpful for making medication recommendations. We have so many options that are so similar yet different. I think a lot of things we could take into account are possible in terms of previous trials, side effects, genetics, culture, all of those things.”

Dr. Raymond Raad of Rivia, however, felt that clinicians could easily do most of what was proposed for AI. “My assessment is it’s not telling us anything that the patient isn’t already aware of or the clinician wouldn’t pick up very easily,” he said. However, he did believe AI had the potential for usefulness. “The question is, is the technology one step away or ten steps away?”

Dr. Holly White added a final concern about privacy: “I worry how it could be used instead of appropriate patient care.”

Ultimately, the goal of AI would be to augment mental healthcare rather than replace clinicians and human patient care. Mental healthcare will always need the soft skills that can only be accomplished by human clinicians. But AI could help us redefine mental healthcare with an objective view of the practice.

The team at Rivia Mind is committed to learning as much as we can about the field of mental healthcare and the tools that can be used to better it, so we can provide better care to our community. Contact us today to learn more about how we can help on your journey to mental wellness or to schedule a free 15-minute consultation.