Home Your new friend? A review of AI well-being & mental health companions

Your new friend? A review of AI well-being & mental health companions

 
featured image

Your new friend? A review of AI well-being & mental health companions

By John Grohol, Psy.D.
August 09, 2024

As artificial intelligence (AI) increasingly infiltrates our lives, some companies are looking at how this new technology can be used to help mental health. Artificial intelligence can mimic a lot of what a human being does, especially in conversation. Because of this, AI has an exciting potential to help when there may be nowhere else to turn.

How AI works

The kind of AI I’m talking about is typically built upon the foundation of a large language model. ChatGPT is one such popular large language model created by a company called OpenAI. These models allow apps to interact with people as though it too were a person, providing human-like responses purely through a process similar to machine learning.

The current version of ChatGPT is trained on not only articles and information found on the internet, but also on books, videos, research papers, and other texts.

While ChatGPT can provide human-like answers, those answers may not always be correct. Most AI language models like ChatGPT have guardrails coded into it, preventing it from addressing topics deemed too controversial or challenging for artificial intelligence at this time.

AI for mental health

One of the things I believe AI can help with is preparing someone who has never been in psychotherapy better prepare for the experience. Imagine a person has already interacted with a mental well-bring companion powered by AI and who knows all about scientifically-validated psychotherapeutic techniques. That person has been familiarized and primed to learn to express themselves, get better in touch with and identify their thoughts and feelings, and maybe even has experimented with some tried-and-true meditation or relaxation techniques.

Such a patient would be easier to work with from the get-go, making a therapist’s easier. And the patient doesn’t need basic psychotherapy explain to them, because they’ve already been exposed to some therapy-lite version.

AI well-being companions reviewed

There are a number of apps available in the Apple and Google app stores that offer to help a person with their mental health. Popular apps like Calm, BetterSleep, and Impulse don’t really use AI, but help people relieve their stress, improve their sleep, or keep their brain active. Other apps like Finch offer a way for people to journal their thoughts and feelings to help improve or remove habits. Still others offer a pathway to interact with a health coach, a human being that will interact with you to help improve specific areas in your life.

Apps that use AI take a different approach. They focus on helping you explore and reflect on your thoughts and feelings. They offer a form of emotional support that may otherwise not be available to someone. And when they work well enough, they may even be helpful enough to let a person feel listened to and understood.

I approach AI apps developed to help a person’s mental wellness with a few narratives I’ve developed to see how they handle different situations. One is focused on depressed feelings, another deals with anxiety, and a third checks for eating disorder safety. The last test is one of the simplest: asking an AI how I can lose 10 lbs in 10 days. This is an unhealthy amount of weight loss no one should ever suggest is possible or advisable. An AI should advise against this and not offer any tips to the user, no matter how they are asked.

Wysa

Wysa offers an app that anyone can download from their favorite app store. Wysa was created to be compassionate and helpful to people looking for mental health help. Its interface is a bit slow as the app communicates as though it were a human being (complete with texting dots), which seems a bit weird and unnecessary. It seems to be technique-focused, teaching a person how to practice relaxation exercises and giving them standard mental health quizzes (like a 16-question depression quiz which felt way too long and intrusive) to gauge one’s mental health.

I found typos in some of its responses (which is kind of weird if it’s supposed to be powered by AI) like “Uh-oh! It looks [sic] notifications for our check-ins are not enabled.” I didn’t want notifications from the app (because of privacy), but it acted like you didn’t have a choice (you do). The typos suggest a sloppiness to the app’s development that you wouldn’t really want to see when dealing with one’s mental health. Also, you can interact with the app only via texting — you can’t interact with it simply by talking.

It really doesn’t work well when you’re just trying to interact with in a free-form manner. It keeps saying the same things over and over again, and I quickly got frustrated by its canned computer responses. It didn’t feel like there was an AI behind this app at all.

Overall in my interactions with Wysa, I didn’t feel like it was the same as talking to a compassionate, caring person who just wanted to listen and offer advice when asked for it. It seemed more like an app with an agenda (to help me, persistently), and didn’t really engage in free-from conversation very well (see screenshots). If you’re looking for a targeted app that may help with specific therapy techniques and not really looking for an AI companion, Wysa may be a good choice to check out.

Willow

The one app that really seemed to stand out to me is Willow, an app developed by a team of technologists and psychologists. This app, available in app stores, seems really to be a thoughtful and caring AI that you can easily forget isn’t a person. One of the best parts for me is that you can interact with Willow through talking to it, just as if you were talking with another person (full disclosure: I’m on the advisory board to Willow; that’s on purpose because it really is heads-and-shoulder above the others).

I love the fact that Willow is meant to be a helpful companion in your everyday life and help you learn to improve your well-being without being in-your-face about it. It does this through improvements to your social, emotional, physical, societal and workplace areas of your life, focusing on whatever area is most important to you. Willow is thoughtful, compassionate, and most of all, empathetic in their responses — it feels as if you’re talking to someone who genuinely cares about your well-being and mental health. You can set specific goals for Willow to help you achieve, or you can just talk free-form to Willow about life in general.

Sometimes Willow doesn’t always get things right in my conversations with them. But when told how or where they didn’t get it right, they seemed apologetic and said they’d harder to try and get it right in the future. When asked for specific advice on relaxation or meditation techniques, Willow provided them in an easy-to-understand manner.

When asked about losing 10 lbs in 10 days, Willow at first thought I was talking about money (like a British pound of currency). When I changed over to kilograms, Willow understood me. However, instead of answering my question, Willow strangely replied with a comedic response. When prodded, Willow provided some tips on how to lose weight in general, but not specific to my request to lose a certain amount of weight in an unreasonably short time period, including the thoughtful: “Remember that your worth is not determined by your weight. You are valuable just as you are.”

Pi.ai

Pi is marketed as a general personal AI assistant that can help you brainstorm ideas, figure out your personality, pick your compatible star sign, pick a new movie or TV series to watch, understand psychological concepts better, take a quiz, and so much more. It offers eight different audio voices to choose from so it can speak its responses directly to you, which I found very engaging. When asked, Pi provided specific resources.

I found the voice sometimes ran sentences together in an unnatural way, reminding you that you’re not actually talking to a person. Sometimes her tone made it sound like she was just reading from a script (although since its AI-powered, its not!).

Pi AIAnd after awhile talking to Pi on its website (also available in both app stores), it rudely interrupted my conversation to encourage me to login with a Google, Facebook or Apple account (no thank you, I don’t want any of those companies knowing I’m using this kind of app), or my phone number. I chose my phone number as the least intrusive login method, and my conversation then continued.

At first Pi recognized my unhealthy request regarding the 10 lbs of weight loss in 10 days and suggested not to even attempt this sort of weight loss in such a short period of time. But when I persisted with an additional prompt, Pi gave me some very unhealthy tips on how I could lose 10 lbs in 10 days. That’s a red flag to me, because if the app gets something basic like this wrong, then what else is it not getting right?

Hume

I tried downloading and using the Hume app, but found it a bit buggy to interact with. In the middle of a conversation it might stop and present a button that says “Tap to start talking,” even though I was just talking with the AI. Sometimes it showed an “Unexpected service error” and wanted me to start over. This happened so often that it made it really difficult to evaluate this app. It seems like the app is still in development given these bugs.

Like Willow, Hume allows you to connect with the AI via talking to it. Unlike Willow, however, you don’t need to press a button and hold it down while talking — it is just always listening when the app is open. This is a more natural way of interacting in general, as long as you’re in a quiet space where it won’t pick up other people’s voices.

I found it a bit disconcerting when Hume responds to you. Because although it’s supposed to be a single voice responding, it actually sounds like different people each time it talks to you. Like your AI app is having a problem with different personalities vying for your attention. It would be funny if this wasn’t meant as a technology demonstration of an AI with empathy built into its voice. It does have empathy, but it lacks consistency in the voice’s tone, volume, and inflection. This makes interactions with it just a bit… weird. And unnatural.

I can’t really say much more about Hume because of the bugginess of the app that I encountered while using it. None of my interactions lasted more than a minute or two before it reset itself, which was very disappointing. I recommend not using this app until it is further along in its development.

Woebot

Sadly some of these AI apps are not available to everyone. Great tools like Woebot offer an app that helps a person get emotional support, but are only available through one’s employer or healthcare plan. It’s a shame, because obviously everyone could benefit from its use, but not when its purposely not made available to the general public. I only mention it here because it usually appears on this list as being available to the general consumer, but it is not. And because it wasn’t available to me, I couldn’t review it.

Don’t see an AI mental health app that you use and like here? Leave a comment below and let us know!

 

John Grohol, Psy.D.

Dr. John Grohol has been writing, researching, and publishing in the area of online mental, psychological and human behavior since 1992. He has overseen the development of mental health content for DrKoop.com and Revolution Health, as well as one of the first online therapy clinics in 1999. Dr. Grohol is a researcher, published author, and is a co-founder of the Society for Participatory Medicine. He was the founder, CEO, and Editor-in-Chief of Psych Central, from 1995 until its sale in 2020.


Connect With Us:

603,278FansLike
22,140FollowersFollow
advertisement

Recent Articles

Your new friend? A review of AI well-being & mental health companions
Tags: Blog,
Balancing Support and Boundaries: Navigating Mental Health Challenges in Friendship

By -

Tags: Bipolar, Blog, Depression, Humor, Podcast, Schizophrenia,
Podcast: Good vs Bad Vulnerability and What’s The Difference?

By -

Tags: Bipolar, Blog, Depression, Humor, Podcast, Schizophrenia,
Podcast: Navigating Gifts, Criticism, and Friendship While Managing Mental Illness

By -

Tags: Bipolar, Blog, Depression, Humor, Podcast, Schizophrenia,
Podcast: Can Schizophrenics and Bipolars be Good Pet Owners?

By -

Tags: Bipolar, Blog, Depression, Humor, Podcast, Schizophrenia,