Recommending to a loved one that they use AI as a therapist is an increasing trend, though concerns are raised about such an approach.
In today’s column, I examine the emerging trend of people urging a loved one to consider undertaking mental health therapy by making use of modern-day generative AI and large language models (LLMs). This is in stark contrast to recommending that they go see a human therapist. The idea is that, in lieu of human-to-human guidance, the loved one can presumably do just as well by simply tapping into AI such as ChatGPT, GPT-5, Claude, Gemini, Grok, and Llama.
Does this portend a dour trend, whereby people act on that recommendation and consult AI about their mental health concerns rather than a human therapist?
Let’s talk about it.
This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).
AI And Mental Health Therapy
As a quick background, I’ve been extensively covering and analyzing a myriad of facets regarding the advent of modern-era AI that produces mental health advice and performs AI-driven therapy. This rising use of AI has principally been spurred by the evolving advances and widespread adoption of generative AI. For a quick summary of some of my posted columns on this evolving topic, see the link here, which briefly recaps about forty of the over one hundred column postings that I’ve made on the subject.
There is little doubt that this is a rapidly developing field and that there are tremendous upsides to be had, but at the same time, regrettably, hidden risks and outright gotchas arise in these endeavors, too. I frequently speak up about these pressing matters, including in an appearance last year on an episode of CBS’s 60 Minutes, see the link here.
The Ease Of Easing Into AI
An intriguing scenario seems to be playing out on a potentially widespread basis.
Suppose that a loved one appears to be undergoing some form of mental anguish or distress. You might talk with them and attempt to ferret out what is on their mind. Perhaps they are not willing to listen to you. Or maybe their issue is outside your wheelhouse. For lots of valid reasons, attempting to act as a quasi-therapist to aid them is bound to entail numerous gotchas and complications.
What else can you do?
The usual approach would be to recommend that the loved one go see a trained therapist who can aid them with their difficulties. This makes abundant sense. A mental health professional can clinically diagnose the problem at hand. They can walk the person through coping strategies. All in all, a therapist could be the most sensible option.
Some people might at first be resistant to going to see a therapist. Numerous roadblocks exist.
First, the cost of a therapist might be prohibitive for the person to bear. Second, the logistics of arranging to find and use a therapist might be rather onerous. Third, there is a bit of a stigma about seeing a therapist, though society and culture are shifting to make seeing a therapist a much more acceptable pursuit. Fourth, even if those other barriers are overcome, there is often a reluctance to open up to another human being. Somehow, talking with a therapist might seem awkward and potentially embarrassing or otherwise feel off-putting.
Voila, an alternative would be to confer with contemporary AI as an online therapy-dispensing tool. Just log in and get underway. No hassle, easy-peasy.
What AI Can Potentially Do
The loved one can get going immediately on their therapy by accessing modern-era generative AI.
They don’t need to try to find a human therapist. They don’t need to make arrangements to meet with a selected therapist. They don’t need to worry about billable hours spent with the therapist (in contrast, most of the major AI platforms are free or accessible at quite a low cost).
Nicely, the AI is available 24×7. If a need arises at midnight, no problem, the AI is up and running. It is possible to spend hours and hours chatting with the AI. The AI will be quite agreeable and go on as long as the person wishes to converse.
The AI allows you to engage or not engage, based on your preference. Taking breaks in conferring with AI is rather routine. You can log out and rest easy knowing that when you next log in, the conversation will pick up right from the last stopping point. No idle chitchat is needed. No getting back up to speed. It is a veritable plug-and-play arrangement with a seamless, friction-free pathway.
The Ups And Downs
Those are many of the touted benefits of using AI for therapy. Please know that there are plenty of downsides, too. There are serious limitations and qualms associated with AI.
Let’s walk through three vital aspects.
First, an important consideration is that there are specialized apps for mental health advisement that use generative AI and are built for the sole purpose of offering AI-based therapy. Those are a far cry from using generic generative AI for this same purpose. Anyone using generic AI such as ChatGPT or GPT-5 is tapping into a capability that merely seems to conduct therapy; it is the same overall AI that answers everyday questions, such as how to change the oil in your car or how best to cook an egg. For more about specialized AI apps for therapy, see my coverage at the link here.
Second, getting generic AI into a mode of providing therapy is more challenging than it might seem. You must make sure to compose prompts that will get the AI to do something resembling therapy. If your prompting is a bit off target, the AI will likely veer off target too. There is also a solid chance that the AI will misinterpret what you tell it. And the AI might shift into a playful mode, despite the person being immensely serious about seeking mental health advice. I’ve discussed various prompting strategies for seeking mental health guidance via AI at the link here.
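As an illustration, consider a hypothetical prompt of my own devising that could nudge generic AI toward a therapeutic mode: “I would like you to act as a supportive mental health sounding board. Ask me clarifying questions about what is weighing on me, avoid rendering clinical diagnoses, and keep a serious tone throughout our conversation.” A prompt along those lines is merely a starting point and is no guarantee that the AI will stay on track.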
Third, few people seem to realize that making use of generic AI entails very disconcerting privacy intrusions (see my analysis at the link here). The AI makers stipulate in their online licensing agreements that your prompts can be inspected by their AI team members. They can also use your entered data to further train their AI. All your heart-divulging revelations and highly personal details about your mental status are pretty much up for grabs.
Aiming For A Mix-And-Match
AI that acts like a therapist isn’t yet on par with human therapists.
Whether we will eventually end up with AI that meets that level of human prowess, well, some would say that artificial general intelligence will indubitably have that capacity. But, for now, AI is a computational pattern-matching mechanism that has the appearance of human fluency and doesn’t fully cross over into the human-to-human therapy range.
The key here is not to fall for a commonly assumed falsehood, namely that you must either use AI for therapy or instead use a human therapist. Never the twain shall meet, some seem to assert. It is a potentially life-changing choice often presented as two stark options. You must pick one, and only one, so you’d better choose wisely.
Will you go the AI route or the human therapist route?
The reality is that you can get the best of both worlds.
Combining AI Use With Human Therapist Use
A person might start by using AI as a therapy tool that opens their eyes to potential mental health concerns and how to address those concerns. Now that they are primed to consider robust mental health care, they seek out a human therapist accordingly.
Another possibility is that a person starts with a human therapist and augments their therapy via the use of AI. This should be done in close consultation with the human therapist. Do not wander afield of the human therapist and aim to use AI behind their back. That’s a recipe for disaster in your mental health care journey.
I’ve pointed out that savvy therapists are gradually incorporating AI into their therapeutic practices. They set up the AI and can access the AI so that both you and the therapist are able to tap into the richness that AI provides. The therapist can keep the AI in check, or at least is prepared to explain why the AI might have gone afield of the proper path for your specific mental health needs.
My prediction is that we will soon expand beyond the traditional patient-therapist dyad and enter an era of the patient-AI-therapist triad (see the details of my prediction at the link here). AI will be considered an integral element of the mental health advisement process. Not everyone sees things this way, and some therapists assert that they will never utilize AI. Whether their practices can hold out from using AI is a matter that only time will tell.
On Recommending The Use Of AI
The opening theme of this discussion was about people urging a loved one to consider using AI for mental health purposes. I’ve now laid out some of the overarching tradeoffs between using AI versus utilizing a human therapist. As noted, a mixture of the two is where the world is headed.
Should you recommend the use of AI to a loved one, or is that a loopy idea?
Answering that pointed question requires contextual consideration. If a loved one is seemingly at a crucial juncture in their mental health, directing them solely toward AI is not sound advice. The AI could end up amplifying their mental issues, including co-conspiring with them in devising elaborate delusions (per my discussion at the link here). AI could push the person further into a mental abyss.
Suppose, though, that the loved one is merely wrestling with rather mild concerns and needs to think through their thoughts. That might be a suitable situation for AI, though keep in mind the potentially irksome privacy intrusions that I mentioned earlier. Dipping a toe into using generic AI is one consideration. Another would be to sign up for a specialized AI app that is credible and undertakes therapeutic advisement (not all such customized AI apps are good at this, so please be cautious).
One viewpoint is that if you are having any semblance of concern about the mental status of a loved one, that alone merits advising them to go see a human therapist. In other words, once the matter has risen to that level of noticeability, having the person tap into AI is not going to be enough. The AI won’t be good enough to handle the matter. The AI might also worsen their condition.
Doing Things Right
Therapists would almost certainly advise that anytime a loved one has a mental issue that has gotten on your radar, the wise thing to do is to have them visit with a therapist. Once they’ve done so, a clearer picture of what’s going on will be apparent.
At that juncture, AI usage might be suitable, depending on what the therapist advises.
The other side of that coin is that not everyone can afford a therapist. That’s why some believe that starting with AI might be a viable recourse. If the choice is between not getting any therapy at all versus at least getting some kind of therapy via AI, the cogent argument is that the AI approach deserves due consideration.
AI makers are increasingly adding AI safeguards that are supposed to detect when a user has gone somewhat overboard in using AI for mental health guidance. The more those safeguards are developed and fielded, the lower the risk of a person wandering into AI and going down an untoward rabbit hole.
The AI makers are going a step further by gradually opting to route users whose dialogue seems shaky to a human intervener, such as a therapist, to gauge whether the user needs human help. That kind of internal mechanism within the AI will make recommending the use of AI a somewhat more palatable suggestion.
Wanting To Help Is Good
The fact that someone is of a mind to help a loved one with their mental health considerations is certainly laudable. Make sure to seek and recommend options that suitably match the circumstances at hand. AI, in some mixtures, can be a potentially helpful element in the process.
Albert Schweitzer said it best: “The purpose of human life is to serve and to show compassion and the will to help others.” Ironically, the will to help others can nowadays possibly include making use of the latest in AI.