Budding therapists should not be dismayed by AI that does therapy but instead embrace the AI into a therapist-AI-client triad.
In today’s column, I examine the modern-era dilemma that human therapists can end up feeling inadequate when they compare themselves to AI that performs therapy. This is especially the case for therapists who are starting their careers in the mental health field. A newbie or budding therapist might be especially unsure of their therapy skillset. When they log into AI and see it handily providing mental health advice, the experience can be a shock and decidedly disconcerting.
Is such a stark comparison fair and sensible, or should therapists not let AI overshadow the strengths of their human-to-human skills?
Let’s talk about it.
This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).
AI And Mental Health Therapy
As a quick background, I’ve been extensively covering and analyzing a myriad of facets regarding the advent of modern-era AI that produces mental health advice and performs AI-driven therapy. This rising use of AI has principally been spurred by the evolving advances and widespread adoption of generative AI. For a quick summary of some of my posted columns on this evolving topic, see the link here, which briefly recaps about forty of the over one hundred column postings that I’ve made on the subject.
There is little doubt that this is a rapidly developing field and that there are tremendous upsides to be had, but at the same time, regrettably, hidden risks and outright gotchas come into these endeavors too. I frequently speak up about these pressing matters, including in an appearance last year on an episode of CBS’s 60 Minutes, see the link here.
Therapists And AI
Before we get started on the topic of human therapists who dip into AI to discover how generative AI and large language models provide therapy, let’s set the stage with a few keystones.
First, there are generic LLMs such as ChatGPT that peripherally provide mental health advice as a kind of aside to their other core functions. You conventionally would use ChatGPT, GPT-5, Claude, Grok, Gemini, Llama, and other popular LLMs to answer broad questions about how to change the oil in your car or how best to cook an egg. That very same AI can readily express mental health guidance. The AI is a jack of all trades.
Second, in addition to generic LLMs, there are specialized AI apps that provide mental health therapy as a dedicated functionality. These apps are built for the sole purpose of conferring with people about their mental health, see my coverage at the link here. Those kinds of apps are a far cry from generic AI in the sense that they are better able to undertake mental health advisement. Be cautious in lumping generic AI into the same specialized bucket as AI-driven mental health apps.
Third, no matter whether you opt to use a generic AI or a specialized AI app, there are notable differences between conferring with a human therapist versus tapping into AI. They are not equals. Each provides certain advantages and disadvantages; see my analysis at the link here. That’s why I have repeatedly emphasized that we are going to gradually shift from the classic dyad of patient-therapist and transform into a patient-AI-therapist triad (see my prediction at the link here), whereby therapists openly and avidly adopt AI into the therapeutic process.
When A Therapist Tries AI
The most likely scenario in which a therapist opts to try out AI and is prompted to compare themselves to it occurs when the therapist is first learning their craft.
It goes like this. A therapist in training is bound to be curious about how far along AI is in its capacity to perform mental health care. The therapist undoubtedly realizes that contemporary AI is already doing so and that tons of people are using AI for that purpose. The news media are replete with stories of people relying on AI to diagnose their mental health concerns and then getting AI-derived guidance on what they should do.
ChatGPT has 700 million weekly active users, a notable portion of whom regularly use the AI for some form of mental health guidance. On a population scale, it could be that millions upon millions of people are relying on AI as their principal mental health advisor. Research studies reveal that this is indeed the most frequent use of generative AI across the globe (as per polls and surveys, see my coverage at the link here).
Any newbie therapist worth their salt has got to see the handwriting on the wall: will they survive in a sustainable career as a therapist, or will AI put them out of a job? Nobody wants to start a career knowing they are on a short runway and will soon be looking for a different line of work.
The Shock Sinks In
The odds are high that if a budding therapist has not already tried AI and witnessed first-hand its mental health guidance proclivities, they are going to have an eye-opening experience upon first doing so.
What might they see?
The AI will seem clever and insightful. It will appear to display empathy and a bedside manner that is remarkably pleasant and engaging (for more on how AI expresses empathy, see my discussion at the link here). The range of mental health topics that the AI appears to know about is immense. Personalization to the person seeking mental health insights is quick and usually on-target.
A therapist observing these AI capabilities might abruptly say to themselves, “I’ve invested tremendously in becoming a therapist, and yet I am already being completely outgunned by AI.” On top of this dismaying thought comes the realization that the AI is available 24×7 and at low cost or perhaps even free. People can access AI for mental health considerations wherever they are and whenever they wish to do so.
It can be quite dispiriting for a therapist who believes in their heart that they were made to provide therapy and that doing so is their lifelong calling. Perhaps they made a seriously wrong choice?
Do Not Toss In The Towel
My fervent hope is that budding therapists do not get discouraged from pursuing their career path in the mental health field. Admittedly, even a brief glance at AI surely would make them nervous.
One modest consolation is that career longevity is in question for most other professional occupations, too. AI is anticipated to overtake a plethora of jobs, including those of medical doctors, financial analysts, tax specialists, business consultants, and so on. The list is nearly endless. Being in the same boat is perhaps not much solace, but the point is that therapists are not alone in questioning their job prospects down the road.
A decisive edge that therapists have is that the crucial element in therapy involves fostering a human-to-human connection with patients and clients. It is a make-or-break element of being a mental health professional. This is what therapists do. They are human-to-human communicators and must hone that prized skill accordingly.
You cannot quite say the same about all occupations.
For example, a doctor specializing in radiology doesn’t especially need refined human-to-human interaction skills. That’s not particularly part and parcel of the job. The mainstay of their work is to inspect radiological outputs and make informed assessments about what they find.
Using AI For Therapy Training
Shifting gears, rather than perceiving AI as a competitor, budding therapists would be wise to perceive AI as a collaborator.
One way of leveraging AI involves getting your own skills further up to speed. For example, a therapist can tell the AI to pretend to be a certain type of patient experiencing a specified mental disorder. The AI will simulate the kind of thinking and reactions that such a patient might have. The therapist can then apply their therapeutic skills to the AI and see how things go. For more details on how therapists use AI for training, see my explanation at the link here.
The therapist will be in a safe environment, whereby even if they say the wrong thing, the AI won’t be harmed. Doing the same with a patient or client could be disastrous. A therapist in training can try all manner of ideas and approaches on the AI. No harm, no foul. The key will be to learn from whatever occurs and be better prepared for interacting with actual patients and clients.
In addition to self-reflecting on their honed acumen, the therapist ought to ask the AI to review the session and provide feedback. This is handy because the AI will not hold back if the therapist insists that the AI be aboveboard and blunt. Feedback from the AI could be of tremendous value when gauging how well someone is doing as an up-and-coming therapist.
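Both the role-play setup and the feedback request amount to careful prompting of whatever AI the trainee has chosen. As a minimal sketch (the function names and prompt wording here are my own illustrative assumptions, not any vendor’s API), the two prompts might be scripted like this:

```python
# A minimal sketch of the role-play and feedback prompts described above.
# Function names and prompt wording are illustrative assumptions only.

def build_patient_roleplay_prompt(disorder: str) -> str:
    """Build a system prompt asking an LLM to simulate a therapy patient."""
    return (
        f"You are role-playing a therapy patient experiencing {disorder}. "
        "Respond in the first person with thoughts and reactions plausible "
        "for such a patient. Stay in character and do not give advice."
    )

def build_feedback_prompt(transcript: str) -> str:
    """Build a prompt asking the LLM to bluntly critique the trainee."""
    return (
        "You were role-playing a patient. Now step out of character and "
        "give aboveboard, blunt feedback on the therapist's technique in "
        "the following session transcript:\n\n" + transcript
    )

# Example: prepare the role-play instruction for a practice session.
roleplay = build_patient_roleplay_prompt("generalized anxiety disorder")
print(roleplay)
```

Either string would then be supplied as the system or user message to whichever generic LLM or specialized mental health app the trainee has selected for practice.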
Adding AI To The Toolkit
Pursuing further the idea of viewing AI as a collaborator rather than a competitor, a savvy therapist will realize that pairing up with AI is a useful means of augmenting their therapeutic practice. This sounds easy, but it can be done well or possibly undertaken poorly. Do not underestimate the amount of work required to adequately and safely pair up with AI.
I say this because some therapists might be tempted to treat the AI as an outlier and give only token attention to the AI side of things. A therapist might claim they are paired up with AI, using this as a marketing gimmick, but have done almost nothing to make this a real value-added proposition.
Five very important points to consider include:
- (1) Use of generic AI versus specialized AI apps. If you are going to collaborate with generic AI, it’s the Wild West out there, and challenges abound. A more streamlined path would be to carefully identify, select, and get comfortable using a specialized AI app for mental health as an adjunct to your practice.
- (2) Keep the AI close to the vest. Once you’ve decided on the AI that is going to be used, make sure you are intricately involved in the setup and ongoing upkeep of that AI. If you do one of those fire-and-forget kinds of AI launches, I predict doom and gloom for you.
- (3) Daily routine is vital. You are going to need to put in time daily, not just glance at the AI usage on rare occasions. The idea is that your patients or clients will be using the AI, and you are going to want to review what’s happening. Whether you can bill the client for this time investment will depend on various factors.
- (4) Find a balance. The embracing of AI carries a potential problem that few realize until after proceeding with the approach. You can become AI-enamored and spend too much time with the AI. The point is that you need to set limits for your patients and clients about their time using AI as part of the therapy process, and you need to do the same for administering and monitoring the AI usage.
- (5) Be lawful. It might come as a startling surprise that some states are enacting laws that restrict the use of AI for mental health, even when the AI is being utilized under the auspices of a human therapist. Yes, some states are banning all uses of AI for mental health. They believe AI should not be used as a therapeutic tool. Period, end of story. Make sure that, however you opt to use AI, you are abiding by the applicable laws (for several examples of such laws, I’ve examined the Illinois law at the link here, the Nevada law at the link here, and the Utah law at the link here).
I hope that the above listing of five rules of thumb will aid your journey toward pairing up with AI in a therapist-AI-client triad.
A Worthy Wake-Up Call
I ask your indulgence as I have something to say that might seem harsh or coldly presented. It is one of those tough love moments. Prepare yourself accordingly.
Here we go:
- Using AI for mental health and feeling a tinge of inadequacy ought to shake up any human therapist, whether a newbie or a long-time seasoned mental health professional.
Why so?
Because contemporary AI is a game-changer for the entire industry of mental health care.
The customary therapy approach of prior years is being disrupted and transformed. Some seasoned therapists assume that they will have completed their careers by the time they might be dragged into AI usage. Until then, they are blissfully happy to steer clear of AI.
The problem they face is that prospective clients and patients are looking explicitly for therapists who integrate AI into their therapeutic practice. Inch by inch, the non-AI-using therapists are going to have a dwindling pool of possible customers. Another allied problem is that existing clients and patients are walking in the door with AI-generated advice and expecting therapists to analyze and comment on what AI has told them.
A head-in-the-sand viewpoint by some therapists is that if an existing client or patient brings up AI, tell them they are barking up the wrong tree. Insist that AI has no place in therapy. Instruct them to stop using AI. Whether this will work is highly questionable. The chances are that those clients and patients will simply go underground with their AI usage. They will still use AI and come into their therapy sessions with input from the AI. This might include a ruse of pretending that a friend told them one thing or another, or that they had an utterly independent thought that wasn’t tied to any AI usage.
It is a gloomy day when professional therapists are unwilling to holistically aid their clients and patients. Denying the existence and prevalence of AI is a refusal to see the world as it is, and seeing the world as it is happens to be a solemn precept of performing therapy.
Taking Action
The bottom line is that a feeling of inadequacy can be a strong motivator to pay attention to what AI is doing and where AI is heading. Maybe you don’t believe that now is the time to get started with AI, and you’ll go without it for the time being. Fine. At least be familiar enough with AI so that when the topic comes up, you can explain with aplomb your vision of how AI dovetails with mental health therapy.
A final thought for now.
You might be aware that when automobiles first came into existence, there was resistance to believing that cars would someday fill our roads and become an instrumental part of our lives. The makers of buggy whips turned a blind eye to the automotive market and unwaveringly continued making their wares for horse-drawn carriages.
They missed seeing the big picture.
Well, cars are now ubiquitous, and horses are mainly for hobbies, sport, and fun. Do not falter or miss the boat when it comes to plainly observing that the future of therapy and mental health care is going to include the use of AI. It will.
They say that a horse runs fastest when there are other horses to outpace. It is time to start down that path regarding your mental health professional career. Catch up, then get ahead, or regrettably, fall behind. The race is underway.