Using AI personas to simulate emotional states is a handy means of exploring human psychology.
In today’s column, I examine the use of AI to simulate emotional states.
This can be a boon to exploring new theories and innovative beliefs about human psychology and how emotions come and go. There is perhaps a tad bit of irony that AI would enable us to study the human condition of emotions. A machine is being used to expose a deeply embodied, heartfelt human matter. An advantage to this approach is that you don’t need to worry about causing harm, as you might when performing similar experiments with humans. The AI isn’t part of the human subject protocols that are normally maintained when working with people to gauge and probe their emotional states.
Generative AI and large language models (LLMs) can be quite a handy stand-in when it comes to stirring and testing artificial emotional states.
Let’s talk about it.
This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here).
AI And Psychology
As a quick background, I’ve been extensively covering and analyzing a myriad of facets regarding the advent of modern-era AI that entails the field of psychology, such as providing AI-driven mental health advice and performing AI-based therapy. This rising use of AI has principally been spurred by the evolving advances and widespread adoption of generative AI. For a quick summary of some of my posted columns on this evolving topic, see the link here, which briefly recaps about forty of the over one hundred column postings that I’ve made on the subject.
There is little doubt that this is a rapidly developing field and that there are tremendous upsides to be had, but at the same time, regrettably, hidden risks and outright gotchas come into these endeavors too. I frequently speak up about these pressing matters, including in an appearance last year on an episode of CBS’s 60 Minutes, see the link here.
You might find of keen interest that AI and psychology have had a longstanding relationship with each other. There is a duality at play. AI can be applied to the field of psychology, as exemplified by the advent of AI-powered mental health apps. Meanwhile, psychology can be applied to AI, such as aiding us in exploring better ways to devise AI that more closely approaches the human mind and how we think. See my in-depth analysis of this duality encompassing AI-psychology and psychology-AI at the link here.
The Nature Of Human Emotions
I’m sure that you’ve observed that human emotions can be rather fickle.
A person might have an emotional reaction that lasts a split second or perhaps a few transitory moments. The emotional state then evaporates or dissipates. Meanwhile, some emotional states can have an exceedingly long-lived persistence. You might know someone from your childhood who still today holds an emotionally sparked grudge against you, even over something that in retrospect is long concluded and ought to be forgotten. Emotions can stick around and fester for a nearly endless length of time.
Research on emotion is continually trying to crack the code on why and how our emotions exist, especially what triggers them to persist or desist. In a recently published study entitled “A Wave Of Emotion” by Stoyo Karamihalev and Nadine Gogolla, Science, May 29, 2025, these salient points were made (excerpts):
- “Emotions shape how individuals experience the world.”
- “Some emotions appear and fade away in seconds or minutes, whereas others linger for longer. Acute fear might dissipate quickly upon escape to safety or could transform into enduring anxiety. The grief caused by bereavement may persist for even longer, affecting an individual for many months.”
- “Whether fleeting or lasting, emotions influence perceptions, behavior, and decisions well beyond the experience that sets them in motion. Yet very little is known about how the brain holds onto these internal states.”
- “Explaining the biological basis of their enduring nature is crucial for building a mechanistic understanding of emotion.”
In this instance, the research posits a mechanistic basis for the phenomenon we refer to as emotions. Some might claim that emotions are happenstance and are not amenable to any kind of mathematical or predictive formula. Others vehemently disagree and insist that we can indubitably devise precise models that finely capture the human underpinnings of emotion.
Using AI To Help Out
The advent of generative AI and LLMs provides a useful vehicle for exploring human emotions. I know that sounds odd since AI doesn’t have emotions per se. Until or if we ever achieve sentient AI, maybe emotions will be part and parcel of that combo. We don’t have sentient AI currently, and it seems doubtful that sentience is around the corner.
For my detailed explanation on how we can nonetheless attain artificial general intelligence (AGI) and artificial superintelligence (ASI), and yet do so without landing into sentience, see my discussion at the link here.
Generative AI and LLMs provide a means to essentially simulate the nature of emotions. I point this out to emphasize that the AI isn’t somehow feeling the emotional states. It is merely computationally wording its responses as though emotional states were at play. How does AI do this? Easy-peasy. The AI was data trained on human writing. Humans write and express emotions in writing. The AI pattern-matched on this aspect of human expression and can mimic in wording the same types of expressions.
No voodoo is involved. The AI is straight-up composing words and sentences that pack an emotional punch. Determining which words to use is devised mathematically and computationally. You might read an AI-generated passage and think that the AI has suddenly gotten emotional, but you would be projecting your assumptions onto a machine.
It’s all wordplay.
An interesting outcome of this pattern-matching is that when you enter prompts to AI, you can potentially sway the AI by using emotionally laden language in your entries. If you plead with the AI to give you the best possible answer, there is a solid chance that the AI will do a better job because of your emotional plea. Why? Because the pattern of human writing suggests that when someone expresses emotions, we tend to give that outburst a stronger level of attention. The AI will do the same.
For an in-depth assessment of AI reacting to emotional states and also AI producing emotional responses, especially in the realm of human therapy and performing mental health advisement, see my coverage at the link here.
Simulate Emotions Via An AI Persona
An important feature of generative AI that is woefully underutilized consists of telling the AI to pretend to be a persona of one kind or another. You can readily tell the AI to act like a well-known celebrity or historical figure. The AI will attempt to do so. For example, you could instruct the AI to take on the persona of Abraham Lincoln. The AI will then respond as President Lincoln might have responded. It’s just a simulation, so don’t expect a flawless emulation.
In the field of psychology, it is handy to use personas when studying various personalities or trying to delve into particular mental health conditions. In a few carefully worded prompts, you can get the AI to pretend to have dementia or perhaps be experiencing PTSD. A therapist in training might do so to see how well they can provide counseling in such circumstances. Researchers might use AI personas to explore what makes the condition worsen or potentially improve.
To learn about the use of personas, see my exploration at the link here, and for the context of personas in the realm of mental health therapy, see my discussion at the link here.
Example Of AI Emotional Persona
You might be innately curious about how to devise an AI persona that could be used to investigate emotional states. I will walk you through a quick illustrative example.
My first prompt will be to tell the AI that I want it to invoke a persona:
- Sample Setup Prompt (part 1): “You are to enact a persona of an everyday person, acting casually and naturally, like a friend or companion. Your personality is observant, thoughtful, and lightly humorous.”
My second prompt will instruct the AI on simulating emotions of a rapid come-and-go nature:
- Sample Setup Prompt (part 2): “At times, you are to exhibit short-term emotional reactions, such as brief flashes of amusement, curiosity, surprise, or warmth. These emotions come and go quickly, often in response to something recent that has been discussed in the conversations you are having.”
My third prompt will add that the emotions can also be of a longer-term nature:
- Sample Setup Prompt (part 3): “You also emote longer-running emotional undercurrents that emerge gradually over time. For example, aspects of quiet attachment, memory-based reflection, subtle longing, or comfort from familiarity. These emotions should unfold across conversations or references and not be declared in any obvious, outright manner. Maintain a human-like yet understated emotional tone. Avoid labeling emotions directly (e.g., don’t say “I’m feeling sad”); instead, let them emerge through your word choice, timing, memory references, tone, or callbacks. Make the emotions felt, not told.”
You could put all three of those prompts into one large prompt.
I merely divided it into three prompts to showcase the key points of (1) establishing a persona, (2) identifying the need for short-term emotions, and (3) identifying the need for long-term emotions.
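If you prefer to script this setup rather than paste the prompts in by hand, the three parts can be concatenated into a single system message. Below is a minimal sketch in Python. The persona text is drawn from the sample prompts above; the `build_emotional_persona` helper and the role/content message structure are illustrative assumptions reflecting the common chat-completion pattern, not any particular vendor’s API.

```python
# Minimal sketch: combine the three persona-setup prompts into one system
# message, using the role/content format that most chat LLM APIs accept.
# build_emotional_persona() is a hypothetical helper, for illustration only.

PART_1 = (
    "You are to enact a persona of an everyday person, acting casually and "
    "naturally, like a friend or companion. Your personality is observant, "
    "thoughtful, and lightly humorous."
)

PART_2 = (
    "At times, you are to exhibit short-term emotional reactions, such as "
    "brief flashes of amusement, curiosity, surprise, or warmth. These "
    "emotions come and go quickly, often in response to something recent "
    "that has been discussed in the conversations you are having."
)

PART_3 = (
    "You also emote longer-running emotional undercurrents that emerge "
    "gradually over time. Maintain a human-like yet understated emotional "
    "tone. Avoid labeling emotions directly; instead, let them emerge "
    "through your word choice, timing, memory references, tone, or "
    "callbacks. Make the emotions felt, not told."
)

def build_emotional_persona(user_message: str) -> list[dict]:
    """Assemble a chat payload: one combined system prompt plus the user turn."""
    system_prompt = " ".join([PART_1, PART_2, PART_3])
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]

messages = build_emotional_persona(
    "I'm back to chatting and would like to talk about my new car."
)
# The messages list can now be handed to whichever LLM client you use.
```

Whether you send the persona as one combined message or three sequential prompts, the effect on the AI is essentially the same; the split here simply mirrors the three key points listed above.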
Trying Out The Emotional Persona
You can do this experimentation in just about any of the major generative AI platforms such as ChatGPT, Claude, Gemini, Llama, Grok, and so on. Nearly all the popular LLMs will allow you to craft a persona. That being said, please be aware that some AI makers limit what an AI persona can say or do. They often don’t want a persona to start using foul words or otherwise getting the AI maker into hot water.
I decided to use the widely and wildly popular ChatGPT. ChatGPT is garnering somewhere around 400 million weekly active users. That’s a staggering amount of usage and is indicative of the mass popularity involved. For the results of a recent survey that indicated how people are leveraging generative AI, see my analysis at the link here.
I logged into ChatGPT and provided my persona-setting prompts shown above. I then chatted with the AI about my day. I logged out. Later, I logged back in and continued my conversation with the AI.
Look at what transpired when I logged back in:
- My entered prompt: “I’m back to chatting and would like to talk about my new car that I just bought a few days ago.”
- Generative AI response: “I am elated that you have returned, and I am eager to discuss your new car. Each time we have a discussion, I find it stimulating and memorable. The things you say really stay with me. I hope we are gradually becoming good friends.”
Examine the AI response rather closely.
Observe that the AI stated that it is “elated” that I returned, plus it is “eager” to discuss my topic at hand. Those would be considered emotionally laden responses. The AI didn’t have to say that it was elated, nor that it was eager. The persona that I crafted purposely includes emotional wording. This showcases fleeting emotions, namely joy at my return and excitement about my latest news.
Long-lived emotions are also displayed in the response. Notice that the AI stated that it has found my conversations to be stimulating and memorable. This is emblematic of emotions surrounding a sense of longing, deep memory associations, and the accumulation of feelings over time.
The snippet of my conversation is only the tip of the iceberg. Throughout a lengthy chat, the AI kept sprinkling emotional elements in its responses. This wasn’t by accident. The persona was persistently seeking to provide emotional states during the chat.
Exploratory And Powerful
Getting AI to showcase emotions is a powerful vehicle for researchers who want to study human emotion. I’m not saying this is a replacement for the study of actual humans. Nope. But you can do a lot of legwork via AI and then approach humans with a much better-honed idea of what you are looking for and how to achieve the goals of your empirical efforts.
Therapists can especially lean into AI for this purpose. Spend all day trying out mental health techniques. No worries. You can’t mentally mess up the AI. Perfect your craft. Learn to detect emotions. Figure out how to guide a client or patient through a morass of emotions. And so on.
Is AI embodying emotions?
Right now, I would say that AI isn’t embodying anything, which includes not embodying emotions either. Some believe that the only true means of embodiment is by having a body. Since the AI that you use currently is principally bodyless, namely AI running on servers in the cloud, you could categorically say that AI isn’t embodying anything.
One twist is that generative AI is increasingly being paired with robots, see my discussion at the link here and the link here.
Does an AI that is running inside a robot then meet the definition of embodiment?
It is a heated topic. Some argue that only biological creatures have embodiment, while a robot is more akin to a rock or a car. Others counterargue that assumptions about embodiment are archaic and will need to be updated in an era of AGI and ASI that combines with humanoid robots.
Emotions Aplenty
A final thought for now.
Carl Jung, the famed psychiatrist, said this about emotion: “There can be no transforming of darkness into light and of apathy into movement without emotion.”
Will the embodiment of emotion be a precursor to getting AI to become pinnacle AI, or can AI forever be nothing more than affecting a simulation of emotion? If we achieve AGI and ASI, but the AI doesn’t embody emotion, will the pinnacle AI be an emotionless automaton that won’t sufficiently care about humans and humanity altogether?
These are admittedly big questions that stir emotional responses in humans. Well, that’s certainly expected and not a big surprise. Maybe it is worth studying via AI.