A global epidemic has been silently brewing for the last few decades: the exponential growth in mental health issues worldwide, which is now attracting significant attention because of its catastrophic consequences.
The Substance Abuse and Mental Health Services Administration (SAMHSA) of the United States Department of Health and Human Services released a groundbreaking report in 2020 highlighting the devastating effects of mental and substance use disorders (M/SUD). SAMHSA’s analysis found that “M/SUD treatment spending from all public and private sources is expected to total $280.5 billion in 2020, which is an increase from $171.7 billion in 2009.” More importantly, mental health issues place significant burdens on patients themselves, create incalculable challenges for their families and care structures, and, tragically, cost numerous lives to intractable illness. Indeed, no amount of money or economic analysis can quantify the physical and emotional toll that mental illness takes.
Earlier this month, U.S. Surgeon General Dr. Vivek Murthy released an advisory report titled “Our Epidemic of Loneliness and Isolation,” highlighting the significant public health concerns caused by mental health issues. He identifies loneliness and lack of social connection as among the foremost concerns, and discusses his journey in recognizing them as problems: “Loneliness is far more than just a bad feeling—it harms both individual and societal health. It is associated with a greater risk of cardiovascular disease, dementia, stroke, depression, anxiety, and premature death. The mortality impact of being socially disconnected is similar to that caused by smoking up to 15 cigarettes a day and even greater than that associated with obesity and physical inactivity. And the harmful consequences of a society that lacks social connection can be felt in our schools, workplaces, and civic organizations, where performance, productivity, and engagement are diminished.”
Fortunately, increased awareness around mental health has introduced significant innovation and investment into new remedies and treatment modalities. One such novel concept is the use of artificial intelligence in the mental health space.
With the advent of generative AI, conversational AI, and natural language processing, the thought of using artificial intelligence systems to provide human companionship has now become mainstream.
Google Cloud, which is at the forefront of developing scalable AI solutions, provides an in-depth analysis of what conversational AI is: “Conversational AI works by using a combination of natural language processing (NLP) and machine learning (ML). Conversational AI systems are trained on large amounts of data, such as text and speech. This data is used to teach the system how to understand and process human language. The system then uses this knowledge to interact with humans in a natural way. It’s constantly learning from its interactions and improving its response quality over time.”
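To make the description above concrete, here is a minimal sketch of that kind of pipeline: a language model pretrained on large text corpora generates a reply conditioned on the conversation so far. It is written in Python and assumes the open-source Hugging Face transformers library and the small public GPT-2 model; it is an illustrative toy, not the architecture of Google Cloud’s or any other production conversational AI system.

```python
# Minimal illustrative conversational loop built on a pretrained language model.
# Assumes the Hugging Face "transformers" library and the small public GPT-2 model;
# real conversational AI systems use far larger models, safety filtering, and
# ongoing learning from user interactions.
from transformers import pipeline

# Load a text-generation pipeline backed by a model pretrained on large text corpora.
generator = pipeline("text-generation", model="gpt2")

def respond(user_message: str, history: list[str]) -> str:
    """Generate a reply conditioned on the conversation so far."""
    prompt = "\n".join(history + [f"User: {user_message}", "Assistant:"])
    output = generator(prompt, max_new_tokens=50, do_sample=True, top_p=0.9)
    # The pipeline returns the prompt plus the continuation; keep only the new text.
    reply = output[0]["generated_text"][len(prompt):].strip()
    history.extend([f"User: {user_message}", f"Assistant: {reply}"])
    return reply

if __name__ == "__main__":
    history: list[str] = []
    print(respond("I've been feeling a bit lonely lately.", history))
```

The point of the sketch is simply that the “understanding” Google Cloud describes comes from statistical patterns learned during training, which is why the quality of the data and the feedback loop matters so much.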
This means that with enough data, training, and interaction, it is plausible that these systems will not only replicate human language, but eventually draw on billions of data points and evidence-based guidelines to provide medical advice and therapy. Undoubtedly, companies such as Google, Amazon, and Microsoft are investing billions of dollars in this very technology, recognizing that they are just steps away from replicating human language and conversation. Once these companies perfect this, the potential is nearly unlimited: everything from customer service to companionship and human relationships could become AI-driven.
In fact, there are already trial systems that exist. Take for example Pi, a personal artificial intelligence system developed by the company Inflection AI. Pi “was created to give people a new way to express themselves, share their curiosities, explore new ideas, and experience a trusted personal AI.” Mustafa Suleyman, CEO and Co-Founder of Inflection AI, explains: “Pi is a new kind of AI, one that isn’t just smart but also has good EQ. We think of Pi as a digital companion on hand whenever you want to learn something new, when you need a sounding board to talk through a tricky moment in your day, or just pass the time with a curious and kind counterpart.” Alongside Suleyman, the other co-founder of Inflection AI is Reid Hoffman, who also co-founded the professional networking company LinkedIn. Inflection AI has raised hundreds of millions of dollars in seed funding to support its technology.
However, this incredible technology brings with it many potential concerns. While artificial intelligence certainly has the capability to address access inequities, deliver healthcare services conveniently, and even provide companionship to those who most need it, it has to be developed with guardrails in place, for numerous reasons.
For one, in a realm as sensitive as mental health, patient privacy and data security must be of the utmost importance. Using AI technology in this capacity means that a significant amount of sensitive patient information will be collected. Developers must ensure that this data is never compromised and that patient privacy is always the top priority, especially amid a landscape of growing cybersecurity threats. A simple example of what such a guardrail could look like appears below.
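The following sketch is one hypothetical illustration of privacy-minded pre-processing, not any vendor’s actual implementation: a small helper that redacts obvious identifiers such as email addresses and phone numbers from a user’s message before it is stored or forwarded to a model backend. Real systems would pair something like this with encryption, access controls, and regulatory compliance review.

```python
import re

# Hypothetical pre-processing step: strip obvious personal identifiers from a
# user's message before it is logged or forwarded to a conversational model.
# This is only one small layer; production systems would also rely on
# encryption in transit and at rest, access controls, and compliance review.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b(?:\+?\d{1,2}[\s.-]?)?(?:\(?\d{3}\)?[\s.-]?)\d{3}[\s.-]?\d{4}\b")

def redact_identifiers(message: str) -> str:
    """Replace emails and phone numbers with placeholder tokens."""
    message = EMAIL_RE.sub("[EMAIL]", message)
    message = PHONE_RE.sub("[PHONE]", message)
    return message

print(redact_identifiers("Call me at 415-555-0123 or write to jane.doe@example.com"))
# -> "Call me at [PHONE] or write to [EMAIL]"
```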
Moreover, perhaps the most important concern is an existential one: how far should humanity go with this? While the benefits of AI are certainly numerous, innovators have to be cautious about the limitations of these systems. Notably, these systems are only as good as the models and datasets they learn from, which means that in the wrong hands they could very easily provide incorrect or dangerous recommendations to vulnerable populations. Hence, corporations must enforce strict practices around responsible development.
Finally, as a general social commentary, combating mental health issues and a loneliness epidemic by using artificial intelligence systems sets a dangerous precedent. No system can (yet) replicate the intricacies of human nature, interaction, emotion, and feeling. Healthcare leaders, regulators, and innovators must remember this underlying tenet, and should prioritize viable and sustainable measures to resolve the mental health crisis, such as training more mental health professionals and increasing patient access to care.
Ultimately, whatever the solution may be, the time to act is now, before this epidemic becomes too catastrophic to manage.
Source: https://www.forbes.com/sites/saibala/2023/05/17/can-artificial-intelligence-solve-the-growing-mental-health-crisis/