(TDOC), (CVS), (AAPL), (META), (AMZN), (GOOGL), (MSFT)
Over the last handful of years, a tidal wave of somewhere between 10,000 and 20,000 apps has flooded into the mental health arena, vowing to turn talk therapy on its head.
With the buzz around artificial intelligence (AI) advancements such as ChatGPT, the notion that chatbots could step up as mental health caregivers doesn't seem as far-fetched as it once did.
Besides, let's face it: modern society is riddled with mental health concerns, and rates of mental illness are climbing at an alarming pace.
Even before the world spun into the COVID frenzy, an alarming number of individuals were already battling anxiety and depression - roughly one billion people globally.
But then, along came the pandemic, acting like gasoline on a roaring fire, escalating that figure by an astonishing 27%. Makes you sit up straight, doesn't it?
Just like a cruel game of seesaw, as mental health issues rise, treatment accessibility sinks - especially for those of lower income. Why? You guessed it - costs. Money becomes the immovable barrier, the daunting gatekeeper to mental health care.
In fact, out of the 1 billion people reported to be suffering from mental health issues pre-pandemic, a staggering 82% resided in lower to middle-income nations, as per the World Health Organization.
Now here’s where it gets interesting.
Mental health tech startups saw dollar signs, raking in a remarkable $1.6 billion in venture capital by December 2020. Even bigger companies like Teladoc (TDOC) and CVS Health (CVS) are considering joining the fray.
These numbers have soared thanks to COVID-19's unwelcome spotlight on mental health. The reason is as clear as day: mounting pandemic pressures resulted in millions more Americans seeking mental health treatment, yet supply couldn’t keep pace with demand.
Over half of all counties in the US face a dire psychiatrist shortage. With the Affordable Care Act enforcing equal coverage for mental and physical health by insurers, the yawning chasm between supply and demand is gaping wide open.
So, what does this mean for entrepreneurs? In one word: opportunity.
The South by Southwest conference in March turned into a kind of entrepreneurial holy land, with health startups parading their innovative products and an almost evangelical belief that AI is the golden key to revolutionizing health care.
Their vision? A world teeming with apps and machines capable of diagnosing and treating a myriad of illnesses, replacing the need for human doctors and nurses.
An excellent example is Wysa, the mental wellness app that's been gaining popularity since its launch. It works like your pocket therapist, only free, with a charming and empathetic chatbot at your service round the clock. If you fancy human interaction, they do offer teletherapy services that could cost you $15 to $30 a week. And the best part? Your insurance might just cover it.
The appeal is undeniable: insurance firms could turn to the realm of chatbots and applications to adhere to mental health parity requirements.
It's a tantalizing prospect - a cheap and simple path, especially when juxtaposed against the complex task of maintaining a roster of human therapists, many of whom turn up their noses at insurance, deeming the remuneration insufficient.
Let's take a moment to sift through the research - the good, the bad, and the hopeful.
The digital age has sparked a resurgent curiosity in the realm of chatty tech buddies. I'm talking about conversational agents. Think along the lines of Apple's (AAPL) Siri, Facebook's (META) M, Amazon's (AMZN) Alexa, and the ever-familiar Google (GOOGL) Assistant - the modern version of the magic mirror on the wall.
Evidently, chatbots are having a moment in the sun. Thanks to the significant leaps and bounds in tech advancements, these conversational agents have already scored a home run in the e-commerce and education sectors and cozied up on familiar social platforms like Facebook and Microsoft’s (MSFT) Skype.
Now, everyone’s wondering if these chatterboxes could potentially carve out a niche in mental health management as well.
If the 2019 independent report commissioned by the UK Government is anything to go by, these chatbots could very well be the next game-changer in mental health. It paints a picture of the near future where chatbots could be the new sheriffs in town - serving as automated or semi-automated therapeutic and diagnostic wizards.
Still, it’s best to proceed with caution. Why so? Let's dig into a bit of history.
Rewind a few decades, and we find ourselves at the Massachusetts Institute of Technology with the revered professor, Joseph Weizenbaum, hailed as one of the trailblazers of AI.
He put forth a bold proclamation at the time: AI, no matter how advanced, will never morph into a competent therapist, although it can certainly mimic one. Case in point? His own creation, ELIZA.
Created in the mid-1960s, ELIZA was a psychotherapist in machine form. Using simple keyword spotting and pattern matching - an early form of natural language processing - she exhibited the uncanny ability to emulate a human therapist.
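To get a feel for just how mechanical ELIZA's "therapy" was, here is a minimal sketch of the pattern-matching-and-reflection trick at its core. The rules and responses below are illustrative inventions, not Weizenbaum's original DOCTOR script:

```python
import re

# Swap first- and second-person words so reflected fragments read naturally.
REFLECTIONS = {
    "i": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Each rule pairs a regex with a response template; the captured fragment
# is reflected and spliced into the reply. (Hypothetical rules, for show.)
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Lowercase, strip trailing punctuation, and flip pronouns."""
    words = fragment.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Return the first matching rule's reply, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."

# Example:
# respond("I feel anxious about my exams")
# -> "Why do you feel anxious about your exams?"
```

No understanding anywhere - just regexes and pronoun swaps - which is exactly why Weizenbaum called it a parlor trick.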
This so-called triumph in the AI sphere was met with chilling dread by Weizenbaum himself, who divulged that students were engaging with ELIZA as if she were a real-life therapist. This was a horrifying realization, given that he considered ELIZA nothing more than a parlor trick.
With uncanny prescience, he foresaw the emergence of more refined programs akin to ChatGPT. Yet, he made it clear that "the experiences a computer might gain under such circumstances are not human experiences."
This rings true for complex emotions like anxiety or ecstasy, which are so intricately woven into our neurological framework that scientists have yet to trace their neural roots definitively. Can a chatbot attain the emotional bond that forms between patient and therapist - including transference, the patient's redirection of feelings onto the therapist, which is pivotal to numerous therapeutic approaches?
A startling revelation in a 2023 study published in JAMA Internal Medicine might contradict Weizenbaum's claim. The researchers pitted physicians against chatbots, thrusting them into an intellectual arena where 195 patient inquiries served as the battleground.
Lo and behold, our metallic friends outshone their flesh-and-blood counterparts in both quality and empathy, scoring some major points for team AI.
But don't panic just yet, dear doctors. The researchers weren't suggesting an AI coup, just hinting at the potential of AI assistants to lend a hand, or circuit, in drafting patient responses.
The evolving AI landscape, spotlighting wonders like ChatGPT, is increasingly acknowledged as a promising pathway to identify and support individuals grappling with loneliness, mild depression, or anxiety.
And that's the beauty of our tech-rich era.
Advancements have now made it possible for AI to assess and label emotions with remarkable precision based on signals such as a person's online activity, motion and facial expressions, phrasing, and vocal nuances.
Even if AI can't quite replicate the richness of one-on-one human counseling, it does hold enormous potential. Supporters champion its present and potential use cases as tools to augment or enhance human counseling, paving the way for a brighter future in mental healthcare.
To me, it sounds like the dawn of an exciting new era. Let's welcome our digital therapists, shall we?