Would you tell Alexa about your worst fear? Or ask Siri for some emotional support after an especially trying day? We are increasingly turning to chatbots on smart speakers, websites, and apps to answer our questions.
And as these systems, powered by artificial intelligence (AI) software, become more sophisticated, they are beginning to offer decent, in-depth responses.
But can these chatbots ever resemble humans well enough to work as therapists?
Replika, a US chatbot app created by computer programmer Eugenia Kuyda, says it offers users an "AI companion who cares, always here to listen and talk, always on your side." Launched in 2017, it now has more than two million active users. Each user has a chatbot, or "replika", unique to them, because the AI learns from their conversations. Users can also design their own cartoon avatar for their chatbot.
According to Ms. Kuyda, the app's users range from autistic children who use it to "warm up before human interactions" to adults who are simply lonely and in need of a friend.
Others reportedly use Replika to practise for job interviews, to talk about politics, or even as a marriage counsellor.
Although the app is primarily intended to be a friend or companion, it also claims it can help your mental health, for example by allowing users to "build better habits to minimise anxiety". The World Health Organisation (WHO) estimates that almost a billion people worldwide live with a mental disorder. That is more than one in ten people.
The WHO adds that only a small proportion of those in need have access to effective, high-quality mental health care.
The rise of chatbot mental health therapists could offer much-needed support to a great many people, though anyone with concerns about themselves or a relative should see a doctor first. Dr. Paul Marsden, a member of the British Psychological Society, says apps designed to improve your mental wellbeing can help, but only if you find the right one, and even then only up to a point.
When he last looked, he says, there were 300 apps available just for anxiety, so how is anyone supposed to know which one to use?
He adds that such apps should only be seen as a supplement to in-person therapy; in general, they do not replace human therapy.
Dr. Marsden says he is enthusiastic about the potential of AI to make therapeutic chatbots more effective. "Talking therapy is the foundation of mental health support, and talking is what chatbots do," he asserts.
Dr. Marsden points out that leading AI chatbot companies, including OpenAI, the firm behind the recently trending ChatGPT, are sharing their technology with others.
This, he says, allows mental health apps to use the best AI available to power their chatbots, thanks to "its vast information expertise, increasing reasoning ability, and proficient communication skills". Replika is one provider that already uses OpenAI's technology.