WORDS LIM TECK CHOON & FAITH FOO
While the term ‘digital companion’ may conjure images of robots that can interact with us like human beings, the available technology hasn’t reached that stage… yet.
Instead, these are apps driven by artificial intelligence (AI), and they are so widespread that many of us may already be using them without associating them with the term.
TYPES OF AI COMPANIONS AVAILABLE AT THE TIME OF WRITING
- General-purpose AI chatbots such as ChatGPT and Gemini.
- Specialized companion apps such as Replika or Xiaoice.
- Virtual assistants such as Siri, Alexa, and Google Assistant, although these are more task-oriented.
- Platforms offered by some companies for creating customized chatbots for specific purposes.
THE BENEFITS OF AI COMPANIONSHIP
The integration of AI companions into mental health care is reshaping the landscape of emotional support and well-being.
These digital tools are designed to provide companionship and assistance. We can ‘chat’ with them like we would with another person, and they would respond accordingly.
Thus, they hold significant promise for enhancing psychological health, especially among vulnerable populations such as the elderly as well as those with emotional and mental issues.
How AI Companionship Can Be of Benefit
- Designed to provide inexpensive companionship and assistance, especially when access to mental health professionals is limited.
- 24/7 accessibility, so one can engage it during moments of crisis at any time.
- Constant availability reduces the sense of loneliness.
- Provides immediate emotional support and fosters resilience against stress and anxiety.
- Can be tailored to meet specific needs of various demographics (children, senior citizens, people with chronic illnesses, etc).
- Can be programmed to monitor the well-being of a mental health professional's clients.
HOWEVER, THERE ARE ETHICAL CONCERNS
People who use AI programmes as digital companions tend to share personal information that may be stored on a server somewhere.
Hence, the rise of AI companions also raises ethical concerns such as data privacy.
THERE ARE ALSO AI HALLUCINATION CONCERNS
AI hallucination refers to instances when an AI programme generates incorrect or nonsensical information, often due to data limitations or contextual errors.
This can impact AI companions for emotional support by potentially providing misleading responses, undermining trust, and affecting users’ emotional well-being if these users rely on inaccurate information.
WHAT CAN WE DO IN THE MEANTIME?
Digital companions can and do play a valuable role in supporting mental healthcare by providing accessible, personalized, and scalable solutions.
However, until the issues that affect these companions are worked out and proper guidelines are in place to regulate and standardize their use, it is best to let digital companions play a supporting role in therapist-led care.
A COUNSELLOR’S PERSPECTIVE ON THIS ISSUE
FEATURED EXPERT
FAITH FOO
- Director of ABRI Integrated Mental Health
- Director of The Bridge International Hub (Korean Counseling Centre)
- Registered & Licensed Counsellor
- Certified EMDR Therapist
- Certified Coaching & Mentoring Professional
- HRD Corp-Certified Trainer
- Published Author
Can We Balance Innovation and Human Connection in Mental Healthcare?
As a psychotherapist, I’ve observed that some clients seek therapy wanting to learn how to speak to ‘somebody’ and communicate more openly. They want to practice vocalizing the thoughts that they typically keep to themselves and become more comfortable being their authentic selves in everyday interactions.
Given my background in mental health, I’m intrigued by the possibility of AI psychotherapists genuinely helping clients in the near future. Could these artificial therapists potentially outperform human psychotherapists in certain aspects?
The Complexity of Human Nature
People are complex and multidimensional beings who cannot easily be reduced to simple, one-dimensional numbers, labels, or terms.
A form of treatment that might be helpful for someone else might be unhelpful or even harmful for you. Nobody is average; everyone is unique.
I find myself conflicted about AI psychotherapists. While I’m generally optimistic about technological advancements, I’m also acutely aware of how often tech falls short of its promises or is misused.
Weighing the Pros and Cons
There are compelling arguments both for and against AI therapists.
On one hand, they could potentially offer greater knowledge, personability, availability, and attentiveness than human therapists.
On the other hand, we’d lose the invaluable experience of engaging with another human who can truly understand and challenge clients when necessary.
The Human Element in Therapy
The core of therapy lies in the relationship between the human client and therapist, where the most profound healing occurs. In this evolving landscape, AI therapists present both exciting possibilities and significant concerns.
A Path Forward
By thoughtfully combining the strengths of humans and machines, we might forge a path towards a future where technology and human expertise collaborate to support mental health.
However, this can only happen if we prioritize our values and ethics, not just our technological capabilities, as we move forward.
This article is part of our series on mental wellness.
References:
- Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing mental health with artificial intelligence: Current trends and future prospects. Journal of Medicine, Surgery, and Public Health, 3, 100099. https://doi.org/10.1016/j.glmedi.2024.100099
- Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6, 1280235. https://doi.org/10.3389/fdgth.2024.1280235