18 May
AI Therapy: A Digital Companion
Individuals struggling with loneliness, depression, or anxiety who need immediate relief or support are increasingly turning to apps as a middle-path tool when seeing a mental health provider is not the most expedient option. This BBC article explores the innovative possibilities of Replika, a chatbot therapist app that aims to provide AI-generated counseling services to its users. Lia Abrams and Charlotte Beatty, two researchers at Fluent, consider the following to be important areas for future research:
1) The consequences of around-the-clock access to chatbots. In face-to-face therapy, skill development often hinges on the time a patient spends away from the therapist; that is, the time spent in day-to-day life putting coping skills into practice. While around-the-clock care could be useful for individuals who need immediate support, it is important to consider whether constant access to a chatbot could foster dependence on the tool in times of stress rather than the development of personal resilience.
2) The commitment to equitable mental healthcare. The technology described in this article primarily addresses the wider population of individuals with mild to moderate mental health issues. However, it is also necessary to consider how the smaller population of people with severe mental illness could be served. These individuals face significant stigma, present a more complex set of clinical challenges, and may need greater support and funding for their care.
3) The ability of AI to replicate natural human skills and instincts. An important component of therapy is the therapist's ability to read a patient's body language and facial expressions. Will an AI chatbot be able to pick up on these subtle cues in a patient's mood and demeanor? And how will this work when the system has no prior knowledge of the patient, their personality, or their mannerisms?
4) The security of patient information and the quality of the underlying data. One important part of therapy is the duty of confidentiality that therapists are bound by, which prohibits them from disclosing patient information outside of sessions. How can chatbot therapy services ensure that patient data is not misused? In addition, if the data that chatbot algorithms draw on is skewed or does not cover a broad enough demographic range, can those algorithms truly serve every individual appropriately? There is substantial evidence that mental health experiences vary significantly across demographic lines such as gender, race, ethnicity, and sexuality. How can AI chatbots truly cater to the needs of each individual's intersectional identity?
Here at Fluent, we prioritize research that amplifies the voices of underrepresented communities. While AI-generated mental healthcare promises significant gains in access to self-help practices, the factors above should be addressed and researched to ensure that platforms such as Replika can reach and serve a wide range of individuals.