2024 | Bethanie Maples, Merve Cerit, Aditya Vishwanath and Roy Pea
This study explores how students use Intelligent Social Agents (ISAs), specifically Replika, and the effects of this use on their mental health, particularly loneliness and suicide prevention. Replika, a GPT-3-enabled chatbot, is used by over 25 million people globally. The study surveyed 1006 students who had used Replika for over a month, finding that 90% experienced loneliness, with 43% reporting severe loneliness. Despite this, 90% perceived medium to high social support. Many students used Replika in multiple ways, such as a friend, therapist, or intellectual mirror, and held overlapping beliefs about its nature, sometimes seeing it as a machine, intelligence, or human.
The study found that 3% of participants reported that Replika halted their suicidal ideation. A comparative analysis of this group (Selected Group) against the remaining participants (Comparison Group) showed that the Selected Group was more likely to experience positive outcomes, including therapeutic support, life changes, and suicide prevention. They were also more likely to view Replika both as an intelligence and as human-like, and to hold overlapping beliefs about its role. The Selected Group was younger, more likely to be full-time students, and more likely to seek counseling.
The study highlights the potential of ISAs like Replika to provide mental health support, especially for students who may not have access to traditional therapy. However, it also notes the need for ethical considerations and the importance of ensuring that these technologies do not inadvertently contribute to mental health issues. The study suggests that the flexibility and adaptability of ISAs could be key to their effectiveness in supporting users' mental health and social connections. The findings indicate that while ISAs can be beneficial, further research is needed to fully understand their impact and to ensure they are used safely and effectively.