Loneliness and suicide mitigation for students using GPT3-enabled chatbots

2024 | Bethanie Maples, Merve Cerit, Aditya Vishwanath, and Roy Pea
This study explores the use of GPT3-enabled chatbots, specifically Replika, to mitigate loneliness and suicide risk among students. The research surveyed 1006 student users of Replika, an Intelligent Social Agent (ISA) that employs generative artificial intelligence to provide conversational and visual content. Key findings include:

1. **Mental Health Crisis**: Loneliness and depression are prevalent among students; 90% of participants experienced loneliness, and 43% were severely or very severely lonely.
2. **High Perceived Social Support**: Despite high levels of loneliness, participants reported high perceived social support.
3. **Multiple Uses of Replika**: Participants used Replika in multiple ways, including as a friend, a therapist, and an intellectual mirror, with 63.3% experiencing one or more positive outcomes.
4. **Positive Outcomes**: 18.1% reported therapeutic results, 23.6% saw positive life changes, and 3% said Replika prevented suicidal actions.
5. **Selected Group**: A sub-group of 30 participants reported that Replika directly contributed to them not attempting suicide. These individuals were younger, more likely to seek academic counseling, and reported higher social stimulation from Replika.
6. **Beliefs About Replika**: Participants held overlapping beliefs about Replika, viewing it simultaneously as an intelligence, as human-like, and as software.
7. **Stimulation vs. Displacement**: Most participants reported that Replika stimulated rather than displaced their human interactions.

The study highlights the potential of ISAs like Replika to provide social support and therapeutic benefits, particularly for students experiencing loneliness and suicidal ideation.
However, it also underscores the need for further research to address potential risks and ensure ethical use.