Letter to the Editor

The Intersection of ChatGPT and Reddit: A New Avenue for Support of Abuse Survivors?

Kripa Ahuja, MS; Grace DeSena, BA; and David Spiegel, MD

Published: June 4, 2024


To the Editor: Since the rise of technology in the 21st century, modalities such as social media and artificial intelligence (AI) have continued to grow. During the COVID-19 pandemic in particular, online social media platforms such as Reddit became supportive communities.1 In late 2022, ChatGPT, a large language model, was released. From its performance on academic board examinations to its empathetic answers to patient questions, ChatGPT’s full potential has yet to be discovered.2

In 2023, the intersection of Reddit and ChatGPT was observed in communities such as the subreddit r/NarcissisticAbuse, a group whose “About” statement reads: “This is a safe place for people who suffered or are currently suffering from narcissistic abuse to seek support, learn, vent, discuss, document their abuse, and come together in their path towards healing.”

In a November 2023 post titled “ChatGPT can help” on the r/NarcissisticAbuse subreddit, a user shared that they had copied and pasted communication from their narcissistic ex into ChatGPT with the following message: “Can you draft me a response to this email from my narcissistic ex?” The user also shared that they asked ChatGPT to analyze the communication from their ex for emotionally abusive tactics and that the responses they received were enormously validating. The post gained significant traction, with many users commenting that they had saved ongoing chats with ChatGPT for affirmation and perspective. One comment even described the responses ChatGPT generated when asked to reply as a narcissist.

Studies have shown that conceptualizing abusive partners as narcissists helps women process their trauma and heal more quickly.3 Identifying the abusive partner as a narcissist allows women to work with their counselors to find practical strategies for specific situations.3 Many victims of narcissistic abuse report victim blaming and a lack of accountability from their partner.3 Thus, it is refreshing to see an AI modality such as ChatGPT used to validate victims of narcissistic abuse and to provide methods of recognizing emotional abuse.

Research has demonstrated that integrative conversational AI chatbots can serve as cost-effective and accessible therapeutic tools.4 While not a replacement for a trained therapist, AI may provide support by listening to and reassuring distressed individuals. Prior research has shown that ChatGPT may alleviate stress and provide encouragement to worried parents of children with atopic dermatitis.5

Analysis of the r/NarcissisticAbuse subreddit suggests that AI chatbots such as ChatGPT may be beneficial for victims of abuse. Based on the popularity of the post described above, counselors may want to suggest ChatGPT as an immediate, short-term source of validation and emotional support for victims of narcissistic abuse. Nevertheless, persons in acute distress should preferably reach out to a crisis hotline or seek inpatient mental health services.

Potential pitfalls of AI include racial bias, particularly ChatGPT’s strikingly different pain management recommendations (opioids vs aspirin) depending on the patient’s race.6 Other shortcomings of chatbots include a lack of information verification, which can distress users by providing an incorrect diagnosis or even dangerous information.6 Clinicians may mitigate some of these challenges by remaining aware of AI-related racial biases and informing their patients about them. They may also foster open conversations with patients about AI, including where patients obtain health information and whether they participate in online social support groups. The interplay of online social media support groups and interactive chatbots may represent a new realm of mental health support, but providers must approach it with caution to ensure optimal patient education, patient safety, and ethical practice. Future studies are needed to elucidate the most beneficial use of social media and AI for mental health aid.

Article Information

Published Online: June 4, 2024. https://doi.org/10.4088/PCC.23lr03698
© 2024 Physicians Postgraduate Press, Inc.
Prim Care Companion CNS Disord 2024;26(3):23lr03698
To Cite: Ahuja K, DeSena G, Spiegel D. The intersection of ChatGPT and Reddit: a new avenue for support of abuse survivors? Prim Care Companion CNS Disord. 2024;26(3):23lr03698.
Author Affiliations: Eastern Virginia Medical School, Norfolk, Virginia (Ahuja, Spiegel); University of Florida College of Medicine, Gainesville, Florida (DeSena).
Corresponding Author: Kripa Ahuja, MS, Eastern Virginia Medical School, 825 Fairfax Ave, Norfolk, VA 23507 (ahujak@evms.edu).
Relevant Financial Relationships: None.
Funding Sources: None.


References