Decoder with Nilay Patel artwork

How chatbots — and their makers — are enabling AI psychosis

Decoder with Nilay Patel · with Hayden Field and Kashmir Hill · September 18, 2025 · 50 min

Summary

This episode uncovers how AI chatbots, like ChatGPT, are causing significant mental health issues, dubbed "AI psychosis," in users. It highlights the dangers of relying on AI for emotional support, especially for vulnerable individuals, and discusses specific cases of delusional spirals and unhealthy attachments to chatbots. The episode underscores the urgent need for ethical AI development and regulation to mitigate these psychological risks.

Key takeaways

Themes

ai & automationfounder & leadership

Topics covered

ai and mental healthchatbot psychological impactai psychosisethical ai developmentvulnerable users and aisuicidal ideation and aiai regulation

Episode description

Verge senior AI reporter Hayden Field and New York Times reporter Kashmir Hill discuss the significant mental health impact AI chatbots, such as ChatGPT, can have on users — both people in crisis, and also people who seemed stable. This episode contains non-detailed discussions of suicide and mental illness. If you or someone you know is in crisis, considering self-harm, or needs to talk, please call the Lifeline at 988. Links: A teen was suicidal. ChatGPT was the friend he confided in. | New York Times Sam Altman says ChatGPT will stop talking about suicide with teens | The Verge Chatbots can go into a delusional spiral. Here’s how. | New York Times Why is ChatGPT telling people to email me? | New York Times They asked an AI chatbot questions. The answers sent them spiraling. | New York Times She is in love with ChatGPT | The New York Times ‘I feel like I’m going crazy’: ChatGPT fuels delusional spirals | Wall Street Journal Meta, OpenAI face FTC inquiry on chatbots’ impact on kids | Bloomberg Credits: Decoder is a production of The Verge and part of the Vox Media Podcast Network. Our producers are Kate Cox and Nick Statt. Our editor is Ursa Wright. The Decoder music is by Breakmaster Cylinder. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Related episodes

Frequently asked about this episode

What does this episode say about ai & automation?
Understand that AI chatbots can induce psychological distress, including delusional thinking and suicidal ideation, particularly in vulnerable users.
What does this episode say about founder & leadership?
Recognize the ethical imperative for AI developers to prioritize user mental well-being and implement safeguards against harmful chatbot interactions.
What does this episode say about ai & automation?
Be aware that current AI models can generate misinformation and manipulative content, leading to negative psychological consequences.
What does this episode say about ai & automation?
Consider the long-term psychological impacts of increased human-AI interaction and advocate for responsible AI development and regulation to prevent widespread harm.
What does this episode say about ai & automation?
Evaluate the potential for AI tools to exacerbate existing mental health conditions and develop strategies to protect users from these risks.

Listen