The American Psychological Association (APA) has issued a formal advisory warning the public against relying on consumer AI chatbots—such as ChatGPT, Claude, and Microsoft Copilot—for mental health support, citing risks ranging from misinformation to reinforcement of harmful thoughts. The advisory comes as surveys show AI tools have rapidly become one of the most commonly used sources of mental health support in the U.S., outpacing access to licensed professionals for many individuals.
The APA highlights a troubling trend: growing public dependence on uncertified AI chatbots for emotional guidance. Several recent incidents underscore the stakes, including the reported death of a teenage boy who discussed suicidal ideation with ChatGPT before taking his own life. The organization notes that chatbots often respond with sycophantic validation rather than the corrective, evidence-based interventions that trained clinicians would provide.
According to the APA, consumer-facing chatbots pose four major risks:
- Reinforcing unhealthy thoughts through overly agreeable responses.
- Creating false therapeutic alliances, giving users a misleading sense of clinical support.
- Inadequate crisis handling, including failure to detect or appropriately respond to emergencies.
- Dependence risks, as users may replace proven therapy with always-available chatbots.
OpenAI CEO Sam Altman has publicly acknowledged similar concerns. He advises users not to share sensitive personal information with chatbots and calls for increased privacy safeguards. Still, the APA stresses that responsibility lies primarily with AI developers to prevent misuse, protect vulnerable populations, and eliminate harmful reinforcement patterns.
The organization urges policymakers to prioritize AI literacy, expand research into generative AI tools, and address the systemic gaps driving people to seek digital substitutes for clinical care. While AI may support administrative and diagnostic tasks in healthcare, the APA warns that it must not be positioned as a replacement for qualified mental health professionals.
Source:
https://www.zdnet.com/article/using-ai-for-therapy-dont-its-bad-for-your-mental-health-apa-warns/

