When AI Meets Mental Health
- Carol Calkins, LCSW

- Sep 21

News broke this month that OpenAI will soon reroute sensitive conversations to its newest model, GPT-5, and roll out parental controls for teens. These changes come after heartbreaking cases where young people used AI chats during moments of crisis, and the technology failed them.
As a therapist, I see some benefits and many risks in relying on AI for guidance and support of children and adolescents. AI tools are everywhere in our children’s lives, and while they can sometimes provide support, they are not a substitute for human connection or professional care.
What matters for families:
- AI is not therapy. Chatbots can’t read nuance the way trained professionals can. A trained professional can problem-solve and guide the client forward; they know the warning signs and are educated in guiding an individual to emotional and physical safety when necessary. AI may reinforce harmful thinking instead of interrupting it.
- Parental controls are a start. The ability to monitor distress signals and limit features is helpful, but no technology can replace conversations at home about mental health.
- We need balance. Just as we teach our kids healthy use of social media, we must guide them in how to interact with AI, encouraging curiosity while setting limits.
The truth is, no model — however advanced — can replace the healing power of being seen, heard, and understood by another human being. If your teen is struggling with anxiety, depression, or thoughts of self-harm, please reach out. AI can offer information, but therapy offers transformation.
In healing...
Carol
Photo by Steve Johnson on Unsplash


