“You’ve passed harder seasons.”
— an AI wellness bot’s nightly message
Camila sat at her kitchen table, staring at a blinking cursor. The job search had stretched into weeks, and each rejection felt like another door closing in the dark. At 10 p.m. she picked up her phone, not to scroll through social media but to open a familiar AI wellness bot.
“How are you tonight?” it asked.
She typed back: “Worried. Helpless.”
Seconds later came the reply: “Let’s breathe together.”
That small act—consistent, calm, available—became her lamp in the window during an unpredictable season.
The Rising Need
Loneliness is climbing. Global surveys show record isolation across every generation.¹
Resources are stretched. Mental health providers face long waitlists and limited insurance coverage.
AI promises presence. Therapy bots, check-ins, and conversational agents fill the silence—but also blur the line between support and substitution.
AI can comfort, yet it cannot care in the human sense. The glow in the window is real; the warmth of the hearth still matters more.
Ethical Fault Lines
Dependence and displacement: Studies note “emotional over-reliance” when users turn to bots instead of people.²
Errors under pressure: Bots may misread distress or hallucinate advice in a crisis.³
Synthetic empathy: AI mimics warmth without reciprocity, risking false intimacy.⁴
Delusional loops: Scholars describe a “technological folie à deux,” a mutual distortion between user and bot.⁵
Data and bias: Emotional data demands strict privacy protections and cultural awareness.⁶
AI can feel like presence—but presence isn’t the same as relationship.
Playbook: Using AI Wellness Tools Responsibly
Everyday Application
Camila’s rhythm with the bot looked simple:
Morning check-in: “How’s your emotional battery?”
Evening reflection: Three words for today’s feelings.
Weekly summary: “What’s one small win?”
Crisis mode: “I’m not equipped to help—but here’s someone who can.”
The AI stayed consistent, predictable, gentle—a steady light through uncertainty.
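For readers who want to prototype such a rhythm, here is a minimal sketch in Python. The prompts come from Camila’s routine above; everything else is an assumption for illustration: the CRISIS_KEYWORDS set, the pick_prompt schedule, and the hand-off wording are hypothetical stand-ins, and a real wellness tool would route crisis detection through a clinically vetted pipeline rather than keyword matching.

```python
from datetime import datetime

# Hypothetical keyword set, for illustration only; a real deployment
# would use a clinically vetted crisis-detection pipeline instead.
CRISIS_KEYWORDS = {"hopeless", "end it", "hurt myself"}

# Prompts taken from Camila's routine above.
PROMPTS = {
    "morning": "How's your emotional battery?",
    "evening": "Three words for today's feelings.",
    "weekly": "What's one small win?",
}

# Hand-off message; 988 is the US Suicide & Crisis Lifeline.
CRISIS_MESSAGE = (
    "I'm not equipped to help, but here's someone who can: "
    "call or text 988, or contact your local crisis line."
)


def pick_prompt(now: datetime) -> str:
    """Follow the rhythm: weekly summary on Sunday evenings,
    morning check-in before noon, evening reflection otherwise."""
    if now.weekday() == 6 and now.hour >= 18:  # Sunday, 6 pm or later
        return PROMPTS["weekly"]
    if now.hour < 12:
        return PROMPTS["morning"]
    return PROMPTS["evening"]


def respond(user_text: str, now: datetime) -> str:
    """The crisis check always runs first; on a match the bot
    hands off to a human resource instead of improvising."""
    lowered = user_text.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return CRISIS_MESSAGE
    return pick_prompt(now)


if __name__ == "__main__":
    # A Monday at 10 p.m. yields the evening reflection prompt.
    print(respond("Worried. Helpless.", datetime(2025, 3, 3, 22, 0)))
```

The one design choice worth keeping from this sketch is the ordering: the crisis check runs before any scripted prompt, so the bot defers to people exactly where the playbook says it should.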
The Metaphor
A lamp in the window signals presence and safety.
A hearth offers shared warmth.
AI should remain the lamp, never the hearth.
Let the lamp shine, but don’t mistake it for the fire.
References
1. World Health Organization. Loneliness and Mental Health. 2024.
2. American Psychological Association. “Digital Companionship and Dependence.” Monitor on Psychology, 2023.
3. Berkeley School of Public Health. “Why AI Isn’t a Magic Bullet for Mental Health.” https://publichealth.berkeley.edu/articles/spotlight/research/why-ai-isnt-a-magic-bullet-for-mental-health
4. “Synthetic Empathy in Conversational AI.” Technology and Society Journal 45 (2023).
5. Rizzo, A., & Paredes, J. “Technological Folie à Deux.” arXiv preprint arXiv:2507.19218 (2025).
6. “Bias and Privacy in AI-Mediated Therapy.” Journal of Medical Internet Research 27 (2024).