An increasing number of Americans are turning to AI chatbots like ChatGPT for emotional support, not as a novelty but as a lifeline. The trend paints a hopeful picture: AI stepping in where traditional mental health care can’t.

Blissbot.ai blends neuroscience, emotional resilience training and AI to deliver “scalable healing systems.” Blissbot was designed from scratch as an AI-native platform, in contrast to existing tools that retrofit mental health models into general-purpose assistants. Other companies, like Wysa, Woebot Health and Innerworld, are also integrating evidence-based psychological frameworks into their platforms.

Despite the flurry of innovation, mental health experts caution that much of the AI deployed today isn’t as effective as claimed. “Many AI mental health tools create the illusion of support,” said Funso Richard, an information security expert with a background in psychology. “But if they aren’t adaptive, clinically grounded and context-aware, they risk leaving users worse off — especially in moments of real vulnerability.”

Even when AI platforms show promise, Richard cautioned that proven outcomes remain elusive. AI’s perceived authority could mislead vulnerable users into trusting flawed advice, he noted, especially when platforms aren’t transparent about their limitations or aren’t overseen by licensed professionals.

Used thoughtfully, AI tools can handle structured, day-to-day support and free clinicians to focus on deeper, more complex care — a hybrid model that many in the field see as both scalable and safe.