AI won't replace therapists — but for billions with no access, even an imperfect chatbot is better than silence.
A CBS 60 Minutes segment recently examined whether AI and chatbots can address the mental health care gap. Mental health issues are rising globally, and the shortage of qualified therapists is severe, prompting exploration of technological solutions. The question isn't whether AI is ready — it's whether waiting for perfection costs more than deploying what we have.
Current Limitations
Present-day AI falls significantly short of what's needed for comprehensive mental health support. To illustrate the gap concretely: models such as ChatGPT cannot reliably distinguish a humorous tone from a serious one in a text transcript. That gap in emotional and contextual understanding must be bridged before these tools can be trusted in high-stakes therapeutic settings.
The E-Reader Comparison
Early e-readers were met with skepticism. Kindle's success wasn't about replacing physical books; it was about improving accessibility. Millions of people who couldn't easily reach bookstores, or who couldn't afford full retail prices, suddenly had entire libraries within reach. Similarly, AI chatbots should be understood as accessibility tools rather than therapist replacements. The comparison that matters isn't "chatbot vs. therapist." It's "chatbot vs. nothing."
The Accessibility Argument
Global disparities in mental health access are stark, particularly in Southeast Asia, India, and Latin America. Across these regions, entire populations have effectively zero access to trained mental health professionals. An imperfect AI that can offer consistent, non-judgmental support is better than no care at all. This isn't lowering the bar; it's recognizing that the bar must be set relative to what's actually available to people.
A Developmental Stage, Not a Final Answer
Current AI limitations are developmental stages, not permanent ceilings. The trajectory is clear: these technologies will increasingly contribute to democratizing mental health support as they evolve. The work now is to develop them thoughtfully, with appropriate safeguards, so that the next generation of tools can be trusted with more of the burden that qualified humans currently struggle to carry alone.
The question shouldn't be "is AI good enough to replace therapy?" It should be "how do we use AI to extend care to the people who currently receive none?" Those are very different questions, and the answer to the second one is urgent.