Amid a renaissance of hyper-realistic generative AI, the widespread use of ChatGPT has inspired both excitement and concern. From writing limericks and finding bugs in code to supplying recipes, AI can attempt to solve all sorts of problems, including mental health concerns.
According to a study by YouGov, 55% of Americans aged 18 to 29 felt comfortable sharing their mental health concerns with AI chatbots instead of human therapists. On the surface, AI possesses near-endless knowledge and a human-like presence. However, it falls short of being a reliable replacement for therapy, and you shouldn’t use it as one.
Economics and history sophomore Edith Sanchez says that turning to ChatGPT has almost become second nature. However, she feels hesitant about students like her using it for mental health concerns.
“It could give them false information (and) scare them for no reason,” Sanchez said. “(It) elevates their anxieties over something that they were probably already anxious about.”
As continually evolving technologies, AI chatbots can respond unpredictably and have even been known to “hallucinate” and provide false or harmful information. These chatbots weren’t trained with therapy in mind. AI is arguably incapable of mimicking human emotion, much less providing thoughtful medical care. Though it can supplement therapy, it should not replace it. There’s no replicating genuine human interaction and empathy.
Mike Brooks, an Austin-based psychologist specializing in technology’s impact on well-being, maintains that relying on AI for therapy ignores our need for human connection.
“We evolved to interact with fellow human beings,” Brooks said. “As amazing as chatbots can be, as amazing as our screens are, we have fundamental evolutionary needs based on our ancestry.”
Though it could be argued that ChatGPT offers a more affordable, convenient way to access therapy when healthcare costs and provider shortages put care out of reach, its shortcomings can’t be ignored. AI, though convenient, cannot form a genuine emotional connection with a patient the way a human professional can. The program receives a series of inputs, analyzes the information fed to it and answers with an approximation of guidance or emotional support. Its advice could be wrong, and it can’t empathize with you.
“We’re living an imitation of life through these (technologies) when we didn’t evolve to,” Brooks said. “No matter how good an AI becomes, there’s something special about human-to-human contact that I don’t think can be replaced by an AI.”
On campus, students struggling with their mental health have many resources that let them speak with a human mental health professional. The Counseling and Mental Health Center provides a free 24/7 crisis line, online support, well-being resources and low-cost clinical services.
Ultimately, it’s important to remember that although ChatGPT might be able to provide advice, it lacks true humanity — and in our most vulnerable moments, human connection can be all the more necessary.
Tuscano is a government freshman from Round Rock, Texas.