Kate’s real-life therapist is not a fan of her ChatGPT use. “She’s like, ‘Kate, promise me you’ll never do that again. The last thing that you need is like more tools to analyze at your fingertips. What you need is to sit with your discomfort, feel it, recognize why you feel it.’”
A spokesperson for OpenAI, Taya Christianson, told WIRED that ChatGPT is designed to be a factual, neutral, and safety-minded general-purpose tool. It is not, Christianson said, a substitute for working with a mental health professional. Christianson directed WIRED to a blog post citing a collaboration between the company and MIT Media Lab to study “how AI use that involves emotional engagement—what we call affective use—can impact users’ well-being.”
For Kate, ChatGPT is a sounding board without any needs, schedule, obligations, or problems of its own. She has good friends, and a sister she’s close with, but it’s not the same. “If I were texting them the amount of times I was prompting ChatGPT, I’d blow up their phone,” she says. “It wouldn’t really be fair … I don’t need to feel shame around blowing up ChatGPT with my asks, my emotional needs.”
Andrew, a 36-year-old man living in Seattle, has increasingly turned to ChatGPT for personal needs after a tough chapter with his family. While he doesn’t treat his ChatGPT use “like a dirty secret,” he’s also not especially forthcoming about it. “I haven’t had a lot of success finding a therapist that I mesh with,” he says. “And not that ChatGPT by any stretch is a true replacement for a therapist, but to be perfectly honest, sometimes you just need someone to talk to about something sitting right on the front of your brain.”
Andrew had previously used ChatGPT for mundane tasks like meal planning or book summaries. The day before Valentine’s Day, his then-girlfriend broke up with him via text message. At first, he wasn’t completely sure he’d been dumped. “I think between us there was just always kind of a disconnect in the way we communicated,” he says. “[The text] didn’t actually say, ‘hey, I’m breaking up with you’ in any clear way.”
Puzzled, he plugged the message into ChatGPT. “I was just like, hey, did she break up with me? Can you help me understand what’s going on?” he says. ChatGPT didn’t offer much clarity. “I guess it was maybe validating because it was just as confused as I was.”
Andrew has group chats with close friends he would typically turn to when talking through his problems, but he didn’t want to burden them. “Maybe they don’t need to hear Andrew’s whining about his crappy dating life,” he says. “I’m kind of using this as a way to kick the tires on the conversation before I really kind of get ready to go out and ask my friends about a certain situation.”
In addition to the emotional and social complexities of working out problems via AI, the level of intimate information some users are feeding to ChatGPT raises serious privacy concerns. Should chats ever be leaked, or should people’s data be used unethically, far more than passwords or email addresses would be on the line.
“I have honestly thought about it,” Kate says, when asked why she trusts the service with private details of her life. “Oh my God, if someone just saw my prompt history—you could draw crazy assumptions around who you are, what you worry about, or whatever else.”