Sharing personal issues with ChatGPT? It may not be as private as you think
Dubai: As AI tools like ChatGPT become increasingly integrated into daily life, millions of individuals—including children—are now using them for emotional support, therapy, and general life advice.
However, anyone considering or already engaging in such interactions should understand a critical distinction: conversations with ChatGPT do not carry the same legal privacy protections as communications with licensed professionals such as therapists, doctors, or lawyers. This clarification comes directly from OpenAI CEO Sam Altman.
Speaking on This Past Weekend, a podcast hosted by Theo Von, Altman acknowledged that a growing number of users—especially young people—are relying on ChatGPT as a kind of digital therapist.
“People talk about the most personal experience in their lives to ChatGPT,” he said. “Young people, especially, use it as a therapist, a life coach, asking, ‘What should I do?’”
But unlike sessions with real doctors, lawyers, or therapists, conversations with ChatGPT lack legal protections. “If you talk to ChatGPT about your most sensitive stuff and there’s a lawsuit, we could be required to produce that. I think that’s very screwed up,” Altman said.
Unlike doctors or therapists, AI tools like ChatGPT aren’t protected by privacy laws — meaning your conversations could be accessed or used in legal cases.
Altman explained that human professionals operate under strict legal privilege, such as doctor-patient confidentiality and attorney-client privilege, but no equivalent framework has been developed for AI.
“We haven’t figured that out yet for when you talk to ChatGPT,” he said, urging lawmakers to act quickly.
Unlike messaging apps like WhatsApp or Signal, which use end-to-end encryption, ChatGPT conversations are not fully private. OpenAI employees may access chat logs to improve the system or check for violations.
OpenAI says it deletes free-tier user data after 30 days, but it can retain conversations longer if required by law, a significant caveat given Altman's acknowledgement that user data could be produced in court.
In June, The New York Times and other media outlets asked a court to compel OpenAI to retain all user conversations—including deleted ones—as part of an ongoing copyright lawsuit. OpenAI is currently appealing that court order.
“No one had to think about this a year ago,” Altman said. “Now it’s a huge issue: how are we going to treat the laws around it?”
Until new protections are in place, Altman advises users to proceed with caution—your AI-powered heart-to-hearts may not be as private as you think.
© Al Nisr Publishing LLC 2025. All rights reserved.