
Your therapy sessions with ChatGPT are not private: Sam Altman
What's the story
OpenAI CEO Sam Altman has raised concerns over the privacy of conversations on ChatGPT, especially those related to therapy. Speaking on Theo Von's podcast, This Past Weekend w/ Theo Von, Altman highlighted that these chats do not have the same legal protections as traditional therapist-patient conversations. He said that if a lawsuit arises from sensitive discussions with ChatGPT, OpenAI could be compelled to produce those records.
Privacy concerns
AI chatbots are now being used as therapists
Altman stressed that conversations with AI should carry the same privacy standards as conversations with human therapists. He said, "I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever." The OpenAI chief also noted that many people, especially young ones, are now using ChatGPT as a therapist or life coach.
Data scrutiny
OpenAI can read your chats
Unlike end-to-end encrypted platforms such as WhatsApp or Signal, OpenAI has the ability to read your chats with ChatGPT. The company says its employees sometimes review these conversations to improve the AI or monitor for abuse. Although deleted chats from Free, Plus, and Pro users are erased within 30 days unless they must be retained for legal or security reasons, this policy doesn't guarantee complete privacy.
Legal battle
Media's legal battle with OpenAI
Last month, The New York Times and other media outlets sought a court order compelling OpenAI to preserve all user chats, even deleted ones, as part of ongoing legal proceedings. The company is currently appealing that order. If courts can override OpenAI's own data-privacy decisions, the firm could face further demands for user data in legal discovery or from law enforcement.