ChatGPT users may want to think twice before turning to the AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman himself, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there's no doctor-patient confidentiality when your doctor is an AI.
The executive made these comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI works with today's legal system, Altman said that one of the problems of not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.
"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it, young people especially, as a therapist, a life coach; they're having relationship problems and [asking] 'what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
Altman added that this could create a privacy concern for users in the event of a lawsuit, because OpenAI would be legally required to produce those conversations.
"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever, and no one had to think about that even a year ago," Altman said.
The company understands that this lack of privacy could be a blocker to broader user adoption. Beyond AI's demand for so much online data during training, OpenAI is being asked to produce data from users' chats in some legal contexts. It is already fighting a court order in its lawsuit with The New York Times that would require it to preserve the chats of hundreds of millions of ChatGPT users globally, excluding those of ChatGPT Enterprise customers.
In a statement on its website, OpenAI said it is appealing this order, which it called "an overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data to aid in criminal prosecutions. But in recent years, there have been additional concerns about digital data as laws began limiting access to previously established freedoms, like a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.
Altman also asked the podcast host about his own use of ChatGPT, given that Von said he didn't talk to the AI chatbot much because of his own privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot, like the legal clarity," Altman said.
