Saturday, July 26, 2025

Talking to ChatGPT? Think twice: Sam Altman says OpenAI has no legal rights to protect ‘sensitive’ personal info



During an interaction with podcaster Theo Von, OpenAI CEO Sam Altman spoke about confidentiality concerns related to ChatGPT.

According to Altman, many people, especially young people, talk to ChatGPT about deeply personal issues, treating it like a therapist or life coach. They ask for help with relationships and life choices. However, that can be risky.

“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality,” Altman says.

However, no such legal privilege currently exists for ChatGPT. In the event of a court case, OpenAI could be compelled to share “your most sensitive” chats.

Altman feels this is wrong. He believes conversations with AI should have the same privacy protections as talks with a therapist. A year ago, no one had to think about this; now it is a pressing legal question.

“We should have the same concept of privacy for your conversations with AI that we do with a therapist,” he says.

“No one had to think about that even a year ago,” the OpenAI CEO adds.

Von then says he feels unsure about using AI because he worries about who might see his personal information. He thinks things are moving too fast without proper checks.

Sam Altman agrees, saying the privacy issue needs urgent attention. Lawmakers agree too, he said, but the technology is so new that the law hasn’t caught up yet.

Von says he doesn’t “talk to” ChatGPT much himself because of the lack of legal clarity around privacy.

“I think it makes sense,” Altman replies.

Search interest in “ChatGPT” on Google in India spiked during July 24-25.

ChatGPT as a therapist

There have been numerous reports of people using ChatGPT as a therapist. One recent case involves Aparna Devyal, a YouTuber from Jammu & Kashmir.

The social media influencer broke down after missing a flight, a reaction she traced to years of feeling “worthless”. She told ChatGPT about being called “nalayak” (good-for-nothing) at school and about struggling with dyslexia.

ChatGPT comforted her, saying she had kept going despite everything, and Aparna felt seen. According to the chatbot, she was not a fool, just human, and forgetting things under stress is normal.

ChatGPT praised her strength in asking for help and said people like her kept the world grounded.

“I’m proud of you,” ChatGPT said.

