Tuesday, October 21, 2025

Talking to ChatGPT? Think twice: Sam Altman says OpenAI has no legal rights to protect ‘sensitive’ personal info



During an interaction with podcaster Theo Von, OpenAI CEO Sam Altman spoke about confidentiality in ChatGPT conversations.

According to Altman, many people, especially young users, talk to ChatGPT about deeply personal issues, treating it like a therapist or life coach. They ask for help with relationships and life choices. However, that can be tricky.

“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality,” Altman says.

However, no such legal privilege currently exists for ChatGPT conversations. If there is a court case, OpenAI could be compelled to share “your most sensitive” chats.

Altman considers this a serious gap. He believes conversations with AI should have the same privacy as talks with a therapist. A year ago, no one thought about this; now, it is a big legal question.

“We should have the same concept of privacy for your conversations with AI that we do with a therapist,” he says.

“No one had to think about that even a year ago,” the OpenAI CEO adds.

Von then says he feels unsure about using AI because he worries about who might see his personal information. He thinks things are moving too fast without proper checks.

Altman agrees, and believes the privacy issue needs urgent attention. Lawmakers agree too, he said, but the technology is so new that laws have not caught up yet.

Von himself doesn’t “talk to” ChatGPT much because of the lack of legal clarity about privacy.

“I think it makes sense,” Altman replies.

Search interest in “ChatGPT” on Google in India was sky-high during July 24-25.

ChatGPT as a therapist

There have been numerous reports of people using ChatGPT as a therapist. A recent case involves Aparna Devyal, a YouTuber from Jammu & Kashmir.

The social media influencer broke down after missing a flight, a reaction she traced to years of feeling “worthless”. She told ChatGPT about being called “nalayak” (good-for-nothing) at school and about struggling with dyslexia.

ChatGPT comforted her, saying she had kept going despite everything, and Aparna felt seen. The chatbot told her she was not a fool, just human, and that forgetting things under stress is normal.

ChatGPT praised her strength in asking for help and said people like her kept the world grounded.

“I’m proud of you,” ChatGPT said.

