
Many people are using AI chatbots like ChatGPT instead of therapy - but what are the dangers of oversharing with the AI platform?
ChatGPT is being used for many things, including holiday itineraries, general queries, navigation and translations. But some people are using the AI chatbot instead of talking to a therapist - exposing their deepest secrets.
One tech expert has now revealed the dangers of doing this, suggesting that sharing secrets with the chatbot may not be the wisest of ideas.

The AI assistant is continuously learning from user input - and that can include sensitive data.
A tech expert, known as Alberta Tech on Instagram, explained how this works.
In a recent video, she warned: "I just found out that people are using AI for therapy - don't do that! You just told OpenAI all of your secrets and now they're using that for training data in their models.
"So then people like me - who are unhinged - can go on there and try to like figure out what other people have said."
OpenAI has been open about the fact that it stores conversation history, and the only way a user can permanently delete this saved data is to delete their OpenAI account.

Alberta added: "There have been instances of people using the right prompt and getting ChatGPT to spit out its training data - which includes your chats.
"And even if that data is not directly searchable, your personal information is going to come up organically in ChatGPT's responses."
People in the comments of the clip revealed they had previously used the chatbot for therapy.
One said: "I actually use ChatGPT for reflection. It told me to start a journal but I'm ADHD and get overwhelmed with my own thoughts so I give it a glimpse of what happened, it asks a few questions about how I felt with xyz and then it gives me structured questions to answer in my journal."
Another noted: "Some people can't afford therapy."
But others argued that messaging services like WhatsApp store their conversations anyway.
One user said: "There is no such thing as privacy online," as another added: "Like WhatsApp doesn't already have your conversations."
Other users said they weren't actually bothered about the chatbot using their responses to learn.
"I don’t see anything wrong with that," one said.
Another added: "Can I be honest? I couldn't care less."
TYLA has contacted OpenAI for comment.
Topics: Artificial intelligence, Social Media, Mental Health