
It's clear many people across the globe have been absolutely obsessed with ChatGPT ever since it first launched in 2022.
Whether students have been sneakily using it to write their essays, employees have had it draft job applications, people have been trialling it as the new 'Google', or Gen Z have simply been playing around with it to hop on social media trends like last week's 'AI Barbie' hype, it's no surprise the language model chatbot is all the rage right now, despite its several limitations.
But it's important to note that all this new technology comes with a number of dangers, which is why an expert has now issued a stern warning over five things you should never tell ChatGPT.

Identity information
According to the expert, you should never reveal identity information about yourself when writing a prompt for the chatbot.
Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told the Wall Street Journal late last month (30 March) that when you type something into a chatbot 'you lose possession of it'.
Such information includes your Social Security number, driver's license and passport numbers, as well as your date of birth, address and phone number.
"We want our AI models to learn about the world, not private individuals, and we actively minimise the collection of personal information," an OpenAI spokeswoman told the outlet.
Financial accounts
This one sounds pretty obvious, and it goes for the whole internet, not just ChatGPT: do your best to avoid sharing financial or bank account details online, as such info can be hacked and used to monitor or access your funds.

Medical results
AI chatbots don't operate with the same levels of patient confidentiality as the healthcare industry, so it may be best to leave any queries to your GP rather than ChatGPT.
However, if you still feel the need to ask the bot to interpret any of your results or documents, King advised cropping whatever you feed it before uploading it so the image is kept 'just to the test results'.
Corporate information
The WSJ warns you could land in serious trouble at work if you accidentally expose client data or other private information while using ChatGPT to draft emails, edit documents and the like.
Login information
Some people give ChatGPT their various account usernames and passwords to use the bot to perform a number of tasks for them.
But, while that may sound convenient and a major time-saver, it's imperative to remember that these AI agents don't keep such credentials secure, meaning the data could be shared or hacked, and you could ultimately land in some very deep water.
Better to just whack all that sensitive info in a tried-and-tested password manager.
Coming straight from the horse's mouth, OpenAI writes on its website: "Please don’t share any sensitive information in your conversations."
Topics: Advice, Artificial intelligence, ChatGPT, Explained, Technology