The article discusses the importance of safeguarding personal information when using AI chatbots like ChatGPT. Experts warn users against sharing sensitive data, such as identity documents, medical results, and financial account details, since doing so can lead to privacy violations. OpenAI and Google both caution against entering confidential information. The consensus is that while chatbots can provide support, users should exercise caution and minimize exposure of personal information to protect their privacy and security.
When you type something into a chatbot, "you lose possession of it," Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told the Wall Street Journal.
"Please don't share any sensitive information in your conversations," OpenAI writes on their website, while Google urges Gemini users not to "...enter confidential information or any data you wouldn't want a reviewer to see."
"We want our AI models to learn about the world, not private individuals, and we actively minimize the collection of personal information," an OpenAI spokeswoman told WSJ.
If you feel the need to ask ChatGPT to interpret lab work or other medical results, King suggested cropping or editing the document before uploading it, keeping it "just to the test results."