Noyb highlights concerns about ChatGPT generating false claims about individuals, asserting that OpenAI violates the GDPR by retaining inaccurate personal data. Despite recent updates, biases and inaccuracies persist and continue to harm reputations. Legal challenges have followed, including defamation lawsuits from people harmed by misleading outputs. Critics argue that filtering outputs without deleting the underlying data neither mitigates reputational risks nor satisfies the GDPR. Lawyers stress that disclaimers do not absolve AI companies of legal responsibility; they remain obliged under the law to ensure the accuracy of the personal data they process.
"While the damage done may be more limited if false personal data is not shared, the GDPR applies to internal data just as much as to shared data."
"Adding a disclaimer that you do not comply with the law does not make the law go away."