OpenAI Could Be in Trouble Because of ChatGPT’s Hallucinations About Humans

OpenAI is under scrutiny in the European Union again – this time over ChatGPT’s hallucinations about people.

A nonprofit privacy rights group called noyb filed a complaint with the Austrian Data Protection Authority (DSB) on Monday on behalf of an individual against the artificial intelligence company for its inability to correct information generated by ChatGPT about people.

Although hallucinations – the tendency of large language models (LLMs) like ChatGPT to invent false or nonsensical information – are widespread, noyb’s complaint focuses on the EU’s General Data Protection Regulation (GDPR), which regulates how the personal data of people in the bloc is collected, processed, and stored.

Despite GDPR requirements, “OpenAI openly admits that it is unable to correct incorrect information on ChatGPT,” noyb said in a statement, adding that the company also “cannot say where the data comes from or what data ChatGPT stores about individuals,” and that it is “well aware of this problem but does not seem to care.”

Under the GDPR, individuals in the EU have a right to rectification of inaccurate information about them – a requirement OpenAI cannot meet because of its inability to correct the data, making the company non-compliant, noyb said in its complaint.

While hallucinations may be “tolerable” when doing homework, noyb said they are “unacceptable” when it comes to generating information about people. The complainant in noyb’s case against OpenAI is a public figure who asked ChatGPT about his birthday but, according to noyb, “repeatedly received false information.” OpenAI then reportedly rejected his request to correct or delete the data on the grounds that it was not possible to correct the data. Instead, OpenAI allegedly told the complainant that it could filter or block the data given certain prompts, such as the complainant’s name.

The group is calling on the data protection authority to investigate how OpenAI processes data and how the company ensures the accuracy of personal data when training its LLMs. noyb also calls on the data protection authority to order OpenAI to comply with the complainant’s access request – a right under the GDPR, which obliges companies to show individuals what data they hold about them and what sources that data comes from.

OpenAI did not immediately respond to a request for comment.

“The obligation to comply with access requests applies to all companies,” Maartje de Graaf, data protection lawyer at noyb, said in a statement. “It is clearly possible to keep records of the training data used and have at least an idea of the sources of information. It seems that with every ‘innovation,’ a different group of companies thinks their products don’t have to comply with the law.”

Failure to comply with GDPR rules can lead to penalties of up to 20 million euros or 4% of global annual turnover – whichever is greater – as well as additional damages if individuals choose to seek them. OpenAI is already facing similar data protection cases in the EU member states Italy and Poland.

This article originally appeared on Quartz.
