OpenAI Hit With Another Privacy Complaint Over ChatGPT’s Love of Inventing Things

OpenAI has been hit with a data protection complaint in Austria by the advocacy group NOYB, whose name stands for “None Of Your Business.” The complaint alleges that the company’s ChatGPT bot repeatedly provided false information about a real person (who is not named in the complaint for privacy reasons), which could be a breach of EU data protection rules.

The chatbot allegedly spat out false information about the person’s date of birth rather than simply saying it didn’t know the answer. Like politicians, AI chatbots like to confidently make things up and hope we don’t notice, a phenomenon known as hallucination. It’s one thing for these bots to invent ingredients for a recipe, and quite another for them to invent things about real people.

OpenAI allegedly refused to delete the incorrect information, responding that it was technically impossible to make such a change, though it offered to filter or block the data on certain prompts. OpenAI’s privacy policy states that users who notice the chatbot has generated “factually inaccurate information” about them can submit a “correction request,” but the company warns it “may not be able to correct the inaccuracy in every instance.”

This is more than a pedantic complaint, as the chatbot’s tendency to make things up could violate the region’s General Data Protection Regulation (GDPR). EU citizens have rights regarding their personal data, including the right to have inaccurate data corrected. Failure to comply with the regulation can result in severe financial penalties, in some cases up to four percent of annual global turnover, and supervisory authorities can also order changes to how information is processed.

“It is clear that companies are currently unable to bring chatbots like ChatGPT into compliance with EU law when processing data about individuals,” said Maartje de Graaf, NOYB data protection lawyer, in a statement. “If a system cannot provide accurate and transparent results, it cannot be used to generate data about individuals. The technology must follow the legal requirements, not the other way around.”

The complaint also raised concerns about transparency on OpenAI’s part, suggesting that the company does not disclose where the data it generates about individuals comes from or whether that data is retained indefinitely. This matters particularly when data about private individuals is involved.

Again, this is a complaint from an advocacy group, and EU regulators have not yet commented. OpenAI itself has acknowledged that ChatGPT “sometimes writes plausible-sounding but incorrect or nonsensical answers.” NOYB has filed the complaint with the Austrian data protection authority and asked the regulator to investigate the issue.

The company is facing a similar complaint in Poland, where the local data protection authority opened an investigation after a researcher was unable to get OpenAI’s help in correcting false personal information about him. That complaint accuses OpenAI of multiple violations of the EU’s GDPR relating to transparency, data access rights and privacy.

Then there’s Italy. The Italian data protection authority conducted an investigation into ChatGPT and concluded that the company had violated the GDPR in various ways, including the chatbot’s tendency to make up false things about people. The chatbot was banned in the country before OpenAI made certain changes to the software, such as new warnings for users and the option to opt out of having chats used to train its algorithms. Although the ban has been lifted, the Italian investigation into ChatGPT continues.

OpenAI has not responded to this latest complaint, but it did respond to the Italian regulator’s earlier salvo. “We want our AI to learn about the world, not about private individuals,” the company said. “We actively work to reduce personal data in training our systems like ChatGPT, which also rejects requests for private or sensitive information about people.”
