OpenAI Offers an Olive Branch to Artists Wary of Feeding AI Algorithms

OpenAI is fighting lawsuits from artists, authors and publishers who claim it improperly used their work to train the algorithms behind ChatGPT and other AI systems. On Tuesday, the company announced a tool that appears aimed at appeasing creators and rights holders by giving them some control over how OpenAI uses their work.

The company says it will launch a tool called Media Manager in 2025 that will allow content creators to exclude their work from the company’s AI development. In a blog post, OpenAI described the tool as a way to “enable creators and content owners to tell us what they own” and indicate “how they would like their works to be included or excluded from machine learning research and training.”

OpenAI said it is working with “creators, content owners and regulators” to develop the tool and aims to use it to “set an industry standard.” The company did not name any of its project partners or make it clear exactly how the tool will work.

Among the outstanding questions about the system are whether content owners will be able to make a single request covering all of their works, and whether OpenAI will honor requests related to models that have already been trained and deployed. Research is underway on machine “unlearning,” a process that adjusts a trained AI system to remove the contribution of specific pieces of its training data after the fact, but the technique is not yet mature.

Ed Newton-Rex, CEO of the startup Fairly Trained, which certifies AI companies that use ethically sourced training data, says OpenAI’s apparent shift on training data is welcome, but that implementation will be critical. “I am pleased that OpenAI is working on this topic. Whether it actually helps artists or not depends on the details, which are not yet known,” he says. The first big question is whether this is simply an opt-out tool that lets OpenAI keep using data without permission unless a content owner requests exclusion, or whether it represents a larger change in how OpenAI does business. OpenAI did not immediately respond to a request for comment.

Newton-Rex is also curious whether OpenAI will allow other companies to use its Media Manager, so that artists can communicate their preferences to multiple AI developers at once. “If not, this just adds complexity to an already complex opt-out environment,” says Newton-Rex, who was previously an executive at Stability AI, the developer of the Stable Diffusion image generator.

OpenAI is not the first to explore ways for artists and other content creators to signal their preferences about how their work and personal data are used for AI projects. Other technology companies, from Adobe to Tumblr, also offer opt-out tools for data collection and machine learning. The startup Spawning launched a registry called “Do Not Train” almost two years ago, and creators have already registered their preferences for 1.5 billion works.

Spawning CEO Jordan Meyer says the company is not working with OpenAI on its Media Manager project, but is open to it. “If OpenAI is able to make it easier to register or comply with universal opt-outs, we will be happy to integrate their work into our suite,” he says.
