Photo-sharing Community EyeEm Licenses Users’ Photos to Train AI if They Don’t Delete Them | TechCrunch

EyeEm, the Berlin-based photo-sharing community that was acquired by Spanish company Freepik after going bankrupt last year, is now licensing its users’ photos to train AI models. Earlier this month, the company informed users via email that it was adding a new clause to its terms and conditions granting it the right to use uploaded content to “train, develop and improve” software, algorithms and machine-learning models. Users were given 30 days to opt out by removing all of their content from the EyeEm platform; otherwise, they consented to this use of their work.

At the time of the acquisition in 2023, EyeEm’s photo library included 160 million images and nearly 150,000 users. The company said it will merge its community with Freepik’s over time.

EyeEm, once considered a potential challenger to Instagram — or at least “Europe’s Instagram” — had shrunk to three employees before selling to Freepik, TechCrunch’s Ingrid Lunden previously reported. Freepik CEO Joaquin Cuenca Abela hinted at the company’s possible plans for EyeEm at the time, saying it would look at how to bring more AI into the equation for creators on the platform.

As it turns out, that meant licensing their work to train AI models.

EyeEm’s updated terms and conditions are now as follows:

8.1 Granting of rights – EyeEm Community

By uploading Content to the EyeEm Community, you grant us a non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, create derivative works of, communicate to the public and/or promote your Content.

This includes, in particular, the sublicensable and transferable right to use your content for the training, development and improvement of software, algorithms and machine learning models. If you do not agree, you should not add your content to the EyeEm Community.

The rights granted to your Content in this Section 8.1 remain valid until its complete deletion from the EyeEm Community and partner platforms in accordance with Section 13. You can request deletion of your Content at any time. The requirements for this can be found in Section 13.

Section 13 describes a complicated deletion process that begins with deleting photos directly — which, the company notes, would not affect content previously shared on EyeEm Magazine or social media. To delete content from EyeEm Market (where photographers sold their photos) or other content partner platforms, users had to send a request to [email protected], providing the content ID numbers of the photos they wanted deleted and specifying whether those photos should also be removed from their account or only from EyeEm Market.

Notably, the notice states that these deletions from EyeEm Market and partner platforms could take up to 180 days. Yes, that’s right: requested deletions can take up to 180 days, while users have only 30 days to opt out. That leaves manually deleting photos one by one as the only realistic way to opt out in time.

Worse still, the company adds:

You hereby acknowledge and agree that your permission for EyeEm to market and license your Content in accordance with Sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all Partner Platforms within the time frame specified above. All license agreements concluded before complete deletion, and the usage rights granted thereby, remain unaffected by the deletion request or the deletion itself.

Section 8 details the licensing rights around AI training. In Section 10, EyeEm informs users that if they delete their account, they forfeit their right to receive payouts for their work – something users may consider doing to prevent their data from being fed to AI models. Gotcha!

EyeEm’s move is an example of how AI models are being trained on users’ content, sometimes without their explicit consent. Although EyeEm offered an opt-out process, any photographer who missed the announcement would have lost the right to determine how their photos could be used in the future. Given that EyeEm’s standing as a popular Instagram alternative had declined significantly over the years, many photographers may have forgotten they ever used it in the first place. They might also have missed the email entirely if it landed in a spam folder.

Those who did notice the changes were upset that they received only 30 days’ notice and no option to bulk delete their posts, which makes opting out all the more painful.

Requests for comment sent to EyeEm were not immediately answered. However, given the 30-day deadline on this countdown, we decided to publish before hearing back.

This type of underhanded behavior is why users today are considering a move to the open social web. The federated platform Pixelfed, which runs on the same ActivityPub protocol that powers Mastodon, is leveraging the EyeEm situation to attract users.

In a post from its official account, Pixelfed announced: “We will never use your images to help train AI models. Privacy first, pixels forever.”
