
EU Regulators Are Investigating Meta Over Child Safety Concerns

European Union (EU) regulators have confirmed another investigation into Meta over concerns the social media giant may have breached online content rules on child safety.

Under the Digital Services Act (DSA), which came into force in the European bloc last year, companies are required to take action on harmful content or face significant fines.

Specifically, Facebook and Instagram are being investigated to determine whether they have “negative impacts” on children’s “physical and mental health.”

On Thursday (May 16), the European Commission confirmed that it had opened a formal procedure, with the EU executive body also concerned that Meta was not doing enough on age-assurance and verification methods.

“The Commission is concerned that Facebook and Instagram’s systems, including their algorithms, could stimulate behavioral addictions in children and cause so-called ‘rabbit hole effects’,” the statement said.

The EU calls on the technology industry to comply with the DSA

Several major technology companies have been targeted by the EU for possible breaches of the DSA, which threatens financial penalties of up to 6% of annual global turnover.

Meta, which also owns WhatsApp and Threads, insists it has “spent a decade developing more than 50 tools and policies” to protect children. “This is a challenge facing the entire industry and we look forward to sharing details of our work with the European Commission,” a company spokesperson added.

The “rabbit hole effect” mentioned above refers to the way algorithms work in modern social media apps, whereby a user viewing one piece of content is redirected to another piece of content of a similar nature. This can become a pattern during a long scrolling session or through repeated suggestions to watch content.
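The dynamic described above can be illustrated with a toy sketch. The catalog, tags, and Jaccard-similarity rule below are illustrative assumptions, not Meta's actual recommendation system; the point is only to show how always recommending the most similar unseen item can narrow a feed step by step:

```python
# Toy sketch (NOT Meta's actual system) of how similarity-driven
# recommendation can narrow a feed into a "rabbit hole".

# Hypothetical catalog: each item is described by a set of topic tags.
CATALOG = {
    "clip_a": {"fitness", "diet"},
    "clip_b": {"diet", "weight-loss"},
    "clip_c": {"weight-loss", "extreme-diet"},
    "clip_d": {"cooking", "travel"},
    "clip_e": {"travel", "music"},
}

def similarity(tags_a, tags_b):
    # Jaccard overlap: shared tags divided by total distinct tags.
    return len(tags_a & tags_b) / len(tags_a | tags_b)

def next_item(last_item, seen):
    # Recommend the unseen item most similar to the last viewed one.
    candidates = [i for i in CATALOG if i not in seen]
    return max(candidates,
               key=lambda i: similarity(CATALOG[last_item], CATALOG[i]))

def scroll_session(start, steps=3):
    # Simulate a scrolling session driven purely by similarity.
    seen = [start]
    for _ in range(steps):
        seen.append(next_item(seen[-1], seen))
    return seen

# A session that starts at general "fitness" content drifts toward
# progressively narrower "extreme-diet" content.
print(scroll_session("clip_a"))
# → ['clip_a', 'clip_b', 'clip_c', 'clip_d']
```

Each step only looks one item back, so the feed follows a chain of local similarities; no single recommendation is alarming, but the chain drifts from general fitness content toward extreme dieting, which is the pattern regulators describe.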

In the UK, communications regulator Ofcom is also closely monitoring how the technology works; algorithms that spread harmful content are a particular cause for concern.

The regulator is preparing to enforce the Online Safety Act after finding that many younger children use social media accounts, sometimes with their parents’ knowledge, even though the minimum age for users is set at 13.

