
EU Investigates Meta Over Addiction and Safety Concerns for Minors

Meta is in trouble again for its methods (or lack thereof) of protecting children. The European Commission has opened formal proceedings to determine whether the owner of Facebook and Instagram violated the Digital Services Act (DSA) by contributing to children’s social media addiction and failing to guarantee them a high level of privacy, safety, and security.

The Commission’s investigation will in particular examine whether Meta properly assesses and acts against the risks posed by its platforms’ interfaces. The Commission fears that their designs “could exploit the weaknesses and inexperience of minors and cause addictive behavior and/or reinforce the so-called ‘rabbit hole’ effect. Such an assessment is required to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights.”

The proceedings will also examine whether Meta takes the necessary steps to prevent minors from accessing inappropriate content, has effective age verification tools, and provides minors with straightforward, strong privacy protections, such as safe default settings.

The DSA sets standards for very large online platforms and search engines (those with 45 million or more monthly users in the EU) such as Meta. The designated companies’ obligations include transparency in advertising and content moderation decisions, sharing their data with the Commission and investigating risks posed by their systems in areas such as gender-based violence, mental health and protection of minors.

Meta responded to the formal proceedings by pointing to features such as parental supervision settings, quiet mode, and automatic content restrictions for teens. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” a Meta spokesperson told Engadget.

However, Meta has consistently failed to prioritize the safety of young people. Previous alarming incidents include Instagram’s algorithm suggesting content that discusses child sexual exploitation, and claims that the company designs its platforms to addict young people while promoting psychologically harmful content, such as material encouraging eating disorders and body dysmorphia.

Meta is also known to serve as a misinformation hub for people of all ages. The Commission already opened formal proceedings against the company on April 30, citing concerns about misleading advertising, researchers’ access to data, and the lack of an “effective, real-time tool for civic discourse and third-party election monitoring” ahead of the European Parliament elections in June, among other worries. Earlier this year, Meta announced that CrowdTangle, which has publicly shown how fake news and conspiracy theories spread on Facebook and Instagram, will shut down completely in August.
