EU regulators to probe Meta over child safety concerns

European Union (EU) regulators have opened another investigation into Meta over potential breaches of online content rules concerning child safety.

Under the Digital Services Act (DSA), which came into effect in the EU last year, companies are required to take action against harmful content or face significant fines.

Specifically, Facebook and Instagram are under scrutiny to determine whether they are harming children’s physical and mental health.

On Thursday (May 16), the European Commission confirmed that it had opened formal proceedings, citing concerns that Meta’s age verification methods are inadequate.

“The Commission is worried that the systems of both Facebook and Instagram, including their algorithms, may encourage addictive behaviors in children and create ‘rabbit-hole effects’,” the statement read.

EU challenges tech industry to comply with DSA

Several major tech companies have come under EU scrutiny for potential DSA violations, which could result in fines of up to 6% of their annual global turnover.

Meta, which also owns WhatsApp and Threads, maintains that it has implemented numerous tools and policies over the years to protect children. A company spokesperson added: “This is a challenge faced by the entire industry, and we are eager to share our efforts with the European Commission.”

The “rabbit-hole effect” describes how recommendation algorithms on modern social media platforms steer users from one piece of content to ever more similar material. In the UK, regulators are likewise monitoring how such algorithms may promote harmful content.
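The mechanism is easy to demonstrate. Below is a minimal, hypothetical Python sketch of a purely similarity-driven recommender; it is not Meta’s actual system, and the catalogue, topic vectors, and item names are invented for illustration. Because each pick maximizes similarity to the previous item alone, the feed quickly narrows into a single topic cluster, which is the rabbit-hole dynamic regulators describe.

```python
import math

# Hypothetical toy catalogue: item name -> (topic_a, topic_b) interest vector.
# All items and weights are invented for this illustration.
CATALOGUE = {
    "puppy_video":       (0.90, 0.10),
    "kitten_video":      (0.80, 0.20),
    "diet_tips":         (0.20, 0.90),
    "extreme_diet":      (0.10, 1.00),
    "fasting_challenge": (0.05, 0.95),
}

def similarity(a, b):
    """Cosine similarity between two topic vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def recommend_chain(start, steps):
    """Greedily follow the most similar unseen item, step by step."""
    seen = [start]
    for _ in range(steps):
        prev = CATALOGUE[seen[-1]]
        unseen = [k for k in CATALOGUE if k not in seen]
        if not unseen:
            break
        # Each pick maximizes similarity to the previous item only,
        # so the chain drifts deeper into one topic cluster.
        seen.append(max(unseen, key=lambda k: similarity(prev, CATALOGUE[k])))
    return seen

# Starting from one diet clip, every follow-up stays in the diet cluster:
print(recommend_chain("diet_tips", 2))
# -> ['diet_tips', 'extreme_diet', 'fasting_challenge']
```

Real feed-ranking systems blend many signals, but the sketch shows why a strong similarity term, left unchecked, can keep pulling a user toward ever more extreme variants of the same content.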

Ofcom, the UK communications watchdog, is preparing to enforce the Online Safety Act after finding that many younger children use social media despite the platforms’ minimum age of 13.
