EU Investigates Meta Platforms for Child Safety on Facebook, Instagram
by Ananya Ghosh - May 16, 2024, 4:31 pm

EU regulators are intensifying their scrutiny of Meta Platforms’ social media services Facebook and Instagram over potential breaches of EU online content rules on child safety. The move comes under the European Union’s Digital Services Act (DSA), which aims to hold tech companies accountable for tackling illegal and harmful content on their platforms.

The European Commission announced on Thursday that it has opened an in-depth investigation into Meta Platforms, citing concerns that the company has not adequately addressed risks to children on its platforms. The Commission specifically highlighted worries that the systems and algorithms of Facebook and Instagram may encourage addictive behavior in children and produce so-called ‘rabbit-hole effects,’ in which users are drawn into loops of harmful content.

One of the core concerns raised by the EU executive relates to age-verification methods and ensuring that children cannot easily access inappropriate content. Meta submitted a risk assessment report in September, but regulators remain unsatisfied with the measures the company has taken.

This investigation adds to Meta’s existing challenges in the EU, including scrutiny over election disinformation ahead of the upcoming European Parliament elections. Violations of the DSA can carry substantial fines of up to 6% of a company’s annual global turnover.