Meta slapped with child safety probe under sweeping EU tech law

Technology
Thursday, May 16th, 2024 4:56 pm EDT

Key Points

  • EU Investigation Initiated: The European Union has launched a significant investigation into Meta, the parent company of Facebook, over alleged breaches of the bloc’s strict online content laws concerning child safety risks. The investigation, initiated by the European Commission, focuses on potential behavioral addiction in children and “rabbit-hole effects” on Meta’s Facebook and Instagram platforms. Concerns also extend to age verification and privacy risks associated with Meta’s recommendation algorithms.
  • Regulatory Scrutiny Under DSA: The investigation is part of broader scrutiny under the EU’s Digital Services Act (DSA), which aims to tackle harmful online content. The EU can impose fines of up to 6% of a company’s global annual revenue for violations. At issue is Meta’s compliance with DSA obligations to mitigate negative effects on the physical and mental health of young Europeans, prompting an in-depth investigation into the company’s child protection measures as a priority.
  • Global Concerns and Legal Action: Meta faces legal action not only from the EU but also from authorities in the United States, where the attorney general of New Mexico is suing the company over allegations that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking. Meta’s response emphasizes its use of advanced technology and preventive measures to address these concerns. Meanwhile, the EU’s separate probe into Meta’s handling of election disinformation remains ongoing, and the bloc has opened infringement proceedings against X (formerly Twitter) over content disinformation and manipulation, reflecting a broader trend of regulatory action against tech giants over user safety.


Meta, the parent company of Facebook, faces a significant investigation by the European Union over potential violations of its stringent online content laws related to child safety. The European Commission, the EU’s executive body, announced its concerns that Meta’s platforms, Facebook and Instagram, might encourage behavioral addictions in children and create “rabbit-hole effects.” Additionally, the Commission is scrutinizing Meta’s age verification processes and privacy risks associated with its recommendation algorithms.

A Meta spokesperson stated that the company has spent over a decade developing more than 50 tools and policies to ensure safe, age-appropriate experiences for young users, and expressed a willingness to cooperate with the European Commission. This investigation stems from a preliminary risk assessment report submitted by Meta in September 2023, which failed to convince the EU regulators that Meta had adequately addressed the risks to young users’ physical and mental health.

The European Commission plans to conduct an in-depth investigation into Meta’s child protection measures, gathering evidence through various means such as information requests, interviews, and inspections. The initiation of this Digital Services Act (DSA) probe allows the EU to implement further enforcement actions, including interim measures and decisions on noncompliance. The Commission may also consider any commitments Meta makes to address these concerns.

Meta, along with other U.S. tech giants, has been under increasing scrutiny since the EU introduced the DSA, a landmark law aimed at curbing harmful content. Companies found in violation of the DSA can face fines of up to 6% of their global annual revenues. So far, the EU has not issued fines to any tech giant under the law.

In addition to the current investigation, Meta is also under EU scrutiny for its handling of election disinformation. In April, the EU launched a probe into whether Meta had taken sufficient measures to combat disinformation ahead of the European Parliament elections. Furthermore, in December 2023, the EU opened infringement proceedings against X (formerly Twitter) for failing to address content disinformation and manipulation.

The EU’s actions are part of a broader trend of increased regulatory scrutiny on tech companies regarding user safety and content moderation. In the U.S., Meta faces similar challenges, with the attorney general of New Mexico suing the company over allegations that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking. Meta has responded by highlighting its use of sophisticated technology and other preventive measures to protect children from online predators.

Overall, the investigation by the European Commission marks a critical moment for Meta as it navigates the complex regulatory landscape aimed at ensuring the safety and well-being of young users on its platforms.

For the full original article on CNBC, please click here: https://www.cnbc.com/2024/05/16/meta-slapped-with-formal-eu-probe-over-child-safety-risks.html