The European Union has initiated a formal investigation into Meta, the parent company of Facebook and Instagram, due to concerns that it is not adequately protecting children on its platforms. This probe could lead to significant fines if Meta is found to be in violation of EU regulations.
The investigation underscores regulators' growing focus on the potentially harmful effects of social media on young users, including the encouragement of addictive behaviors and exposure to inappropriate content.
The European Commission, the EU’s executive body, is examining whether Meta has complied with its obligations under the Digital Services Act (DSA), a sweeping law regulating online platforms. The DSA requires platforms to take measures to protect children, such as preventing access to inappropriate content and ensuring high levels of privacy and safety. Non-compliance can result in fines of up to 6% of a company’s global annual revenue or enforced changes to its operations.
The Commission expressed concern that Facebook and Instagram may exploit the vulnerabilities and inexperience of minors, potentially fostering addictive behavior. It also questioned whether Meta’s methods for confirming users’ ages actually work. “The Commission is also concerned about age assurance and verification methods put in place by Meta,” it stated.
In response, a Meta spokesperson stated, “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
However, a report Meta submitted to the European Commission last September, detailing how its platforms protect minors, did not fully address the regulators’ concerns. Commissioner Thierry Breton remarked, “We are not convinced that Meta has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans. We are sparing no effort to protect our children.”
Meta has faced increasing scrutiny over the impact of its platforms on young users. In recent years, various school districts and state attorneys general in the United States have filed lawsuits against Meta concerning youth mental health, child safety, and privacy issues.
Separately, an investigation by the New Mexico attorney general earlier this month into dangers on Meta’s platforms led to the arrests of three men charged with attempted sexual abuse of children.
Meta’s regulatory challenges extend beyond child protection. The company has repeatedly clashed with EU regulators over its handling of scam advertising, the risk of foreign interference ahead of upcoming EU elections, and the spread of disinformation and illegal content related to the war in Gaza.
The Commission’s investigation highlights the mounting pressure on social media companies to safeguard young users. As the EU continues to enforce stringent online protections for children, the outcome of the probe could have significant ramifications for Meta and the wider technology industry.