Brussels, April 29, 2026

EU says Meta fails to protect children on Facebook and Instagram
The European Union has formally accused Meta of insufficiently protecting children on its Facebook and Instagram platforms, violating the bloc’s Digital Services Act (DSA).
Insufficient Age Verification Measures
The EU Commission stated that Meta’s current measures fall short of preventing children under 13 from accessing Facebook and Instagram, despite the platforms’ terms of service setting the minimum age at 13. Officials emphasized that the company has not implemented adequate safeguards to enforce this rule, leaving younger users exposed to potential harm.
Under the DSA, large online platforms like those operated by Meta are legally required to shield minors from content unsuitable for their age. The Commission’s findings suggest Meta has disregarded available scientific research highlighting the heightened vulnerability of younger children to risks posed by social media.
Violation of Digital Services Act
The EU’s accusation centers on Meta’s alleged non-compliance with the DSA, which mandates robust protections for young users. The Brussels-based authority views the company’s failure to act on established research as a significant breach of its obligations under the law.
The DSA, enacted to regulate digital giants, imposes strict requirements on platforms to mitigate risks, including those affecting children’s mental health and well-being. Meta’s current policies, according to the Commission, do not meet these standards.

