EU Says Meta Violates DSA by Failing to Keep Children Off Facebook and Instagram

The European Commission announced on April 29, 2026, that it has preliminarily found Meta's Instagram and Facebook in breach of the Digital Services Act (DSA) for failing to diligently identify, assess, and mitigate the risks of children under 13 years old accessing its platforms. The findings, published in an official Commission press release, conclude that Meta's age-enforcement measures are largely ineffective — children can create accounts on both platforms simply by entering a false date of birth, with no meaningful verification mechanism in place to confirm whether that information is accurate.

The preliminary findings come roughly two years after the Commission formally opened proceedings against Meta on May 16, 2024. Meta now has the opportunity to respond before a final decision is issued. If the Commission's findings are confirmed, Meta could face fines of up to 6% of its total worldwide annual turnover. With Meta reporting nearly $165 billion in revenue last year, a maximum fine under the DSA could approach $10 billion.

What the Commission Found: Ineffective Age Checks and Ignored Scientific Evidence

At the core of the Commission's preliminary findings is a straightforward but damning observation: Meta's own terms of service set 13 as the minimum age for both Facebook and Instagram, yet the company has no reliable system to enforce that rule at the point of sign-up. Children can bypass the age requirement simply by inputting a false birth date when registering. The Commission characterized Meta's risk assessment on this issue as incomplete and arbitrary, accusing the company of underplaying the scale of the problem.

According to Euronews, the Commission found that Meta had "disregarded readily available scientific evidence" indicating that younger children are particularly vulnerable to harms from services like Facebook and Instagram. The Commission's ongoing investigation also covers risks stemming from the design of Facebook's and Instagram's online interfaces, which may exploit the vulnerabilities and inexperience of minors, potentially leading to addictive behaviour and reinforcing so-called "rabbit hole" effects — where algorithmic recommendations pull users deeper into increasingly narrow and potentially harmful content loops.

The Commission also found that Meta did not take sufficient action to remove users who had already gained access to the platforms but were subsequently identified as being younger than 13. The European Commission estimates that roughly 10–12% of children under 13 in Europe are using Instagram and Facebook — a figure that contradicts Meta's own internal assessments of the problem's scale.

Reporting Minors Is a Seven-Click Obstacle Course With No Follow-Through

Beyond the sign-up loophole, the Commission took particular aim at Meta's internal tool for reporting suspected underage users. According to RTÉ, the Commission described the reporting mechanism as "difficult to use and not effective," noting that it requires up to seven clicks just to access the reporting form — which is not automatically pre-filled with the reported user's information. Even when a report is successfully submitted, the Commission found that there is often no proper follow-up: the reported minor can simply continue using the service without any further check or review.

This combination — easy account creation, a cumbersome reporting tool, and inadequate follow-up — means that the practical effect of Meta's age policies is close to zero for users determined to misrepresent their age. The Commission's formal proceedings were based on a preliminary analysis of a risk assessment report submitted by Meta in September 2023, Meta's replies to formal requests for information, publicly available reports, and the Commission's own analysis.

Meta Disputes the Findings, Calls Age Verification an Industry-Wide Problem

Meta pushed back against the preliminary findings, maintaining that it has systems in place to address underage access. A Meta spokesperson said: "We're clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age."

The company also framed age verification as a challenge that extends beyond any single platform. In a statement reported by Euronews, a Meta spokesperson said: "Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue." The company added that it would have more to share the following week about additional measures rolling out soon.

The Commission's position, however, is that these stated intentions do not translate into effective enforcement. As Henna Virkkunen, Executive Vice President at the European Commission, stated: "Instagram and Facebook are doing very little to prevent children below this age from accessing their services."

Virkkunen also addressed the broader principle at stake under the DSA: "The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children."

Context: A Broader EU Crackdown on Child Safety Online

The action against Meta is not an isolated enforcement move. It is part of a wider regulatory push by the European Commission to hold large online platforms accountable for the safety of younger users. The Commission has simultaneously been investigating TikTok over addictive design features. Last month, the EU said four pornographic platforms — including Pornhub — were allowing children to access adult content in breach of digital rules, signalling that child safety enforcement is a sustained priority rather than a one-off intervention.

On April 15, European Commission President Ursula von der Leyen told social media platforms there were "no more excuses" for not protecting children online and announced that the EU's own age-verification app is technically ready for rollout. The announcement positions the EU not merely as a regulator issuing fines, but as an institution actively building enforcement infrastructure to back up its legal requirements.

The DSA itself sets a high standard for large platforms. It obligates them to identify, assess, and mitigate systemic risks — including risks to minors — and the 2025 DSA Guidelines on the protection of minors specifically identify age estimation and age verification as necessary and proportionate measures platforms should be implementing. The Commission's preliminary findings against Meta suggest that, nearly two years after formal proceedings were opened, the company has not yet met that bar.

For users and families, the practical implications are significant. Platforms like Instagram and Facebook are deeply embedded in the daily lives of teenagers and, increasingly, younger children. The health and developmental risks associated with early and unmonitored social media use — including exposure to age-inappropriate content, addictive design patterns, and the psychological effects of social comparison — are areas of growing concern among researchers and policymakers alike. The Commission's finding that Meta disregarded scientific evidence on the particular vulnerability of under-13s adds weight to those concerns at a regulatory level.

What Happens Next

The preliminary findings are not a final ruling. Meta has the right to respond to the Commission's conclusions before a final decision is issued. The company has indicated it will engage with the Commission and plans to announce additional protective measures in the near term.

If the Commission proceeds to a final finding of non-compliance, Meta faces fines of up to 6% of its total worldwide annual turnover under the DSA. Based on Meta's reported revenue of nearly $165 billion last year, that ceiling sits close to $10 billion — a figure significant even for a company of Meta's scale.

The Commission's investigation into the design of Facebook's and Instagram's interfaces — specifically whether those interfaces exploit the inexperience of minors to drive addictive behaviour — remains ongoing and could result in additional findings beyond those announced today.

With the EU's own age-verification app described as technically ready and the Commission publicly stating there are "no more excuses," the regulatory pressure on Meta and other large platforms operating in Europe is likely to intensify in the months ahead. Whether Meta's forthcoming measures will be sufficient to satisfy the Commission's requirements remains to be seen.

Why This Matters for Your Health and Digital Wellbeing

The EU's findings against Meta are a reminder that the digital environments children — and adults — inhabit are not neutral spaces. Platforms designed to maximise engagement can work against focus, mental health, and productive habits, particularly for younger users who have fewer psychological tools to manage those pressures. Understanding who controls your digital environment, and how, is increasingly a personal health issue as much as a policy one. Staying informed about how major platforms operate is a meaningful first step toward making better choices about where you spend your attention.
