EU’s findings against Meta
The European Commission’s preliminary decision says Meta’s current systems do not “diligently identify, assess and mitigate” the risk of under‑13s accessing Instagram and Facebook, as required under the Digital Services Act (DSA). Officials say children can open accounts simply by entering a false date of birth, with no effective age verification in place to block them. Meta’s own terms of service state that both platforms are meant for users aged 13 and above, but EU investigators found those rules are not being properly enforced.
Evidence gathered by regulators indicates that roughly 10–12% of children under 13 in the EU are managing to use Instagram and Facebook despite the age limit. The Commission also criticises Meta for failing to quickly identify and remove these underage accounts once they are created, leaving younger children exposed to potentially harmful content and data‑collection practices.
Possible penalties and next steps
Under the DSA, the EU can impose fines of up to 6% of a company’s global annual turnover if it confirms a serious breach, a threat that now hangs over Meta if it does not fix the problems flagged by Brussels. The findings are preliminary, and the Commission will continue its investigation before issuing a final decision or imposing any financial penalties. EU officials say they are prepared to work with Meta on stronger safeguards but insist that meaningful changes to age checks and child‑safety tools must follow.
This case adds to years of regulatory pressure on Meta in Europe, where the company has already faced large penalties over children’s privacy and data use, including a separate fine of more than €400 million linked to teenagers’ Instagram settings under GDPR rules. Analysts note that the EU’s tougher DSA enforcement powers now allow regulators to move faster and hit repeat offenders harder than under earlier data‑protection laws. For broader context on the DSA and its obligations for big platforms, readers can consult the European Commission’s dedicated explainer page on the law, which outlines the new responsibilities around risk assessments, transparency, and child protection.
Meta’s response and child‑safety debate
Meta has pushed back against the EU’s initial findings, insisting that Facebook and Instagram are “designed for users aged 13 and older” and that systems already exist to detect and remove underage accounts. The company argues it has invested heavily in safety tools for teenagers and says it will cooperate with regulators while defending what it calls its existing compliance efforts. However, EU officials say Meta has “ignored available scientific evidence” about the heightened vulnerability of younger children to social media risks and has downplayed the importance of keeping under‑13s off its platforms.
Child‑safety advocates have long warned that underage users are particularly at risk of exposure to harmful content, targeted advertising and data profiling, and they see the EU move as a test case for how strictly the DSA will be enforced against big tech. Experts also highlight that simple age‑gate screens based on self‑declared birthdays are no longer considered adequate, and they point to emerging age‑verification approaches and safety‑by‑design standards promoted by digital‑rights organisations and academic researchers as potential models for platforms. For a deeper dive into these wider debates on online safety and youth, resources from specialist groups such as the UK’s Internet Watch Foundation and UNICEF’s reports on children’s digital rights offer additional background and best‑practice guidance for policymakers and platforms alike.