After an investigation into Meta's platforms Instagram and Facebook, the European Commission's services have concluded that both breach EU legislation by failing to adequately curb access to their services by minors under thirteen years of age. If the preliminary findings are confirmed, the company could face a fine of up to 6% of its global turnover.
Although the terms of use of these applications do stipulate a minimum age for access, Brussels considers that the measures implemented to enforce these restrictions are ineffective. According to its analysis, Meta does not adequately prevent minors from interacting with the platforms' content: it neither identifies them correctly nor removes them promptly once they have gained access.
Direct criticism of Meta from the EU executive
“Our preliminary conclusions show that Instagram and Facebook are doing very little to prevent children under thirteen from accessing their services,” the European Commission’s technology chief, Henna Virkkunen, said on Wednesday. Her team maintains that the legislation requires platforms to enforce their own rules: “the terms and conditions must not be mere written statements, but the basis for concrete actions to protect users, including children”. These statements reflect growing concern within the European institutions about the role large digital platforms play in protecting minors. For Brussels, it is not enough to establish rules: they must be applied effectively and systematically.
As an example of these infringements, the Commission cites the fact that, when creating an account, children can enter a false date of birth that makes them appear older than thirteen, with no effective controls to verify what is declared. This gap in verification mechanisms, according to the EU executive, shows a lack of diligence on Meta's part.
On top of this, the tool the company provides for reporting the presence of minors on the platform is, according to Brussels, “difficult to use and ineffective”. It takes up to seven clicks just to reach the reporting form, which, moreover, is not pre-filled with the user's information, discouraging its use and reducing its effectiveness.
Lack of follow-up and an “arbitrary” risk assessment
Brussels also warns that, even when a minor is reported for being below the permitted age threshold, there is often no adequate follow-up. As a result, the reported user can continue using the service without any additional verification.
Why does this happen? Brussels explains that the risk assessment carried out by Instagram and Facebook is “arbitrary”, contradicting a broad body of evidence gathered across the European Union. That evidence indicates that between 10% and 12% of children under thirteen access the platforms, a figure that calls into question the effectiveness of the current mechanisms.
Following these conclusions, the office of Executive Vice-President Henna Virkkunen is urging the company to change its risk assessment methodology. The objective is to identify more precisely which risks arise in each of Meta's services and how they manifest specifically within the EU.
Not only that: Brussels also demands that measures be reinforced to prevent, detect, and remove minors under thirteen from the services. In a statement released on Wednesday, the EU executive says Meta “must effectively counteract and mitigate the risks that minors could experience on the platforms, guaranteeing a high level of privacy, security, and protection”.
An expert-backed investigation
Brussels reaches these conclusions after proceedings opened under the Digital Services Act, which led to an in-depth investigation whose findings have now been made public. During this procedure, the EU executive analyzed data, internal documents, and the platforms' responses to various requests for information. The work also drew on the support of civil society organizations and child protection experts from across the European Union, which reinforces the solidity of the conclusions presented.
It should be noted that this is not the only open front against Meta. The Commission maintains other active investigations related to possible breaches in the protection of the physical and mental well-being of users of all ages. These procedures also cover the assessment and mitigation of risks arising from the design of Facebook and Instagram interfaces, which, according to Brussels, can exploit the vulnerabilities and inexperience of minors. This approach points to a broader concern about how technological design can influence the behavior and safety of younger users.

The President of the European Commission, Ursula von der Leyen, remains committed to seeing this push through to the end to protect minors in digital environments. In this context, she has already announced the details of a European application that will allow users to prove their age when accessing online platforms. It will work much like a shop asking for an identity document before selling alcohol. According to von der Leyen, it is a tool that is “very easy to use”: the user downloads the application, configures it with their identity document, and can subsequently prove their age when accessing digital services.
Privacy and technology: the key balance
The EU executive has sought to reassure those who warned of a possible violation of data protection rules. It insists that “no other personal information will be revealed” and that the system will be completely anonymous, with no possibility of tracking users.
The application will be available on multiple devices (phones, tablets, and computers), which broadens its accessibility. Furthermore, as Henna Virkkunen explained, the solution is based on zero-knowledge proofs, a cryptographic technique that allows a claim to be verified without sharing the underlying data.
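To make the concept concrete, here is a toy sketch in Python of a Schnorr-style zero-knowledge proof with the Fiat-Shamir transform: the prover demonstrates knowledge of a secret value without ever transmitting it. This is an illustration of the general technique only, not the scheme the EU application actually uses, and the group parameters are deliberately tiny for readability; real deployments use large elliptic-curve groups.

```python
# Toy Schnorr zero-knowledge proof (illustrative only, NOT the EU
# wallet's actual scheme). The prover shows it knows x with y = g^x
# mod p, while the verifier learns nothing about x itself.
import hashlib
import secrets

# Tiny demo group: p = 2q + 1 with q prime; g generates the
# subgroup of prime order q. Real systems use ~256-bit groups.
q, p, g = 89, 179, 4

def keygen():
    """Secret x; public value y = g^x mod p."""
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)

def challenge(y, t):
    """Fiat-Shamir: derive the challenge by hashing the transcript."""
    data = f"{g}|{p}|{y}|{t}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x, y):
    """Produce a proof of knowledge of x without revealing it."""
    r = secrets.randbelow(q - 1) + 1   # one-time random nonce
    t = pow(g, r, p)                   # commitment
    c = challenge(y, t)
    s = (r + c * x) % q                # response blinds x with r
    return t, s

def verify(y, proof):
    """Check g^s == t * y^c (mod p) without ever seeing x."""
    t, s = proof
    c = challenge(y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x, y = keygen()
proof = prove(x, y)
print(verify(y, proof))                  # a valid proof is accepted
t, s = proof
print(verify(y, (t, (s + 1) % q)))       # a tampered proof is rejected
```

An age check works analogously: the wallet holds a signed birthdate and proves only the statement "age is at least the threshold", so the verifying service never sees the date itself.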
Brussels' ultimate objective is a common approach among the twenty-seven member states to age verification on the internet. It seeks to avoid the fragmentation that would result from a patchwork of national systems, opting instead for a single solution that guarantees efficiency and coherence across the bloc. The European Commission is thus stepping up pressure on large technology platforms to assume their responsibility for protecting minors. The Meta case could become a key precedent in the application of the EU's digital rules, marking a turning point in the regulation of the digital environment.