The European Commission continues to deploy the full range of instruments provided for in its digital services legislation, with particular attention to the protection of minors online. In that context, it announced this Thursday the opening of formal proceedings against Snapchat to determine whether, as it suspects, the platform is exposing minors to contact with sexual predators or to recruitment for criminal purposes.
Under Snapchat's own terms and conditions, users must be at least 13 years old to access the platform. Brussels, however, considers that relying on self-declaration as the sole verification mechanism leaves significant safety gaps. The Commission services point out that this method neither prevents children under 13 from accessing the service nor allows the platform to properly assess whether users are under 17, "which is necessary to guarantee an age-appropriate experience."
Furthermore, the Commission notes that Snapchat may not be offering effective tools for users to report the presence of underage users on the platform, which aggravates the risk of exposure.
Risks of grooming and criminal recruitment
"From grooming and exposure to illegal products to account settings that undermine the safety of minors, Snapchat seems to have overlooked that the Digital Services Act requires high safety standards for all users," declared the Vice-President of the Commission responsible for technological sovereignty, Henna Virkkunen.
According to the Commission's preliminary findings, these practices may be resulting in insufficient protection of minors against users with harmful intentions, including sexual exploitation and recruitment for criminal activities.
The Commission warns that the platform may be allowing adults to pass themselves off as minors, either by concealing their real age at registration or by changing it afterwards. This would point to a lack of effective safeguards against risky contacts, content, and behavior.
Default settings that weaken security
Another focus of the investigation is the default account settings, which, according to Brussels, do not guarantee a sufficient level of privacy, safety, and security.
Among the problems identified: children and adolescents are automatically recommended to other users through the "Find friends" feature, and push notifications remain enabled by default.
In addition, when creating an account, users do not receive clear or sufficient information about privacy and security options, or about how to configure their profiles appropriately.
Dissemination of content about illegal products
EU legislation requires platforms to mitigate the systemic risks arising from their services, an obligation that, according to the Commission, Snapchat may be breaching.
"Its content moderation tools do not appear effective in preventing the dissemination of information that steers users towards the purchase of illegal products, such as drugs, or age-restricted products, such as vapes and alcohol," Commission sources point out.
Brussels likewise considers that the platform does not effectively prevent minors from accessing this type of content, increasing the risks for this group.
The opening of these formal proceedings empowers the Commission to adopt further measures, such as non-compliance decisions or even interim measures. The EU executive will now conduct an in-depth investigation, including further requests for information, interviews, and inspections.
Additionally, Brussels will take over the investigation opened in September 2025 by the Netherlands Authority for Consumers and Markets (ACM) into the sale of vapes to minors on Snapchat, integrating it into these proceedings.
An increasingly active regulatory framework
Since the entry into force of the Digital Services Act (DSA), the Commission has stepped up its supervisory activity, accumulating investigations against large platforms such as X, Shein, and Google.
The objective is to determine whether these companies facilitate minors' access to illicit or harmful content, and to assess their compliance with obligations on privacy, safety, and security.
The Commission's guidelines on the protection of minors state clearly that self-declaration is not a reliable age-verification method, that minors must not be easily identifiable to adults or recommended to them as contacts, and that default settings must offer the highest level of protection.
The Commission's decision is based on the analysis of Snapchat's risk assessment reports for 2023, 2024, and 2025, as well as on the company's response to a request for information sent in October 2025.
In that request, Brussels asked for details of the age-verification system and of the measures in place to prevent access to illegal or age-restricted products.
The Commission has also taken into account information gathered by the ACM in the Netherlands, along with data provided by Germany's Federal Network Agency (BNetzA) and input from academic researchers, civil society organizations, and other public authorities.