EU Opens Formal Investigation Into Snapchat Over Child Safety Under DSA
European Commission Probes Age Assurance, Default Settings, and Grooming Concerns
The European Commission has opened a formal investigation into Snapchat under the Digital Services Act (DSA), focusing on child safety concerns.
Five Investigation Areas
- Age assurance: How Snapchat verifies user ages
- Default account settings: Whether default settings adequately protect minors
- Reporting of illegal content: Effectiveness of reporting mechanisms
- Dissemination of prohibited products: How the platform handles the sale and spread of restricted or illegal goods
- Grooming and recruitment: How the platform addresses grooming and the recruitment of minors for criminal activities
Why Snapchat
Snapchat is particularly popular among younger users, making child safety a central regulatory concern for the platform. The investigation is part of the EU's broader DSA enforcement push against major social media platforms.
What's At Stake
DSA investigations can result in fines of up to 6% of a company's global annual revenue, as well as mandatory operational changes. The Commission has not provided a timeline, but DSA probes typically run for months or longer.
Context
The EU has been aggressively enforcing the DSA against major platforms: Meta (Instagram and Facebook), TikTok, and X all face separate investigations. Snapchat is the latest target in Brussels' systematic scrutiny of social media's impact on children.
Source: The Verge, European Commission