The European Union has officially launched an investigation into TikTok’s compliance with the bloc’s Digital Services Act (DSA).
“Today we open an investigation into TikTok over suspected breach of transparency & obligations to protect minors: addictive design & screen time limits, rabbit hole effect, age verification, default privacy settings,” EU industry chief Thierry Breton wrote on X.
The Commission is concentrating on several key areas, including safeguarding minors, ensuring advertising transparency, granting data access to researchers, and managing risks associated with addictive design and harmful content, according to its press release.
The Digital Services Act (DSA) is the EU’s framework for online regulation and content moderation, and as of Saturday it applies broadly, covering potentially thousands of platforms and services.
However, since last summer, larger platforms designated as very large online platforms, including TikTok, have been subject to additional obligations, including those related to algorithmic transparency and systemic risks. It’s under these stricter rules that the video-sharing platform is now being investigated.
If TikTok is found to have breached the DSA, its parent company, ByteDance, could face a fine of up to 6% of its global annual turnover.
TikTok said it will continue working with experts and industry partners to keep young people on its platform safe, and that it looks forward to explaining this work to the European Commission in detail.
“TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with,” a TikTok spokesperson said.
The EU has not set a legal deadline for concluding the investigation. According to its press release, the duration depends on several factors, including the complexity of the case, the extent to which the company cooperates with the Commission, and the exercise of the rights of defense.