European Union regulators have formally accused TikTok and Meta Platforms of violating digital transparency and content moderation rules, escalating enforcement actions under the bloc’s landmark Digital Services Act.
The European Commission, the EU’s executive arm, said its preliminary findings indicate both companies failed to give researchers adequate access to public data.
According to the Commission, this lack of access hampers investigations into harmful or illegal content, particularly content affecting minors.
Meta, which owns Facebook and Instagram, is additionally accused of failing to offer users simple and effective mechanisms to report illegal content.
Brussels highlighted that Meta’s existing reporting tools are “onerous” and “confusing,” potentially discouraging users from flagging issues like child sexual abuse material or terrorist content.
The Commission underscored that robust “notification and action” mechanisms are crucial under the DSA for users to report content non-compliant with EU or national laws.
These accusations follow an ongoing review of platforms’ compliance with the DSA, which aims to make online spaces safer and more transparent.
The companies now have an opportunity to review the Commission’s findings, respond in writing, and implement corrective measures.
Failure to comply could result in substantial penalties, including fines of up to 6% of their total annual worldwide turnover.
