On October 24, 2025, the European Commission announced preliminary findings that Meta and TikTok had breached transparency obligations under the Digital Services Act (DSA). If the findings are confirmed, each platform faces fines of up to 6% of its total worldwide annual turnover.
Background: The DSA Regulation
The DSA is an EU regulation that applies to large-scale digital platforms. It requires social media and search engines to provide mechanisms for reporting illegal content, grant researchers access to platform data, and ensure high transparency in moderation processes.
Key obligations include:
- Providing users with an easy way to report content involving violence, child abuse, or terrorism.
- Ensuring independent researchers have sufficient access to platform data to study societal impacts.
- Avoiding misleading interface designs or barriers to reporting (so-called “dark patterns”).
Regulators emphasized that digital integrity and public trust depend heavily on the transparency of major platforms.
Timeline of Events
Investigation Process
The investigation began in 2024 when the EC launched formal reviews into Meta and TikTok over potential DSA violations.
Initial Findings
On October 24, 2025, the Commission released preliminary findings stating that:
- Meta (via Facebook and Instagram) failed to provide a simple and effective mechanism for reporting illegal content.
- TikTok and Meta allegedly restricted researcher access to publicly available platform data.
Potential Impact
If the findings are confirmed, both companies could face heavy penalties of up to 6% of their total worldwide annual turnover.
Core Issue: Transparency and Research Access
The Commission stated that Meta’s interface design was “confusing and obstructive,” preventing users from easily reporting illegal content. The “Notice & Action” module required by the DSA must be easily accessible — yet barriers were found in practice.
Additionally, researcher access to core platform data was deemed “burdensome” and inadequate, making it difficult to study impacts such as youth well-being, misinformation, and radicalization.
Brief Analysis
European Regulatory Pressure
This action demonstrates that the European Union is becoming increasingly firm in enforcing regulations on Big Tech. DSA violations are not only about algorithms or ads — but also about transparency and social accountability.
Business Risks for Meta and TikTok
Large fines, reputational damage, and stricter oversight could affect the business models of both companies in Europe.
Opportunities for Alternative Platforms and Developers
Smaller, more transparent platforms could gain user trust and attract research collaborations as regulatory scrutiny intensifies.
Impact on Users and Industry in Indonesia
For users in Indonesia, while the investigation is taking place in Europe, its effects could ripple globally:
- Changes in content reporting mechanisms on Facebook/Instagram may be implemented globally.
- For Indonesian developers and researchers, access to major platform data could become more open, or more restricted, depending on future EU rules.
- Advertising and monetization models could shift, impacting local creators reliant on the Meta and TikTok ecosystems.
Conclusion
The European Commission’s findings against Meta and TikTok mark a turning point in global digital regulation: major platforms are no longer just service providers — they bear social responsibility and must uphold transparency.
If these rules are fully enforced, a new era of digital oversight could begin — one where data access, content reporting, and user trust become new global standards.
For the industry, time is running out to adapt.