Putting the Digital Services Act (DSA) to the test: Can the EU counter the deregulation push from big social media platforms?

2025 opens with maximum tension between big social media platforms and the EU. Many are urging the Commission to act faster and more firmly and to make full use of its powers under the Digital Services Act (DSA). What can the EU do under the DSA?

2025 opens with maximum tension between big social media platforms and the EU: Elon Musk’s interference in European politics, Mark Zuckerberg’s announcement that Meta will scale back its moderation policies, TikTok’s role in Russian interference in Romania’s presidential election, and several ongoing proceedings initiated by the European Commission targeting large platforms’ practices.[1]

In this context, several MEPs and organisations are urging the Commission to act faster and more firmly and to make full use of its powers under the Digital Services Act (DSA).

What are social media platforms’ obligations under the Digital Services Act?

Adopted in 2022, the DSA sets a legal framework for online services that transmit, store or host user-generated content.

In particular, pursuant to the DSA, online platforms are required to provide mechanisms for users to report illegal content and must remove such content in a timely manner when notified. They must also disclose their content moderation policies in their terms and conditions[2] and publish transparency reports describing how they handle notices and illegal content.

The DSA also introduces specific obligations for “very large online platforms” (“VLOPs”), i.e. platforms with more than 45 million monthly active users in the EU, a category that includes Facebook, X, Instagram, TikTok, LinkedIn, Amazon, various Google services, Temu, Pornhub and others.

In addition to the obligations applicable to all platforms, articles 34 and 35 of the DSA require VLOPs to assess the “systemic risks” stemming from the design and functioning of their services and to adopt “reasonable, proportionate and effective mitigation measures” tailored to those risks. Such measures include adapting their algorithmic recommender systems and content moderation processes.

Systemic risks encompass:

  • the dissemination of illegal content through their services,
  • any negative effects on human rights, including the freedom of expression and the freedom and pluralism of the media,
  • any negative effects on civic discourse and electoral processes,
  • any negative effects in relation to public security, gender-based violence, the protection of public health and minors, and physical and mental well-being.

While disinformation does not in itself constitute “illegal content” within the meaning of the DSA, the text underlines that it contributes to the systemic risks listed above.[3]

Are Meta’s policy changes compatible with its obligations under the DSA?

On 7 January 2025, Mark Zuckerberg announced two major changes in Meta’s policies:

  • First, Meta will scrap the fact-checking program it launched in 2016, which relied on more than 80 fact-checking organisations. For now, this change will apply only in the US.
  • Second, Meta will scale back certain content moderation policies. These changes apply worldwide[4] and have already been partly put in place, with Meta replacing the “hate speech” section of its Community Standards with a “hateful conduct” section[5]. This means that users reporting harmful posts, ads or comments will be less likely to see this content taken down.

Are such changes lawful under the DSA?

As regards Meta’s decision to scrap its fact-checking program, this change, if it were implemented in the EU, would not in itself constitute a breach of the DSA. Indeed, the DSA does not oblige platforms to use third-party fact-checking as a content moderation tool.

However, experts warn that abandoning third-party fact-checking could cause a surge in fake news and disinformation on the platform, especially when coupled with the changes to Meta’s moderation policies.

This spread of false information and deceptive content would have negative effects on civic discourse and electoral processes, as well as on the protection of human rights. It is thus highly questionable whether replacing fact-checkers with “community notes” would be an appropriate and effective measure to mitigate these systemic risks.[6]

In this regard, Meta could be violating its obligation to assess and mitigate systemic risks under articles 34 and 35 of the DSA.

What sanctions do social media platforms face under the DSA?

The European Commission is the sole competent authority in charge of the public enforcement of VLOPs’ obligations set out in articles 34 and 35 of the DSA.

It can open formal proceedings against these large-scale platforms to determine whether they have breached their obligations under the DSA. In case of infringement, the Commission can impose fines of up to 6% of a platform’s worldwide annual turnover. It can also order platforms to take measures to address the breaches and impose periodic penalty payments if they fail to comply. As a last-resort measure, the Commission may also request national authorities to seek an order from the competent national court to temporarily restrict access to the service.

Many VLOPs are currently being investigated for alleged breaches of the DSA requirements.

For example, formal proceedings were opened against the online marketplace Temu for allegedly failing to limit the sale of non-compliant products in the EU, for the addictive design of its platform, and for the lack of transparency regarding its recommender system.

X is also under investigation regarding its actions to counter the dissemination of illegal content within the EU, its measures to combat information manipulation on the platform, its transparency efforts and its suspected deceptive user interface design. On 17 January 2025, the Commission requested that the company provide internal documents on its recommender systems and any changes made to them, as part of the investigation into possible manipulation of the algorithm to favour certain narratives or accounts.

Finally, Meta itself is the subject of two investigations. The first relates to deceptive advertising, disinformation, the visibility of political content on its services and its mechanisms for flagging illegal content. The second addresses Meta’s compliance with its DSA obligations regarding the protection of minors.

However, despite the growing number of investigations, the Commission has to date not adopted any non-compliance decision or imposed any sanction.

If the Commission ultimately fails to enforce the DSA swiftly and firmly against big platforms, the DSA will fall short of its goal of safeguarding a “safe, predictable and trustworthy online environment”, a failure that could come at a high cost to democracy in Europe.

 

Notes

[1] For a list of all measures taken by the European Commission, see the ‘Supervision of the designated very large online platforms and search engines under DSA’ page.

[2] Content moderation is defined in Article 3(t) of the DSA as “any activities, whether automated or not […] that are aimed, in particular, at detecting, identifying and addressing illegal content”, including measures taken to address such content, such as demotion, demonetisation, disabling of access or removal.

[3] It is worth noting that the World Economic Forum recently listed misinformation and disinformation as the biggest short-term risk in the global landscape for the second year in a row. In particular, combating climate disinformation is becoming a priority at EU and UN level; see for example the UN Global Initiative for Information Integrity on Climate Change.

[4] See Meta’s Community Standards webpage. The Community Standards are part of Meta’s Terms and Conditions and serve as a guide to what is and is not allowed on the platform.

[5] For example, the Community Standards have been modified to authorise “allegations of mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism and homosexuality”. For more context on the content moderation changes, see this LinkedIn post by Lou Welgryn.

[6] The functioning of X’s “community notes” is currently being probed by the EU Commission.
