European Federation of Journalists

Digital Services Act: EU agreement reached


On 23 April, the European Union's co-legislators (the European Commission, the European Parliament and the Council of Ministers, representing national governments) reached a provisional agreement on the Digital Services Act (DSA), a new set of rules for online platforms, search engines, online marketplaces and other significant providers of digital services. The DSA will fully enter into force in the first quarter of 2024; for very large online platforms, it will apply as early as four months after publication of the final text in the EU Official Journal.

The European Federation of Journalists (EFJ) welcomed the agreement as a long-overdue set of rules binding platforms to the transparency and accountability regimes so badly needed in today’s digital ecosystem, but is very disappointed that the provisions on the terms and conditions of intermediary service providers (Article 12) were not strengthened.

“The weakening of the European Parliament’s proposal to ensure a binding application of fundamental rights, including media freedom, in the terms and conditions of very large online platforms is a great disappointment,” said EFJ director Renate Schroeder. “However, we cannot give a final assessment as there is no final text available yet. And, as we know, the devil lies in the details, in particular when it comes to regulating big tech at European level,” she added.

The DSA creates a set of transparency and accountability obligations for tech companies, forcing them to properly assess and mitigate the harms their products can cause. Both the assessments and the mitigation measures can be scrutinised by independent auditors and external researchers.

The DSA updates and streamlines the existing notice-and-action system for illegal content, as defined by the national laws of the EU Member States. It also establishes the category of trusted flaggers, experts nominated by national authorities whose notices platforms must react to promptly.

On content moderation 

Along with other civil society organisations, the EFJ spoke out against any kind of ‘media exemption’ that could have given media outlets a free pass from content moderation even when they spread disinformation. Media should be held equally accountable when they disinform. At the same time, the EFJ has insisted on legally binding guarantees that would prevent internet platforms from arbitrarily and unilaterally deleting journalistic content. We have therefore pushed all along to strengthen Article 12 and to ensure that platforms’ terms and conditions are bound to fundamental rights and media freedom, and in our view the new wording lacks such guarantees.

(Agreed text, Article 12(2): “Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.”)

On complaint systems and user appeals

Internal complaint systems now include an additional opportunity to access an out-of-court dispute settlement body, a measure advocated for by the EFJ and many other civil society organisations. However, the complaint procedures foreseen in the DSA may take too long should platforms erroneously block entire apps or accounts belonging to media outlets, or remove editorial content.

On the crisis response mechanism

A crisis response mechanism was added only recently, to respond to emergencies like the war in Ukraine. It would enable the European Commission to mandate very large online platforms to take specific actions in a crisis, such as taking down war propaganda. After intense lobbying by civil society, including the EFJ, the original text was adjusted to guarantee human rights standards in such emergency situations. It now includes a maximum time frame of three months. The emergency powers can only be used upon a recommendation of the board of national authorities, adopted by a simple majority. Three months after the crisis, the EU executive will have to report to the Parliament and the Council on any actions taken under the measure.

On risk management

The DSA introduces stricter rules for very large online platforms (VLOPs), those with more than 45 million users in the EU, because of their enormous influence. VLOPs will be required to conduct regular assessments of systemic risks such as disinformation, deceptive content and revenge porn, and to implement appropriate mitigation measures subject to independent audits. VLOPs that fail to mitigate these risks face fines.

On transparency obligations

The new rules also include a series of transparency obligations for promoted content, which must be clearly labelled as such. Targeted advertising based on minors’ personal data, and profiling based on sensitive data such as political views and religious beliefs, will be forbidden.

Some of the DSA’s transparency provisions are limited in scope, notably with regard to the brand attribution rule advocated for by the European Broadcasting Union (EBU). We agree with the EBU that when users access media content through social networks, news aggregators or search engines, they should be able to easily identify who bears the editorial responsibility. When platforms fail to attribute content to its source, audiences are deprived of an essential element for judging the information they see and hear. Under the provisional agreement, however, only online marketplaces will have to fulfil this important requirement.

On recommender systems

All platforms will have to explain how they personalise content for their users via their recommender systems. Very large online platforms will also have to provide an alternative recommender system not based on profiling. We believe the text fails to oblige all platforms to provide access to third-party recommender systems, which we had advocated for in order to guarantee users’ choice, freedom of expression and media pluralism.

On enforcement

While national authorities will supervise smaller platforms, the Commission will have exclusive competence over very large online platforms. To finance this, the EU executive will charge the platforms a supervisory fee proportional to the size of the service, not exceeding 0.05% of its worldwide annual net income. Micro and small enterprises are exempted from a series of obligations, such as the traceability of traders, notification of criminal offences, transparency requirements, a system for handling complaints, and out-of-court dispute settlement.