EU Digital Services Act: journalistic content must be protected from interference by online platforms
Updated (03/12/2021)
The European Federation of Journalists (EFJ) welcomed the proposed EU Digital Services Act as a long overdue and crucial tool to create a safer, fairer and more accountable online environment. The journalists’ community believes that the DSA must be stronger in order to guarantee a digital media ecosystem based on trust and audience engagement, in particular in the fight against disinformation. Today the platforms determine who sees what and when, based on their content recommendation systems, algorithms and terms and conditions. The power of the big platforms and gatekeepers has contributed to the market failure we face in journalism. Whether the digital ecosystem offers an enabling environment for independent professional journalism will depend on a level playing field and fair enforcement of digital competition rules.
- The DSA must set high transparency standards for all online platforms regarding algorithmic decision-making processes and content recommendations.
- The EFJ welcomes the fact that the DSA seeks to make ‘Big Tech’ accountable to public authorities through new transparency and due diligence obligations, including for the decisions they make to remove or restrict access to content (free speech protection).
- The EFJ welcomes the transparency obligation regarding online advertising and the natural or legal person behind it.
- The EFJ welcomes the obligation on online platforms to be transparent about their use of automated content moderation, indicators of accuracy and any safeguards applied.
“As long as up to 80% of advertisement revenue goes to the big platform providers, the future of independent journalism is at great risk.”
- The EFJ regrets the fact that the DSA does not sufficiently set limits on “Big Tech” business models based on the massive collection of personal data, profiling and targeted advertising.
- The EFJ regrets that the DSA does not address the excessive power of “Big Tech” over information flows (in addition to content moderation rules, we need rules to open the markets to new platforms and to multiply the channels of public discourse and journalistic content).
- Fair and non-discriminatory distribution of all digital press and publications on gatekeeper platforms must be ensured.
- Online platforms monetise content produced by journalists, yet these journalists do not receive their share of the revenues. The DSA should put forward concrete proposals that strive for an equitable share of revenues and promote fair redistribution systems.
“The new law must be an enabler, not a roadblock, for media freedom and freedom of expression. The removal of content should not be left entirely to the platform companies. While the EFJ does not support a blanket media exemption, as that could cover captured media or media not bound by ethical standards and self-regulatory bodies such as press and media councils, it advocates both strengthening the terms and conditions set up by platforms and their complaint-handling systems, and exempting those media that are already committed to self-regulation.”
Under the DSA, service providers would be expected to search for and delete any type of potentially illegal content under EU and national law. Given the plurality of and divergences among national laws regulating freedom of expression, companies are likely to play it safe and ban a wider range of content than would be strictly necessary and proportionate. This undemocratic system of corporate censorship must be prevented in the future legislation.
- The DSA must make sure that providers’ terms and conditions are based on fundamental rights, including media freedom (Art. 12), and Article 17 on complaint-handling systems must be strengthened accordingly. Article 12 should state: “Terms and conditions shall not restrict freedom of the media as enshrined in Article 11 of the Charter.”
- A stronger DSA should protect online journalistic content from interference by online platforms. Journalistic content regularly gets removed, and journalists’ accounts blocked, by online platforms without any prior warning. By submitting to nation- or region-wide self-regulation by a press or media council, journalists and editorial media commit to common journalistic ethical standards and subject themselves to a complaint-handling system. Where such self-regulation exists, this chosen commitment must be honoured in the DSA as a fundamental expression of societal responsibility that makes additional regulation superfluous. In the light of the necessary independence of journalists and editorial media, journalistic self-regulation must take priority over external regulation. The DSA should therefore provide a legal exception for them.
- Online platforms shall notify producers of editorial content at least 45 days in advance of any algorithmic changes that affect distribution. Sudden and unilateral deletion of journalistic content must be sanctioned through the mandatory independent auditing exercise.
- Audience engagement is becoming an essential tool for journalists and media to create trust and transparency. Gatekeeper platforms must not be allowed to entrench their control over the formation of opinion online.
- The notice-and-take-down mechanism focuses strictly on illegal content, as defined by national and European laws. Given the divergences among national laws regulating freedom of expression noted above, companies are likely to over-remove content. Journalistic content cannot be unilaterally treated as illegal content by intermediaries. Online platforms will need a “fast-track appeals process” for journalists and should be held to account by the judicial services and independent regulators for the arbitrary removal of journalistic content.
- The obligations on recommender systems (Article 29) should apply to all platforms regulated under the DSA, not only Very Large Online Platforms (VLOPs). Article 29 should require VLOPs to unbundle hosting and content curation, and to provide access to third-party recommender systems in order to guarantee users’ choice, freedom of expression and media pluralism.
- The EFJ demands clarification of the criteria for qualifying as a “trusted flagger” (Art. 19). Journalists’ unions and associations should be able to qualify as trusted flaggers. When it comes to IP rights, trusted flagger status should be granted to collective management organisations, rights holders’ associations, and journalists’ unions and associations.
- Member States should make sure that the national independent regulatory authorities and bodies for the media are adequately involved in the enforcement and oversight of the DSA.