European Federation of Journalists

Common Letter on algorithm transparency and data access on content-hosting platforms


On 16 June, the European Federation of Journalists joined a coalition of civil rights organisations in sending a letter to the European Commission calling for a regulatory framework that imposes transparency and accountability obligations on content-hosting platforms, in the context of the Commission’s “Shaping Europe’s Digital Future” strategy and the upcoming public consultation on the Digital Services Act.


Dear Commissioner Breton, Vice President Jourova,

CC: Commissioner Reynders 

We are writing as representatives of organisations that share your goals of an open, democratic and sustainable society. We also share your excitement at the opportunity this moment presents to shape Europe’s digital future.

Driven by this shared concern for democracy, there is much that can be achieved through the digital agenda that you and your colleagues are leading. In that context, we are writing to ask that you establish regulatory frameworks for auditing the design of automated decision-making systems employed by commercial online content-hosting platforms for such purposes as content moderation, content curation, and the targeting and delivery of advertising.

As the Commission rightly highlights in Shaping Europe’s Digital Future, the integrity of the information environment is fundamental to a robust democracy. Content-hosting platforms are now a primary channel through which a significant proportion of citizens access information, especially younger citizens: half or more of 18- to 29-year-olds in each European country use social media for news daily.1 As the Covid-19 pandemic has underscored, how social media companies design their platforms and algorithms dramatically shapes how many people are reached by online content, and therefore the nature of democratic discourse.

These design decisions translate into real-world consequences for public health and our democracy, including polarisation in political debate, radicalisation, mob violence and attacks on 5G infrastructure driven by mis- or disinformation. As one example, Facebook executives are reported to have known that “64% of all extremist group joins are due to our recommendation tools” and that these algorithmic systems “grow the problem.” In the absence of any external accountability, Facebook did not act on this information.

Despite the significant effect they have in shaping what people in Europe see online, these design decisions are largely made in the dark. The Covid-19 pandemic has made this plain, but the challenges for our democracies were already present. These companies should not have a monopoly over decisions so fundamental to public health, nor should they be the ones to decide what is important for a democracy to know.

We are not addressing here the question of whether platform companies should be held liable for individual pieces of harmful content hosted on their platforms: we believe they should be made accountable for the role they play in actively promoting such content. Research suggests that controversial content is actively driven by algorithms designed to maximise revenues from advertising sales: content that holds users’ attention enables the platforms to gather more data about users, to profile them and to target content accordingly.

The EU’s digital leadership provides a once-in-a-generation opportunity to address the root causes behind the rapid dissemination and amplification of harmful content, and to rebuild trust in the online information environment.

We believe that the Commission should impose new enforceable transparency and accountability obligations on content-hosting platforms. Any regulatory mechanism should have the power and the capacity to:

  • Examine the purpose, constitution, and policies around algorithmic or automated decision-making systems; interview the people who build and interact with different parts of such a system; and observe how people use it.
  • Identify and assess what data was used to train the algorithm, how it was collected, whether it is enriched with other data sources, and whether that data changed over time.
  • Enable data access for third parties (for example civil society organisations, academia, journalists) for public-interest scrutiny. Concretely, this would mean institutionalising privileged data-sharing partnerships and ensuring that content-hosting platforms produce high-quality, workable APIs for data access.
  • Develop, in consultation with relevant stakeholders, including civil society, appropriate guidance for state-of-the-art procedures regarding human rights impact assessment, as recommended by the Council of Europe, as part of human rights due diligence. These procedures should be mandatory with regard to all algorithmic systems with potentially significant human rights impacts.6
  • Enforce proportionate sanctions for breaches of these requirements, including mandated compliance with the transparency requirements as well as financial penalties.
  • Ensure any transparency measures are designed to be in compliance with the GDPR.

It is only by undertaking this sort of inspection that an independent regulator, acting in the public interest, will be able to assess whether platform companies truly are acting responsibly. Moreover, this kind of approach would render unnecessary other proposed solutions, such as laws dictating algorithmic neutrality or content filters, which would have dramatic negative unintended consequences for people’s ability to exercise their freedom of expression and their right to access information.

The EU can lead the world in protecting citizens’ rights and our democracies in a digital age. Algorithm inspection should be an important part of that, and we would welcome the chance to meet with you to discuss how we can support progress on this important matter.

Yours sincerely,

Tanya O’Carroll, Director, Amnesty Tech, Amnesty International

Luis Morago, Campaign Director, Avaaz

AHM Bazlur Rahman, CEO, Bangladesh NGOs Network for Radio and Communication

Orsolya Reich, Senior Advocacy Officer, Civil Liberties Union for Europe

Clara Hanot, Advocacy Coordinator, EU Disinfo Lab

Ken Godfrey, Executive Director, European Partnership for Democracy

Renate Schroeder, Director, European Federation of Journalists

Mira Milosevic, Executive Director, Global Forum for Media Development

Charles Bradley, Executive Director, Global Partners Digital

Henry Tuck, Head of Policy and Research, Institute for Strategic Dialogue

Gaelle Dusepulchre, Permanent representative to the EU, International Federation for Human Rights (FIDH)

Amy Brouillette, Research Director, Ranking Digital Rights

Julian Jaursch, Project Director, Stiftung Neue Verantwortung (SNV)