European Federation of Journalists

The ethical aspect of “robot” journalism


“Any human being involved in a natural language generation (NLG) process, such as programmers or linguists, must take into account the ethical dimensions governing journalism,” concluded Laurence Dierickx, a specialist in computational journalism, who was invited to the EFJ Digital Expert Group meeting that took place on March 8 in Brussels. She explained that the computer scientists and start-up companies heavily involved in NLG are not held to the same ethical standards as journalists. Moreover, automatically generated content does not always mention that the article was written by a computer, which can confuse readers.

The main outcome of her impressive and detailed presentation was that “robot” journalism is not some vague idea about the future of journalism, but an ongoing development that is growing in Europe and worldwide. From Israeli software that turns text into video to robot TV anchors in Japan, advanced technologies are shaping the dynamics of digital journalism. Dierickx described several existing experiments with computational journalism in mainstream media: Le Monde, in cooperation with the start-up Syllabs, used automatically generated content while covering the French regional elections; the news agency Xinhua announced in 2015 that it would use NLG for sports and financial reports; the Press Association revealed in 2016 that it was considering automated content for business, sports and election coverage. In 2015, the Dutch government funded research on “robot” journalism in partnership with the Dutch Newspaper Publishers Association (NDP). In Germany, automated content is used more and more, but without informing readers.

There is a need for full disclosure: automated content should always be marked as written by “software”, said Dierickx.

The weak aspects of computational journalism include a lack of creativity, analytical skills and a sense of humour, whereas speed and productivity are considered its biggest assets. NLG should not be seen as a potential threat to traditional journalism; instead, by generating routine stories, it may allow journalists to spend more time on in-depth and investigative work. However, robot journalism raises ethical questions, including transparency, liability and copyright.
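To make the idea of “routine stories” concrete, here is a minimal, purely illustrative sketch of template-based NLG in Python; the match data, field names and disclosure wording are invented for this example and do not come from Dierickx’s presentation or from any of the systems mentioned above.

    # Minimal sketch of template-based NLG for a routine sports report.
    # All data and wording here are hypothetical; the production systems
    # used by news agencies are far more sophisticated.
    match = {
        "home_team": "FC Example",
        "away_team": "Sample United",
        "home_score": 2,
        "away_score": 1,
        "venue": "Demo Stadium",
    }

    def generate_report(m: dict) -> str:
        """Fill a fixed sentence template from structured match data."""
        if m["home_score"] > m["away_score"]:
            outcome = f"{m['home_team']} beat {m['away_team']}"
        elif m["home_score"] < m["away_score"]:
            outcome = f"{m['away_team']} beat {m['home_team']}"
        else:
            outcome = f"{m['home_team']} and {m['away_team']} drew"
        story = f"{outcome} {m['home_score']}-{m['away_score']} at {m['venue']}."
        # Full disclosure, as Dierickx recommends: label the text as
        # automatically generated rather than leaving readers to guess.
        return story + " [This report was generated automatically by software.]"

    print(generate_report(match))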

The group agreed about the importance of:

  • training journalists in the use of automated content and data journalism, and in the project management practices used in the computational field;
  • dialogue between journalists, editors and computer and data scientists, as well as with start-ups;
  • transparency, including in the use of algorithms;
  • applying journalists’ ethical standards to automated content and algorithms.

For more information see Laurence’s website.

Photo Credit: Diana Krovvidi