As outlined by Vera Jourova in her speech at the EU DisinfoLab’s annual conference last week, there is no magic solution to disinformation. Rather, a cocktail of measures targeting the economic, psychological, legal, technical and educational aspects of disinformation is more likely to make an impact on this moving target and to provide effective, sustainable solutions. On the educational front, the European Commission has just created an expert group on disinformation and digital literacy, with a one-year mandate to assist the Commission in preparing common guidelines for teachers and educators on tackling disinformation and promoting digital literacy through education and training. These guidelines should be published in autumn 2022. The initiative is part of the European Commission’s Digital Education Action Plan (2021-2027) and aims to enhance digital skills and competences for the digital transition.
The creation of such a group comes at the right time, following the recent revelations of Facebook whistleblowers such as Sophie Zhang and Frances Haugen. Ms Haugen, a former product manager at Facebook, recently disclosed internal documents showing that the company had been aware of serious problems caused by the platform for quite some time, but that no action was taken to mitigate them. The Wall Street Journal conducted an investigation based on the documents. It reported that
- content moderation is applied unequally across users,
- the use of Instagram can be toxic for teenagers’ mental health,
- an algorithm change designed to boost user engagement pushed users toward angrier interactions on the platform,
- only a weak response was given to internal flags about drug cartels and human traffickers operating on Facebook,
- anti-vaccine activists used Facebook’s own tools to spread doubts about the Covid vaccination campaign, and the company was unable to stop them, and
- AI is not capable enough to detect some illegal and harmful content, nor to detect users creating multiple accounts.
The investigation also showed an uneven allocation of content moderation resources, favoring Western regions.
Time is up for concrete action to tackle disinformation. This echoes an ongoing discussion within the European Parliament following the JURI Committee’s opinion on the Digital Services Act proposal. The committee suggested an amendment that has caused much ink to flow. Following a Joint Industry Letter communicated to several EP committees working on the DSA, JURI MEPs added a new paragraph to Article 12 of the proposal, the provision establishing an obligation for intermediary service providers to include specific information in their terms and conditions. The amendment seeks to restrict the power of Very Large Online Platforms (VLOPs) over editorial content uploaded by news publishers. It prohibits them to “remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with its terms and conditions, unless it is illegal content”. VLOPs would thus retain competence over illegal content associated with the dissemination to the public of press publications or audiovisual media services, but not over disinformation.
On the one hand, the publishing industry wants to avoid press censorship and prevent tech companies from discretionarily exercising undue power over news content that has already passed an editorial check.
On the other hand, disinformation researchers are alarmed and warn that this exemption will contribute to and increase the spread of disinformation. Why is that so? Because illegal content does not include disinformation; this type of content is only considered harmful content.
Disinformation is as old as the concept of information, but today the phenomenon has taken on a frightening dimension with the internet and social media. It has never been easier or cheaper to spread disinformation. Holding disinformation creators accountable is challenging, given the difficulty of tracking down the original source and determining who exactly writes, edits and publishes it. In addition, there is a demand for alternative narratives in what some call the post-trust era of democracies. These factors have turned this fertile ground into a genuine market for disinformation. According to Diana Wallis, President of the EU DisinfoLab: “Disinformation is like a parasite that if allowed to grow and persist, will eventually kill its host”. Disinformation poisons public debate, driven by a broad range of actors with a broad range of motives and diverse means, including “media”.
Media are a key enabler and spreader of disinformation: from the manipulation of media, or journalists spreading disinformation unintentionally, to buzz/clickbait content boosting a media outlet’s online visibility and growth, to media owned and mandated to spread disinformation, polarizing content or foreign interference content. Exempting from content moderation any organization that publishes regularly on a topic with an intention to inform would allow a considerable share of disinformation providers to continue operating with less hindrance, or even with greater impunity. For instance, Ms Wallis further explains that “the current labelling of Russia Today on Facebook (“as Russia state-controlled media”) would be deleted, or that Newsguard as a leading source of COVID-19 disinformation in French, will remain online, this time without the current label “Visit the COVID-19 Information Center for vaccine resources”.
More attention definitely needs to be paid to this important topic, and better knowledge sharing on how disinformation functions would be welcome. These events show the need to continue and advance research on disinformation, in order to better understand the phenomenon and to determine which measures would be best to adopt while respecting fundamental rights. To be continued.