Action Plan Against Disinformation
This Action Plan answers the European Council’s call for measures to “protect the Union’s democratic systems and combat disinformation, including in the context of the upcoming European elections”. It builds on existing Commission initiatives and the work of the East Strategic Communication Task Force of the European External Action Service. It sets out actions to be taken by the Commission and the High Representative, with the assistance of the European External Action Service, in cooperation with Member States and the European Parliament.
This Plan includes input received from Member States, including via discussions at the Council in Permanent Representatives Committees I and II, the Political and Security Committee, relevant Council working parties, and meetings of strategic communication and political directors of Ministries of Foreign Affairs. It also takes into account cooperation with the Union’s key partners, including the North Atlantic Treaty Organization and the Group of 7 (G7).
Special Eurobarometer – Democracy & Elections
Democracy is a fundamental principle of the European Union. It is expressed in many ways in how the Union institutions work: not only in the rights of Member States to vote on issues, but also in the rights of citizens to directly elect members of the European Parliament to represent them. This representation, as well as citizens’ participation in decision-making, are key elements of democracy in the EU.
Representation and participation, through free and fair elections and an open, informed and plural political debate, are the cornerstones of a functioning democracy.
This survey was commissioned by the Directorate-General for Justice and Consumers to explore citizens’ opinions and concerns about voting and elections, as well as their satisfaction with various aspects of democracy in the EU.
Journalism & Disinformation
UNESCO works to strengthen journalism education, and this publication is the latest offering in a line of cutting-edge knowledge resources.
In this publication, disinformation is generally used to refer to deliberate (often orchestrated) attempts to confuse or manipulate people through delivering dishonest information to them. This is often combined with parallel and intersecting communications strategies and a suite of other tactics like hacking or compromising of persons. Misinformation is generally used to refer to misleading information created or disseminated without manipulative or malicious intent. Both are problems for society, but disinformation is particularly dangerous because it is frequently organised, well resourced, and reinforced by automated technology.
This handbook therefore is a call to action. It is also an encouragement for journalists to engage in societal dialogue about how people at large decide on credibility and why some of them share unverified information. As with the news media, for journalism schools and their students, along with media trainers and their learners, this is a major opportunity for strong civic engagement with audiences.
Tackling Misinformation in an Open Society
This paper by Full Fact sets out a framework for a risk-based and proportionate response to the problems of misinformation and disinformation in the UK. The realistic goal is not to eliminate misinformation and disinformation, but to build resilience against them.
They argue that immediate action is needed to tackle some urgent problems, notably the UK’s outdated election law. But they also argue that rushing to come up with quick solutions to the full range of issues could do more harm than good: we need to understand the wider issues clearly and design effective and proportionate solutions. Globally, some governments have pressed the panic button, leading them to come up with rushed, dangerous, and illiberal proposals. So far the UK has not. We should continue to work out how an open democratic society can tackle misinformation and disinformation while protecting free speech.
News in social media and messaging apps
The Reuters Institute for the Study of Journalism (RISJ), at the University of Oxford, commissioned exploratory research from Kantar Media to provide a topical supplement to the Digital News Report 2018.
The aim of the research was to provide a qualitative exploration of consumer behaviour, attitudes and motivations surrounding news consumption in social networks and messaging apps, and the pivot away from news on Facebook.
A short guide to the history of ‘fake news’ and disinformation
Misinformation, disinformation and propaganda have been features of human communication since at least Roman times, when Antony met Cleopatra. Octavian waged a propaganda campaign against Antony designed to smear his reputation, taking the form of “short, sharp slogans written upon coins in the style of archaic Tweets.” These slogans painted Antony as a womaniser and a drunk, implying he had become Cleopatra’s puppet, corrupted by his affair with her. Octavian became Augustus, the first Roman Emperor, and “fake news had allowed Octavian to hack the republican system once and for all.”
This learning module, designed to be used by journalists, journalism trainers and educators (along with their students), provides historical context for the analysis of the 21st-century ‘fake news’ crisis. Relevant case studies and a timeline are designed to better inform users about the causes and consequences of ‘information disorder’.
Digital News Report 2018
Reuters Institute’s seventh annual report explores the changing environment around news across countries. The report is based on a survey of more than 74,000 people in 37 markets, along with additional qualitative research, which together make it the most comprehensive ongoing comparative study of news consumption in the world.
Europe remains a key focus, where the report covers 25 countries including Bulgaria for the first time this year, but they also cover six markets in Asia (Japan, South Korea, Taiwan, Hong Kong, Malaysia, and Singapore) along with four Latin American countries (Brazil, Argentina, Chile, and Mexico) and the United States and Canada from North America.
The International Council for Information Technology in Government Administration (ICA), the Organization for Economic Co-operation and Development (OECD), eGovlab at Stockholm University and the Open University have worked together to address the impact that the revolution in the availability of information on the Internet has had on public discourse.
Pillar 1 of the OECD’s Recommendation on Digital Government Strategies, on the need for national governments to create an inclusive, transparent and accountable digital sphere, has been used as the initial guideline. From there, and in view of recent developments regarding digital disruption through misinformation, the main challenges and potential solutions for governments and citizens are examined.
This is an issue with implications for public order and democracy, and it is therefore established that governments have the ultimate responsibility and ability to deal with the causes and consequences of misinformation. This paper argues that they also have the ability to fill the key space between non-profit organisations (such as fact-checkers) and the private sector (social media platforms), and to involve citizens to a greater degree. So, what actions should governments take to fill that crucial space?
A Multi-dimensional Approach to Disinformation
The analysis presented in this Report by the EU High Level Group on Disinformation starts from a shared understanding of disinformation as a phenomenon that goes well beyond the term ‘fake news’. This term has been appropriated and used misleadingly by powerful actors to dismiss coverage that is simply found disagreeable.
Disinformation as defined in this Report includes all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit. It does not cover issues arising from the creation and dissemination online of illegal content (notably defamation, hate speech, and incitement to violence), which are subject to regulatory remedies under EU or national laws. Nor does it cover other forms of deliberate but not misleading distortions of facts, such as satire and parody.
This report is an attempt to comprehensively examine information disorder and its related challenges, such as filter bubbles and echo chambers. While the historical impact of rumours and fabricated content has been well documented, we argue that contemporary social technology means that we are witnessing something new: information pollution at a global scale; a complex web of motivations for creating, disseminating and consuming these ‘polluted’ messages; a myriad of content types and techniques for amplifying content; innumerable platforms hosting and reproducing this content; and breakneck speeds of communication between trusted peers.
The direct and indirect impacts of information pollution are difficult to quantify, and we are only at the earliest stages of understanding their implications. Since the result of the ‘Brexit’ vote in the UK, Donald Trump’s victory in the US and Kenya’s recent decision to nullify its national election result, there has been much discussion of how information disorder is influencing democracies. More concerning, however, are the long-term implications of disinformation campaigns designed specifically to sow mistrust and confusion and to sharpen existing sociocultural divisions using nationalistic, ethnic, racial and religious tensions.
The Co-inform project is co-funded by Horizon 2020 – the Framework Programme for Research and Innovation (2014-2020)
H2020-SC6-CO-CREATION-2016-2017 (CO-CREATION FOR GROWTH AND INCLUSION)
Type of action: RIA (Research and Innovation action)
Proposal number: 770302