Misinformation, disinformation, and post-truth “facts” (variously called “lies”, “fake news”, and everything in between) are worrisome: their creeping presence is undeniable to anyone who has spent some time on the Internet or a smartphone with even a modicum of media literacy. The content the internet provides so freely and abundantly carries the ever-present danger of contamination by intentional, or accidental, wrongness. The topic at hand can be as trivial as cat pictures or as emotionally and morally charged as the exact current events surrounding refugees at EU or U.S. borders. At worst, politically “coloured”, “tampered”, and “massaged” information causes massive damage. In other words, free and open information can turn very quickly into manipulated and manipulative information. The tricky part is that we do not know very well when to be especially vigilant while reading or watching content online.
Given this rich but camouflaged body of evidence, which is more apparent to some than to others, it is no wonder that fact-checking initiatives are widespread in the back offices of any news outlet that values its reputation. Such efforts are dedicated, organized, and professionalized around a single, focused conviction: facts are only facts after having been checked. Their staff know all the tricks of the trade for verifying facts by their provenance, sources, details, and other telling markers. However, when such brave ventures crumble completely, it is probably time to set the heavy artillery aside for a minute and look back at where misinformation truly originates: our minds, which house many good things, but also our deepest and rawest emotions.
The above scenario is exactly what took place in recent months, unleashed by an implosion of misinformation-laden misconduct at the German weekly Der SPIEGEL. An international icon of quality journalism, it was renowned for housing the world’s largest fact-checking operation, which is seriously impressive. It invested money and sweat not only in the size of its staff (70+) but also in protecting its autonomy and constructing solid guidelines. However, it all unraveled in the affair surrounding a star journalist in his prime, Claas Relotius, who had fabricated his articles to the point of fantasy fiction. Among his oeuvre, the most celebrated pieces were about Trump’s America, which he painted as “redneck” America – a gun-toting, intolerant, anti-immigrant, and irrationally religious nation. His skillful art of misinformation was promoted and, ultimately, praised and lauded.
Fittingly, the SPIEGEL leadership and German journalist colleagues have reacted with a mixture of public regret, embarrassment, and due self-reflection. However, there could not have been a more poignant analysis than the one penned by James Kirchick, an American journalist whose reaction piece was translated into German and published in the Frankfurter Allgemeine Zeitung. His view is understandably wary of a continental brand of liberal-elitist perception of Americans that all too often seeps into mundane and professional discourse. He traces the Relotius scandal back to a collective case of “motivated reasoning” in Der SPIEGEL’s editorial teams, enabled at its core by a European “tradition of distorted images” of America and, lastly, a “reflex-like anti-Americanism” that found its ultimate affirmation in the election of Donald Trump. This verdict reflects the experience of veiled prejudice that even the most educated cannot escape, and it leaves us with a wistful note on the blinding effect of emotions (here: superiority) on our reception of truth – an effect so toxic, so acidic, that it dissolves the institutional bricks and mortar guarding the practice of anti-misinformation.
How do we fight misinformation? How do we counter our own nature when it overrides our best intentions? Dr. Sarah De Nigris, a network-science physicist at the University of Koblenz-Landau, works on the topics of e-Democracy and misinformation. She argues that the internet doesn’t exactly make misinformation worse but simply – to be precise – more visible. The “making it worse” part is therefore hard-wired into people’s minds, rather than being a product of opportunistic website algorithms or troll overlords. Research has repeatedly shown that we jump at sinister kinds of stimulation, in which our brain delights with an enthusiasm rather close to squealing glee. These dark sides of human nature have been vividly exposed by psychologists.
Monitoring how facts are disseminated, and how they are created by one another in society, is without a doubt crucial for preventing misinformation. However, we must also understand the human role in wanting to believe and spread misinformation: it is only natural to seek affirmation of one’s beliefs. In the long term, a truly realistic fact-checking system will have to account for the subjectivity of truth and the relativity of fact. For example, a leftist partisan’s narration of an event will never align with a rightist’s, which creates conflict in classifying facts. We have a long way to go, and the waves of emotion strike harder in the ever-expanding tides of information. We begin to face this challenge properly only by acknowledging the greatest source of misinformation: not bots, not the Internet, not social media, but ourselves.
The Co-Inform Research Project brings together a multidisciplinary team of scientists and practitioners to combat misinformation. Beyond building tools that help humans automate the detection and correction of disinformation, it also aspires to shed light on motives and mindsets. Instead of the glum prospect of a post-truth era, we take a more optimistic view: we cannot force consensus, but we can strive for transparency.
The Co-Inform project is co-funded by Horizon 2020 – the Framework Programme for Research and Innovation (2014-2020)
H2020-SC6-CO-CREATION-2016-2017 (CO-CREATION FOR GROWTH AND INCLUSION)
Type of action: RIA (Research and Innovation action)
Proposal number: 770302