
An Interview with Hugo Mercier
Author: Tracie Farrell, Research Associate at the Knowledge Media Institute, The Open University

In mid-March, as the Covid-19 pandemic was already sweeping through Asia and continental Europe, we spoke to Hugo Mercier, a psychologist specialising in how and why human beings came to think and behave the way that they do. Hugo has written two excellent books on the subject: “The Enigma of Reason”, co-authored with Dan Sperber, and more recently “Not Born Yesterday: The Science of Who We Trust and What We Believe”. We spoke to Hugo about his work on reason, and why human beings may be so vulnerable to misinformation (or not, as Hugo argues). Hugo was speaking to us by phone during his daily hour-long walk in France, before the UK (where our interviewer is living) went into lockdown[1]. Now, as lockdown begins to ease, we reflect on the various events and impacts of the virus, and it seems timely to share this interview.

Tracie: Hi Hugo, thanks for taking the time to talk to us! I just want to start by telling you a bit about our project and what we’re trying to achieve with Co-Inform. Our approach has been focusing on trust, transparency and thinking twice. This includes tools to detect and highlight misinformation, as well as educating the public about how to avoid or reflect on misinformation. As you know, we speak a lot about cognitive bias in this process. Some of the most pernicious cases of misinformation are not about false information as such, but are shaped by trust, social relationships and values. How do you see your work fitting in with the discussion on misinformation?

Hugo: The work that I am doing that is most relevant to misinformation is the work portrayed in the second book [Not Born Yesterday]. The main idea is that, on the whole, we are quite good at evaluating information, that we use a variety of cues, such as the competence of the source or the quality of the arguments, to decide who to trust and what to believe. We’re quite good at this. The main issue with the information environment is that we reject too much information rather than accept too much information. We reject a lot of information we should accept. Nowadays, you are going to see a lot of people who will doubt information from a variety of sources saying that this epidemic is bad, that the virus is much worse than the flu and all of this. You’re going to see a lot of people doubt that, and this is going to be a much more important problem than people accepting false information. For instance, if someone accepts that it’s all a conspiracy by the Chinese, it is unclear that it will make them act in a way that is really deleterious, whereas if they reject that the virus is worse than the flu and they do nothing, there you have a problem.

Tracie: This is one of the biggest questions we ask ourselves – what is the real impact of misinformation? I got from your second book that you might think that the actual impact is really not that much, that people are not really believing this stuff that they are reading and sharing.

Hugo: I think it’s not huge, indeed. Either people don’t believe it, or if they do believe it, it is not impacting other opinions. You know, for instance, there are studies I mention in the book about people who believed that Obama was a Muslim, when that rumour was going around in 2008. What the data suggests is that people who already disliked Obama and had a very negative image of him tended to believe the rumour, but that believing the rumour did not make them like him less (even though, obviously, most of these people would have been prejudiced against Muslims). In theory, if they had really evaluated the information and thought, “okay well, who knew. He’s a Muslim. He’s betraying our nation’s religion” or whatever – then they should have come to like him much less than they did before, and that did not seem to have happened. That suggests, on the whole, that the direction goes from pre-existing attitudes making it easier to accept such and such a piece of misinformation that kind of fits with your priors. It doesn’t impact your priors all that much, unless you have a very good source. I think in that case, there was no credible source saying that Obama was a Muslim. So you can see how misinformation could be effective if it were conveyed by a source that’s credible. Even if you look at Trump and say “it’s bad if Trump shares a piece of false information because a lot of people like him”, even then, it’s not clear. For instance, there was another study in which some false statements from Trump were debunked, and that didn’t really change people’s underlying opinion of him. More and more people are kind of skeptical about politicians on the whole. In one survey, people were asked what percentage of statements from politicians they thought were lies, misstatements, exaggerations, or inaccuracies. Even Republicans who like Trump said 40%. It was the same, actually, for Sanders[2]. So, on the whole, people are very skeptical and cynical toward politicians. So, even if the politicians say something false, it’s not clear how much impact it’s going to have. Now, if the head of the CDC says something false, it is possible that this information could have an effect. Thankfully, this rarely happens.

Tracie: There was Trump, at the start of this crisis, saying people should go to work anyway…

Hugo: I think in most of these cases we have to think about the opportunity cost. I don’t think that statement from Trump is going to get people who were planning on staying home to go to work. It’s more of a comfort to those people who were planning on going to work anyway. What’s a shame is that if it had been a different president and he had told people to try and not go to work as much as possible, maybe that would have had a small impact. So it is not so much the damage that is done by misinformation, but all the good that could have come from people spreading and sharing valuable information instead.

Tracie: I am sure people have asked you about Brexit and the election and all of these big moments that people associate with misinformation. Is that still your frame?

Hugo: I would question the premise, in that there is no data to suggest that fake news or anything like this played a major role. Even if it did, it would only be by changing .01 percent of the votes or something small like that. In the situation at hand, the issue is that many people are just not reacting enough because they are not getting sick themselves, or they are young and they know that the virus isn’t likely to affect them too badly. So they don’t respect the request by the government to do social distancing or hand washing or these types of things. It’s extraordinarily hard already to get doctors to wash their hands. So the odds that the public at large is going to all of sudden become hygiene freaks or germaphobes is pretty unlikely.

Tracie: With all of these polarising issues right now, like vaccines, where there is a considerable amount of misinformation out there, you do see impacts, with people changing their vaccination habits…

Hugo: What it comes down to is this. The vast majority of the people are going to trust their doctors in these decisions. And the core of people who don’t, the core of people who are really against vaccination, they were already resistant before.

Tracie: When you are in real life, walking around a community of people and listening to them discuss ideas, you can see the size of the group of people who believe a certain thing or argument, and you can calibrate your position on different things. How do you feel the online world contributes to this, when some of our cues for credibility are obfuscated online? What impact does that have?

Hugo: That’s a big question. I guess it varies because so many people have different online experiences. It is possible to use the internet in a way that is very accurate and will facilitate you having better beliefs. Obviously, it is possible to use the internet in other ways. In a lot of ways the online world has not fulfilled one of its promises, when people thought it would be an epistemological promised land, or that the whole internet would be Wikipedia, but it doesn’t work that well. People who are really careful with their assessments, people who are really information hungry, or good at processing information, they will get even better, because they will have access to so much more information. People who are just bent on confirming their views or wreaking havoc, they can just do it so much more easily over the internet. So it’s not clear. On balance I think it’s going to make things better, but I just think it’s going to take longer for institutions and individuals to adapt to this new informational environment.

Tracie: One of our project goals is to increase people’s awareness of their exposure to misinformation by highlighting it. Is that a realistic goal, in your opinion? To make people aware of certain biases in certain information?

Hugo: I guess this is what Facebook and Twitter are trying to do, providing people with a warning about what they are seeing. This can sometimes backfire in the sense that if you expect to see a tag when information is false, and you don’t see a tag, you might assume that the information is true, even though it might just not have been caught yet by the fact-checkers. So there is always this potential for a small backlash. But on the whole, I think it’s good. I mean, we don’t rely on first-hand experience to acquire any information about the world at large, so we have to rely on information from sources, we have to rely on the mainstream media, and it all works because we do have institutions like the mainstream media that have evolved so that we can by and large trust them to at least provide accurate information.

Tracie: What is the new literacy around information then? What could that look like?

Hugo: Increasingly, knowledge is going to be not knowing facts, but where to find facts. Some people are going to be good at finding information online and figuring out which sources are reliable. These people are going to be at an advantage, and it’s good to try to make everyone’s information environment as easy to navigate as possible, not only to make sure that correct information is available but that people have a good reason to trust it. You know, that it’s provided by sources that have a good record, that it’s transparent, that people can know how to access it, etc.

Tracie: In our research, we asked people to justify why they thought something was true or not and the answer we got most often was just “It felt right to me”, or “it didn’t seem true to me.”

Hugo: Well, that’s how it should be, in a way.

Tracie: Is that their intuition, what you’re calling their intuition, being informed?

Hugo: One of the things I’m arguing is that the main mechanism by which we evaluate information is something I call “plausibility checking”, where you compare the new information against what you already know and anything that doesn’t fit, you will be inclined to reject. Whenever you hear something, you can’t help yourself, as you understand it, you have an intuition about whether it fits with your priors.

Tracie: What I take away from your book is that whether reasoning or intuition is your way of knowing, information helps both.

Hugo: Sure. It can’t hurt.

Tracie: What about those moments when you really need to change your own mind or someone else’s? In your book you said it’s not enough to have an argument; it needs to fit inside of that other person’s epistemic position.

Hugo: The time in which you are most likely to find information that fits with your views is when you’re trying to persuade someone. Or when you’re anticipating having to persuade someone. That often happens when we’re talking about politics. For instance, I might be reading the political news, not so much to get an enlightened view of politics, but to find arguments so that I can shut up my brother-in-law when he disagrees with me. [both laugh]. That can end up giving people the impression that they are much more right than they actually are.

Tracie: What, then, is “epistemic vigilance”? Is it a competency?

Hugo: It’s a set of cognitive skills and most of them are completely intuitive. When we read something, we have an intuition about how plausible it is. When you meet someone, you have an intuition about how trustworthy they are. Everybody has these intuitions, they come online very early in development as you start interacting with others. You start developing these intuitions and you hone them throughout your life by interacting with people and keeping track of who was reliable or not. It’s not something that people explicitly learn. Sometimes we can override our intuitions. Maybe there is a doctor who doesn’t seem trustworthy, but maybe I have heard from a few different people that they had a good experience, so I trust her. There is a large part of intuition that we don’t know exactly how it works, but it seems to be working very well on the whole.

Tracie: What advice do you have for the people trying to do this task of handling misinformation? I mean, I don’t want to put words in your mouth, but it sounds like you’re saying it’s not that big of a problem!

Hugo: I don’t think it’s the misinformation that’s the biggest problem. I think people should focus more on how to get people to trust sources that are on the whole reliable. Sources that are not perfect, but are so good compared to every other source that if everyone just believed them, it would be a massive improvement over the current state. Being able to flag misinformation helps people to trust other sources by contrast. One of the things I’m a bit worried about is this whole angle of saying people are “not vigilant enough”, they are “too gullible”, they “need to develop critical thinking”. What worries me is that this will reinforce people’s tendencies to be too distrustful, too skeptical. We have to trust more. We have to trust the media more, and indeed the people who trust the media more are more knowledgeable. On the whole, most of the mainstream media are reliable. Yes, they might have their biases, but the information they convey tends to be accurate as a rule. The problem is these trust deficits, which we have to try to fill somehow.

Tracie: Helping people to understand what good information looks like might be a different approach. Does that seem sensible?

Hugo: Trying to make it easier for people to appreciate a good piece of information, trying to make the cues that people use to accept a piece of information and realise that the source is reliable and competent more accessible and easier to grasp, I think that would be good. I don’t know how to do it, but it would be good.

Tracie: Well, that’s an area where Co-Inform may have some ideas. And we may come back to you on that question in the future!

Hugo: That’s completely fine. Feel free.

Tracie: Well Hugo, I know you’re outside right now and I don’t want to take up any more of your time. I really appreciate you speaking with us.

Hugo: It was my pleasure and good luck with everything. Let me know what happens with your project!

 

[1] Hugo and our interviewer Tracie Farrell were speaking remotely, as he was outside in the wind. Some comments are paraphrased because of the audio quality. The interview has been checked with Hugo and he has given it his stamp of approval.

[2] Bernie Sanders, one of the Democratic Party candidates for the 2020 US Presidential Election.

