Fall of Misinformation Series: Ionica Smeets
Misinformation spreads easily and fast. It gets presented as news, whereas actual news gets dismissed as fake. Conflicting streams of information allow all sides to cherry-pick whatever is most comfortable, boosting degrees of confidence and confusing the deliberation of both politicians and voters. Sometimes the misinformation is created intentionally; sometimes it starts from a misleading graph or an exaggerated press release. From COVID-19 to QAnon, misinformation is on all our minds. What exactly is happening and why? Have we entered a post-truth era? What can we do as a university, and are we doing enough? The Young Academy Leiden approached some of the researchers currently working on the topic.
There is a lot of talk about the spread of misinformation and fake news. How do you understand this worrying phenomenon? What is happening in your view?
I am always a bit wary of people who make sweeping statements about this. It is very hard to compare the current situation to that of earlier times. I like to show my students a cartoon from 1802 that warns that the new smallpox vaccine will turn people into cows. Vaccine resistance, fake news and misinformation are not new. I am an optimistic person. Yes, it might be easier to spread misinformation and fake news in online media, but it is also easier to combat them.
You have done a lot of work on science communication. How do you see the relation between science communication and the spread of misinformation?
There is a fine line between flat-out, intentionally spread misinformation and exaggerated news that is also misleading. A classic example is a study that finds a correlation between eating nuts and living longer, which news articles then report as claiming that eating nuts will make you live longer. This kind of news can have a lot of impact on people who are worrying about their health.
I am very interested in how these processes work in science communication: why do scientific results get blown up in the media? In a study we did on Dutch health news with, among others, Peter Burger from Journalism Studies and Mattijs Numans from Public Health, we found that 20% of academic press releases exaggerated the causal claims in the underlying peer-reviewed article, and that 29% of newspaper articles did so. This was a replication of a British study that found similar results. The British team (InSciOut) later showed in a randomized controlled trial that exaggerated academic press releases cause exaggerated news in the media. This is both good and bad news: it is very sad that incorrect news originates at universities, but it is also great that we can actually do something to improve the news from within academia, simply by sending out accurate press releases ourselves.
Do you think that outreach can be done badly, given the incentives to grab attention and make things seem as interesting as possible?
Outreach can definitely be done badly, and it always surprises me that academics who are very thorough and thoughtful in their own field base their outreach on gut feelings. The studies I mentioned just now also found that exaggerated press releases do not lead to more media attention; our intuition about this is wrong.
There are many ways to improve outreach: providing better training for academics, connecting the outcomes of relevant communication research with communicators, using checklists to see if press releases and blogs about research are factually accurate, and evaluating outreach activities to see if they are reaching their goals. I could go on for hours about this!
I understand that you are part of a new project using fact checking to combat misleading graphs. Could you say a little about what the project seeks to discover?
We just started this project, thanks to a LUF Lustrum Subsidy. I have always been interested in misleading graphs: there are so many ways to distort data. When I was talking to Peter Burger about fact checking, we wondered how to do this for graphs. With Sanne Willems, a statistician at the Institute of Psychology, I did an earlier project on communicating probabilities, and we had been thinking about ways to present data. The field is so interesting: experts from journalism, statistics, health communication and psychology have totally different ideas about how you should do this. In our project we are setting up large randomized experiments to test these different theories. Should you correct misleading charts with visual effects or with rhetoric? We hope to find out and translate our results into practical advice for people who are battling misinformation. The project is led by our postdoc Winnifred Wijnker, who has a background in film studies and brings in a lot of knowledge about visual communication.
You have worked in various interdisciplinary teams. Do you think that scientists are finding each other easily across disciplines to work on this topic, or do you see many obstacles to interdisciplinary approaches?
When I had just finished my PhD in mathematics, I wanted to switch to doing research on science communication. I started by reading the literature, but it was incredibly hard for me to make the switch on my own. I tried to get help from scholars in the humanities and social sciences, but it was also really hard to find people who were up for collaborating with me and I kept feeling we were not understanding each other. I left academia and worked for a few years as a self-employed science journalist. When I came back to the university as a professor of science communication it was easier to find collaborators, since I knew so many people in different fields from my journalistic work. I also found many of my collaborators via Twitter, where we already ‘knew’ each other's work and personality before we connected. I really enjoy being in such interdisciplinary teams where everyone brings in different strengths and you can learn so much from each other.
But it is harder to work in a team like that: there is often more paperwork, and these projects are even harder to get funded than monodisciplinary projects. And publishing results can be complicated: for a study on the communication of statistics, communication journals thought it was too much statistics, and statistics journals thought it was too much communication (even though both thought our study was well-executed and very relevant). Universities and funding agencies can really help in changing this. If everyone agrees that there should be more interdisciplinary research to improve science, then they should put their money where their mouth is.