
Thinking More Slowly

Fake news is everywhere and achieving a new level of quality thanks to artificial intelligence. Safeguarding ourselves against it isn’t easy, as many people prefer to believe plausible misinformation rather than complicated facts.
Text by Stefan Stöcklin; Translation by Astrid Freuler
We encounter misinformation at every turn. To debunk it, it often helps to look a little closer at the source of the information. (Picture: istock/RichVintage)

Earlier this year, some surprising news regarding electric stones caused a stir. Miners in the Congo had discovered a mineral which, similar to a battery, could hold a charge. The associated video showed a young man holding a wire to the shiny metallic stone, causing a small lamp to light up. The report triggered a considerable response on social media – as did a message relating to the 5G communication standard. The latter stated that 5G’s fast data transmission relies on a higher frequency range of around 60 gigahertz, which is said to be associated with dangerous health risks.

If we didn’t have the news agencies’ fact checkers, both instances of misinformation would probably have spread further still. AP debunked the news from the Congo: minerals cannot store electricity; at best, they can conduct it. In the case of 5G, a brief search shows that the frequencies used are much lower than the claimed 60 GHz, lying in the same range as those used for common 4G networks or WLAN. Most of us, if not all, have come across false reports and fake news. A representative survey on Covid-19 misinformation carried out in spring 2021 by Sabrina Kessler from UZH’s Department of Communication and Media Research (IKMZ) and her colleagues highlights how widespread false information is.

A quarter of respondents said that they are confronted with misinformation on a regular or even daily basis. And according to the latest Science Barometer survey conducted in fall 2022, people in Switzerland “occasionally” get the impression that deliberate attempts are made to mislead them with scientific misinformation.


Anyone who thinks they’re immune to fake news is wrong. In principle, we’re all susceptible to it, at least on topics we know little about.

Sabrina H. Kessler
Communications researcher

Missing gatekeepers

While misinformation is quite rare in traditional media, in social media networks it is rife. “Everyone can generate and spread news on social media without any substantiation,” says Sabrina Kessler. Kessler, a senior teaching and research assistant at IKMZ who is researching misinformation and fake news across several projects, points to the missing gatekeepers on social media.

Unlike in mainstream media, where journalists check sources and pictures, social media channels largely operate without the input of a monitoring body. The circulation of information is boosted by invisible algorithms geared toward generating maximum interaction between users. Whatever is clicked on, liked or shared rises within the hierarchy and becomes more visible in the feed. In this competition for attention, emotive and simplified information fares best, which makes it easier for misinformation to spread. Research shows that adding emotive words increases the reach of a post.

With rapidly advancing digitalization, especially in the area of artificial intelligence, new forms of deception are emerging. Systems are getting ever more sophisticated at recognizing voices, languages and faces, paving the way for novel forms of disinformation. So-called deepfakes of prominent politicians are already in circulation; these put false statements into a politician’s mouth, seemingly expressed in their own voice. The ChatGPT chatbot is currently demonstrating the potential of such AI-based language models. “The generated content gives an impression of objectivity and reliability that doesn’t actually exist,” says Sabrina Kessler. This makes it all the more important to label content generated by chatbots as such.

There is a consensus among the established democracies of the West that disinformation and conspiracy theories are damaging. They undermine confidence in institutions and in science, can lead to the marginalization of social groups and can even legitimize violence. The coronavirus pandemic provided, and still provides, a concrete illustration of the negative impact of fake news. Countless incorrect reports about the development and production of vaccines were shared – from the dangers of the technology to contamination of the vaccine and health risks associated with vaccination. “For people who are already skeptical, misinformation can increase their concerns and cause them to turn down the vaccination,” says Sabrina Kessler.

This stance endangers not only the skeptics but also society as a whole. At the same time, misinformation led some people to turn to nonsensical alternative remedies such as methanol. In Iran, more than 1,000 people drank the toxic alcohol in March 2020 because of a claim that it would protect them from the virus. Almost 300 of them died.


Fatal false reports

Alongside Covid-19, climate change is another topic that highlights the detrimental effects of fake news. For years, the denial of scientific evidence delayed the implementation of measures. Investigations show that the oil industry has known about human-made climate change since the 1970s yet disputed it despite this knowledge, often by disseminating disinformation. These examples underline the fatal repercussions that falsified reports can have for society. “That is why counteracting the dissemination of misinformation and conspiracy theories is an important societal challenge,” says Sabrina Kessler.

A sobering note on this: “Anyone who thinks they’re immune to fake news is wrong. In principle, we’re all susceptible to it, at least on topics we know little about,” Kessler adds. After all, who can instantly evaluate a report on quantum physics or cryptography, or the mechanisms behind energy prices? In our busy day-to-day lives, we often don’t allow ourselves sufficient time to analyze and understand complex news. Instead, we operate in fast-thinking mode: we make do with intuitive explanations and are more likely to trust statements we have heard before. This also applies to misinformation, which is why Kessler is reluctant to cite examples – repeating false facts helps them become lodged in people’s minds.

Avoiding inner conflict

The differentiation between fast and slow thinking goes back to a theory developed by the psychologist Daniel Kahneman. Fast-thinking mode is intuitive, emotional and subconscious; slow-thinking mode is logical, deliberate and less immediate. A predominance of fast thinking makes us susceptible to disinformation and half-knowledge – and it focuses our attention on news that seems familiar to us. Slow thinking enables us to analyze and evaluate information. In real life, the two modes of thinking are linked and operate with and alongside each other.

However, the influence of fast and slow thinking on attitudes and opinions isn’t that easy to substantiate. “Many people who research issues on the internet do so independently of their own opinion. This is also corroborated by my research,” Sabrina Kessler acknowledges. Yet not everyone adjusts their opinion when the researched information contradicts their views.

People with strongly held convictions, in particular, are much less inclined to revise their opinions and prefer to stick to alternative facts and fake news that confirm their worldview. Ignorant or stubborn people won’t reconsider their viewpoint even in the face of facts and evidence, and sometimes the complete opposite happens. We’ve probably all experienced discussions in which disproving an argument has led the other person to harden their views. “We call this the boomerang effect,” says Kessler, adding that this also shows up in surveys. “Reading a text that refutes their beliefs can reinforce a participant’s incorrect convictions.”

Kessler explains this phenomenon as cognitive dissonance. Contradictory opinions generate an inner conflict, an uncomfortable feeling. To escape it, people with strong views shore up their opinion and look for arguments that support their incorrect convictions. Instead of being persuaded by actual facts, they ignore them.

Immunization against fake news

Because these psychological mechanisms hamper the fight against misinformation, unmasking fake news remains a herculean task. “We need implementable regulations on the internet, fact-checking organizations, promotion of scientific communication and educational provisions,” says Kessler, citing just a few of the measures named in a poll of experts.

As misinformation embeds itself in people’s minds more readily than it can later be disproved, the most effective way of preventing the spread of fake news is for people to become immune to it, so to speak. Experts do in fact recommend a kind of mental vaccination against misinformation: preparing people for possible fake news and, ideally, providing counterarguments in advance. “Ultimately, such a ‘vaccination’ enables people to unmask misinformation as it occurs,” adds Kessler. In the case of the electrically charged stones, this would have been easy, as the report had almost all the common hallmarks of disinformation.

This text is taken from the dossier of the UZH Magazine 1/2023.
