Countering Nuclear Misinformation

What Works and Why

Zion Lights

Memorable but misleading slogans can shape opinion more effectively than nuanced facts. (Graphic: A. Barber Huescar/IAEA)

Zion Lights is an award-winning science communicator based in the United Kingdom and known for promoting clean energy, especially nuclear power. In this interview with the IAEA, she addresses misinformation and disinformation about nuclear energy.

Misinformation and disinformation, in one form or another, have been around for a long time. What makes the situation different today?

History is full of examples of disinformation, which is shared with the intent to deceive, from Roman emperors shaping public perceptions through inscriptions on coins to Nazi propaganda using radio and cinema. And we’ve all seen the harmful effects of misinformation, which the United Nations defines as the unintentional spread of inaccurate information shared in good faith by people unaware that they are passing on falsehoods.

The biggest difference today is social media. It spreads false information globally and instantly. As a major news platform, social media has reshaped how we access and trust information. To counter it, we must understand why we are so susceptible.

Why are people so susceptible to misinformation?

At least 188 cognitive biases that influence people’s perception have been identified. These biases, shaped by past experiences and emotions and acting as mental shortcuts, make it easier for us to process information. However, they often reinforce our existing beliefs, leading us to accept falsehoods as truth.

Can you give examples of these cognitive biases?

Examples of cognitive bias include:

Confirmation bias: Seeking information that supports our beliefs.

Anchoring bias: Relying too heavily on initial information.

Availability bias: Believing what’s easiest to remember.

Familiarity bias: Accepting something as true because we hear it often.

What else contributes to misinformation?

Repetition strengthens misinformation. The more a falsehood is repeated, the more credible it appears. In psychology, this is known as ‘fluency for truth’, and it makes a lie easier to remember than complex scientific information. Framing plays a crucial role as well. Opponents of nuclear power have fuelled fears about nuclear waste for decades. Memorable but misleading slogans can shape opinion more effectively than nuanced facts.

How have you addressed nuclear misinformation in your own work?

I use concise, catchy phrases such as “it’s only waste if you waste it” and “meanwhile, fossil fuel waste is stored in the Earth’s atmosphere.” These slogans are both accurate and ‘sticky’ and have become widely used.

Initially, some scientists resisted my slogans, preferring scientific papers over simple messages. However, when grounded in truth, slogans are effective. Phrases like “nuclear saves lives” and “nuclear energy is clean energy” contribute to shifting perspectives.

Why don’t facts alone change minds?

Science communication is a distinct field, but many scientists aren’t trained in it. As a result, they often rely on an outdated approach known as the ‘information deficit model’, assuming more facts will change minds. But beliefs are shaped by cognitive, social and emotional factors. Simply providing more data is often ineffective.

What is pre-bunking and how does it help counter misinformation?

Pre-bunking is one approach to countering misinformation. Think of it as a ‘cognitive vaccine’ against propaganda. It was first proposed by psychologist William J. McGuire in the 1960s. McGuire hypothesized that people could learn to spot propaganda if they were warned about it beforehand. With a few caveats, this approach largely works.

Pre-bunking involves presenting factually correct information along with a pre-emptive correction or a generic warning about misinformation before the person encounters the misinformation. This requires thinking about what objections might be raised to the factually correct information in order to dilute the power of counter-messaging.

How do more advanced techniques such as inoculation theory work?

More advanced pre-bunking techniques draw on ‘inoculation theory’, which exposes people to weakened forms of persuasion so that they can see how misleading persuasive techniques work and build immunity against stronger arguments through critical thinking. This approach has been shown to increase the accurate detection of misinformation. Understanding how misleading persuasive techniques are used gives people the cognitive tools to ward off future misinformation attacks, and research has found that inoculation on one topic can help people spot misinformation in other areas as well.

What role does education play in building resistance to misinformation?

It can and should play an important role. To counter the propensity to believe fake news, information and media literacy should be integrated into education. Information literacy helps people assess information effectively. Media literacy helps them navigate platforms and sources.

Have you ever personally changed your mind due to better information?

We have all been influenced by misinformation at some point, and we remain vulnerable. It took me many years to change my perspective on nuclear energy, shifting from opposition to support through exposure to better sources and different perspectives. Countering misinformation requires patience and persistence. It’s about engaging with people correctly to foster deeper understanding. No one is an empty cup.

May 2025
Vol. 66-2
