Tell it Like it is: 7 Lessons from TMI

by Peter M. Sandman (continued)

4. Expect the Media to be Over-reassuring

In ordinary times, journalists tend to make the news as dramatic as they can; their sensationalist bias is built-in. But not in a crisis—that’s when journalists ally with their sources in a misguided effort to keep people calm by over-reassuring them.

The Kemeny Commission (the US government commission set up to investigate TMI) conducted a content analysis of network, wire service, and major newspaper coverage during the first week of the 1979 TMI accident. The Commission’s expectations of sensationalism were not confirmed. Of media passages that were clearly either alarming or reassuring in thrust, 60% were reassuring. If you stick to the technical issues, eliminating passages about inadequate flow of information and general expressions of fearfulness from local citizens, the preponderance of reassuring over alarming “technical” statements becomes 73% to 27%. It didn’t seem that way at the time, of course—for at least three reasons.

  1. Frightened people pick up more on negative information than on positive information; Vincent Covello, Director of the Center for Risk Communication in New York, argues that in a crisis it takes three pieces of good news to balance one piece of bad news.
  2. The information that something previously assumed to be safe may or may not be hazardous naturally strikes people as alarming, almost regardless of the amount of attention paid to the two sides. (Imagine reading this evening that scientists disagree over whether your favourite food is carcinogenic.) Thus, sociologist Allan Mazur has found that public fearfulness about risky new technologies is proportional to the amount of coverage, not to its character. TMI was a big, big story; even if the content was reassuring, the amount of content was itself alarming.
  3. Most importantly, over-reassuring content is alarming. The public, especially the local public, could tell that the authorities were deeply worried and thoroughly bewildered; in that context, seeing them on TV insisting that the plant was cooling according to design and everything was under control had to make things worse.

Reporters at TMI weren’t averse to accusing their sources of withholding information. But they were reluctant to report—reluctant even to notice—how often their sources didn’t know what was going on themselves and how frightened their sources were about what might happen next.

5. Keep it Simple

The need for simple explanations of complex phenomena isn’t just an axiom of crisis communication; it is fundamental to any sort of communication. But two things change in a crisis. First, audiences are less tolerant of complexity when they’re upset. Apathetic people just stop listening when they can’t understand what’s being said; interested people ask for clarification. But frightened or angry people decide you’re trying to con them, and therefore become more frightened and more angry.

The second reason why keeping it simple is a problem in crisis situations is this: Sources tend to speak more complexly when they’re upset. Some of this is unconscious; your anxiety makes you hide behind big words and fancy sentences. Some of it is intentional. Nuclear Regulatory Commission officials at Three Mile Island were worried (mistakenly, as it turned out) that a hydrogen bubble in the containment might explode and cause a meltdown. When they shared this possibility with journalists, they did it in such polysyllabic prose that reporters thought they were denying it, not acknowledging it.

The level of technical jargon was actually higher at TMI when the experts were talking to the public and the news media than when they were talking to each other. The transcripts of urgent telephone conversations between nuclear engineers were usually simpler to understand than the transcripts of news conferences. They said things to each other like: “It looks like we’ve got a load of core damage”— then made the same point to the media in phrases so technical that not one reporter got the message.

To be sure, jargon is a genuine tool of professional communication, conveying meaning (to those with the requisite training) precisely and concisely. But it also serves as a sort of membership badge, a sign of the status difference between the professional and everyone else. And especially in a crisis, it is a way to avoid looking scared and avoid communicating scary information.

6. Pay Attention to Outrage

Reporters are a pretty thick-skinned group when it comes to danger—the sorts of people who automatically drive toward the scene of any disaster. But they were frightened at TMI. It’s one of the few times I have ever witnessed a roomful of reporters rush a press secretary and demand to be moved further from the story.

Local citizens, obviously, were even likelier to have found the accident terrifying (though it is worth noting that, as usual, there was no panic). The biggest source of outrage at TMI was undoubtedly mistrust—a growing sense that MetEd executives for sure, and maybe NRC officials as well, weren’t saying everything they knew. (The sense that they didn’t know everything they should came later. Officials could have reduced post-crisis recriminations by acknowledging their uncertainty and all the things they wished they knew but didn’t.) As it usually does in crisis situations, the mistrust fed the fear. But there were plenty of other outrage components in play at TMI.

Among them:

Knowability. Expert disagreement is an aspect of knowability that generates even more outrage and fear than garden-variety uncertainty—and expert disagreement is rampant over the health effects of low levels of radiation. Some experts claim even very small exposures can lead to cancer; others argue that small exposures actually provide health benefits (the so-called hormesis hypothesis).

Another aspect of radiation’s knowability problem is its undetectability. Many reporters at TMI wore radiation monitors, a privilege few ordinary citizens had. Even so they were nervous. One reporter told me he’d be a lot more comfortable if radiation were purple instead of invisible. Another, a veteran war correspondent, noted: “In a war you worry that you might get hit. The hellish thing here is worrying that you already got hit.”

Control. One of the most important—and difficult—ways to help people cope with a crisis is to offer them things to do. Reporters were busy at TMI, which kept their fears at bay. Local residents, on the other hand, had little to do but follow the media and stew. That feeling of complete powerlessness generates a lot of extra fear. One possibility that was considered and rejected was to distribute potassium iodide (KI). KI floods the thyroid with stable iodine; if much radioactive iodine had been emitted at TMI (as it turned out, there wasn’t), the KI could have prevented some thyroid cancers. But the real issue was one of communication: Would distributing KI scare people by implying there might be serious radiation releases, or would it reassure people by giving them something to do to protect themselves?

The former argument won the day, and the KI stayed in the warehouse.

Dread. Cancer is an especially dreaded way to die. And among carcinogens, radiation is an especially dreaded source. Experts have calculated that the particulates and other pollutants normally released into the air around Three Mile Island 25 years ago were deadlier than the radiation actually released during the TMI accident. By shutting down some factories temporarily, therefore, the accident may even have improved local health! Despite these data, I still get two or three phone calls and emails a year from people who live near TMI, or are thinking of moving to the area, asking my advice on whether it’s safe. And many are still convinced it isn’t.

Memorability. Nuclear disaster has been a feature of science fiction since the early 1950s. Almost everyone who lived through the TMI accident had already seen countless nuclear reactors run amok—in movies, in novels, in comic books. So it was easy to believe a meltdown was around the corner. It didn’t help that The China Syndrome, a movie about a nuclear power plant disaster, had just opened. Harold Denton, the senior manager the NRC had sent to the site, took an evening off to go see the movie in Harrisburg; a few hundred reporters (including me) went with him.

7. Get the Word Out

Most government agencies and corporations respond to crisis situations by constricting the flow of information. Terrified that the wrong people may say the wrong things, they identify one or two spokespeople and decree that nobody else is to do any communicating. In an effort to implement this centralized communication strategy, they do little or nothing to keep the rest of the organization informed.

There is certainly a downside to authorizing lots of spokespeople; the mantra of most risk communication experts is to “speak with one voice.” But I think the disadvantages of the one-voice approach outweigh the advantages. This approach almost always fails, as it failed at TMI. Reporters took down the license plate numbers of MetEd employees, got their addresses, and called them at home after shift. Inevitably, many talked—though what they knew was patchy and often mistaken. The designated information people for the NRC and the utility, meanwhile, had trouble getting their own information updates; those in the know were too busy coping with the accident to brief them. (The lesson here: There need to be technical experts at the scene whose designated job is to shuttle between the people who are managing the crisis and the people who are explaining it.) The state government felt its own information was so incomplete that Press Secretary Paul Critchlow asked one of his staff to play de facto reporter—trying to find out what was going on so Critchlow could tell the media and the Governor.

While the utility and the federal government tried to speak with one voice, the local anti-nuclear movement stopped speaking altogether. During the accident, hundreds of reporters called the Harrisburg office of TMI Alert, the area’s major anti-nuke group. They got a recorded message explaining that the staff had left town for their own safety.

In today’s world of 24/7 news coverage and the Internet, the information genie is out of the bottle. If official sources withhold information, we get it from unofficial sources; if official sources speak with one voice, we smell a rat and seek out other voices all the harder... and find them. But crisis information wasn’t controllable 25 years ago in central Pennsylvania either. As my wife and colleague Jody Lanard likes to point out, even in the pre-Gutenberg era, everyone in medieval villages knew when troubles were brewing. The information genie never was in the bottle. Keeping people informed and letting them talk is a wiser strategy than keeping them ignorant and hoping they won’t.

Peter M. Sandman is a pre-eminent risk communication consultant and speaker based in Princeton, NJ, USA. He is Professor of Human Ecology at Rutgers University and Professor of Environmental and Community Medicine at the Robert Wood Johnson Medical School. For more on Dr. Sandman’s approach to risk communication, see
www.psandman.com.

This article is based on one published in Safety at Work (April 2004), www.safetyatwork.biz