Tell It Like It Is: 7 Lessons from TMI

by Peter M. Sandman

The Three Mile Island (TMI) nuclear plant accident in Middletown, USA, made global news in March and April 1979. The event turned out to be a “school” for many. One “student” was a then-young professor who covered the story behind the headlines and learned about the news and information business along the way.



The most lasting effect Three Mile Island had on me was what it taught me about crisis communication — lessons that stood me in good stead over the 25-plus years that followed and especially after the September 11 terrorist attack on the United States.
What are some of these lessons?

1. Pay Attention to Communication

Just about all the experts agree that Three Mile Island was not a serious accident. That doesn’t mean it wasn’t a serious screw-up. Things went wrong that should never go wrong. When the accident conditions were later run through the Babcock and Wilcox (B&W) simulation of the TMI plant, the result was a total core meltdown and a genuine catastrophe; fortunately, reality was less conservative than the B&W simulation. So it’s a little like a drunk successfully crossing a highway blindfolded. In human health terms, nothing much happened at TMI. Awful things almost happened.

TMI was by no means the only near-miss in the history of nuclear power. (The frequency of near-misses and the infrequency of real disasters — Chernobyl being the only one we know about for sure — signify either that nuclear power is an intolerably dangerous technology and we’re living on borrowed time, or that “defense in depth” works and a miss is as good as a mile.) But TMI was the only near-miss that captivated public attention for weeks, that is widely misremembered as a public health catastrophe, that is still a potent symbol of nuclear risks, and that as a result has had devastating repercussions for the industry itself.

What went wrong at TMI — really, really wrong? The communication.

Communication professionals were minor players at TMI. I asked Jack Herbein, the Metropolitan Edison (MetEd) engineering vice president who managed the accident, why he so consistently ignored the advice of his PR specialist, Blaine Fabian. (Risk communication hadn’t been invented yet.) He told me, “PR isn’t a real field. It’s not like engineering. Anyone can do it.” That attitude, I think, cost MetEd and the nuclear power industry dearly. And that attitude continues to dominate the nuclear industry, contributing to one communication gaffe after another. Nuclear power proponents keep shooting themselves in the foot for lack of risk communication expertise. (This observation is obviously a little self-serving, since I sell risk communication training, but I think it’s also on target.) Although risk communication skills are learnable, they’re not bred in the bone — certainly not bred in the bone for the average nuclear engineer.

2. Err on the Alarming Side

In the early hours and days of the TMI accident, nobody knew for sure what was happening. That encouraged Metropolitan Edison to put the best face on things, to make the most reassuring statements it could make given what was known at the time. So as the news got worse, MetEd had to keep going back to the public and the authorities to say, in effect, “it’s worse than we thought.”

This violated a cardinal rule of crisis communication: Always err on the alarming side. Make your first communication sufficiently cautious that later communications are likely to take the form, “it’s not as bad as we feared,” rather than “it’s worse than we thought.” In the 25 years since, I have seen countless corporations and government agencies make the same mistake. Its cost: the source loses all credibility. And since the source is obviously underreacting, everybody else tends to get on the other side of the seesaw and overreact.

That’s why Pennsylvania Governor Dick Thornburgh ordered an evacuation of pregnant women and preschool children. MetEd was saying the amount of radiation escaping the site didn’t justify any evacuation—and MetEd, it turns out, was right. But MetEd had been understating the seriousness of the accident from the outset. When the head of the Pennsylvania Emergency Management Agency (PEMA) misinterpreted a radiation reading from a helicopter flying through the plume, thinking it was probably an offsite reading of exposures reaching populated areas, Thornburgh didn’t even check with the no-longer-credible utility (which could have told him PEMA had misunderstood the situation). He decided better safe than sorry and ordered the evacuation.

In contrast to Metropolitan Edison, the Pennsylvania Department of Health adopted an appropriately cautious approach. The Health Department was worried that radioactive iodine-131 might escape from the nuclear plant, be deposited on the grass, get eaten by dairy cattle, and end up in local milk. Over a two-week period, health officials issued several warnings urging people not to drink the milk. Meanwhile, they kept doing assays of the milk without finding any I-131. Their announcements moved slowly from “there will probably be I-131 in the milk” to “there may be I-131 in the milk” to “there doesn’t seem to be I-131 in the milk, but let us do one more round of testing just to be sure.”

By the time the Health Department declared the milk safe to drink, virtually everyone believed it. While the Health Department’s caution hurt the dairy industry briefly, the rebound was quick because health officials were credibly seen as looking out for people’s health more than for the dairy industry’s short-term profits. This is a model for BSE and the beef industry, for SARS and the travel industry, for avian flu and the poultry industry.

3. Don't Lie, and Don't Tell Half-Truths

Companies and government agencies try hard not to lie outright, but they usually feel entitled to say things that are technically accurate but misleading—especially in a crisis when they are trying to keep people calm. Ethics aside, the strategy usually backfires. People learn the other half of the truth, or just sense that they aren’t being leveled with, and that in itself exacerbates their anxiety. Panic is rare in crisis situations; people often feel panicky but usually manage to act rationally and even altruistically. But panic is paradoxically likelier when the authorities are being less than candid in their effort to avert panic.

The nuclear power plant in central Pennsylvania was in deep trouble. The emergency core cooling system had been mistakenly turned off; a hydrogen bubble in the reactor vessel was considered capable of exploding, which might breach the vessel and cause a meltdown. In the midst of the crisis, when any number of things were going wrong, MetEd put out a news release claiming that the plant was “cooling according to design.”
Months later I asked the PR director how he could justify such a statement. Nuclear plants are designed to survive a serious accident, he explained. They are designed to protect the public even though many things are going wrong. So even though many things were going wrong at TMI, the plant was, nonetheless, “cooling according to design.” Needless to say, his argument that he hadn’t actually lied did not keep his misleading statement from irreparably damaging the company’s credibility.
