Lessons From Fukushima

The IAEA’s “Expert Mission to Japan” recently released its preliminary summary with important lessons for avoiding another Fukushima. It also has much to teach us about avoiding an even worse disaster involving nuclear weapons.

Among its key findings:

1. The tsunami hazard for several [Japanese nuclear reactor] sites was underestimated.

2. Nuclear regulatory systems should … ensure that regulatory independence and clarity of roles are preserved.

3. Severe long term combinations of external events should be adequately covered in design, operations, resourcing and emergency arrangements.

4. Emergency arrangements, especially for the early phases, should be designed to be robust in responding to severe accidents.

Here’s how those conclusions relate to a disaster involving nuclear weapons:

1. Society grossly underestimates the risk associated with nuclear weapons. Its complacency is inconsistent with Henry Kissinger’s estimate that “if nothing fundamental changes, then I would expect the use of nuclear weapons [by terrorists] in some 10 year period is very possible.” A number of other experts in defense and diplomatic matters concur. My work applying risk analysis to a failure of nuclear deterrence indicates that a child born today has at least a 10% chance of not living out his or her natural life due to our reliance on nuclear weapons, and the odds may be worse than 50-50.

2. The management of our nuclear arsenal lacks independent oversight. While, in theory, civilians are in control of the military, the reality is very different. Bruce Blair, a former Minuteman Launch Control Officer and now head of the World Security Institute, relates how former Secretary of Defense Robert McNamara was “shocked, absolutely shocked and outraged” when he learned that his order to put coded locks on the Minuteman ICBMs had been nullified by setting the combinations to all zeros. Unlike McNamara, the brass were more worried about being unable to use the missiles should they be needed than about an unauthorized or accidental launch. Blair has written another column that also demonstrates the need for greater independent oversight.

3. Combinations of events that could cause a nuclear catastrophe have not been adequately considered. On January 25, 1995, a meteorological research rocket launched from Norway was mistaken by Russian air defense for the prelude to a nuclear attack, causing the Russian nuclear launch codes to be opened. Fortunately, Yeltsin was sober enough to make the right decision and was not overruled by his General Staff – who had the ability, but not the authority, to launch without his assent. At least as important, this false alarm did not occur during a crisis, such as the 2008 Georgian War, when it would have been more likely to be mistaken for the real thing.

4. Instead of being designed to resolve crises in their early phases, nuclear strategy often dictates a bellicose response to demonstrate resolve and increase the credibility of an otherwise incredible threat. This is clearly enunciated in a now declassified 1995 US STRATCOM report entitled “Essentials of Post-Cold War Deterrence,” which states in part:

Because of the value that comes from the ambiguity of what the US may do to an adversary if the acts we seek to deter are carried out, it hurts to portray ourselves as too fully rational and cool-headed. The fact that some elements may appear to be potentially “out of control” can be beneficial to creating and reinforcing fears and doubts within the minds of an adversary’s decision makers. This essential sense of fear is the working force of deterrence. That the US may become irrational and vindictive if its vital interests are attacked should be part of the national persona we project to all adversaries.
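The lifetime risk figures in point 1 follow from compounding a small annual probability of deterrence failure over a lifetime. A minimal sketch of that arithmetic, assuming a constant hypothetical annual failure rate and an 80-year lifespan (both are illustrative assumptions, not figures asserted in the text):

```python
# Illustrative only: cumulative probability of at least one deterrence
# failure over a lifetime, assuming a constant annual failure probability p.
def lifetime_risk(annual_p: float, years: int = 80) -> float:
    """Probability of at least one failure in `years` years: 1 - (1 - p)^years."""
    return 1 - (1 - annual_p) ** years

# An annual failure rate of roughly 0.15% compounds to about 11% over
# 80 years; roughly 0.9% per year compounds to about 50%.
print(round(lifetime_risk(0.0015), 2))  # ~0.11
print(round(lifetime_risk(0.009), 2))   # ~0.51
```

The point of the sketch is that even rates that sound reassuringly small per year become substantial when compounded over a human lifetime, which is why complacency about low annual odds can be misleading.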

The nuclear disaster at Fukushima has reminded society that it must be more prudent in planning for nuclear power. I pray that it also reminds us how imprudent it is to clothe the earth in a nuclear vest, controlled by fallible human beings.

Martin Hellman

For more information, visit our related website or read my recent paper, “How Risky Is Nuclear Optimism?”

About Martin Hellman

I am a professor at Stanford University, best known for my invention of public key cryptography – the technology that protects the secure part of the Internet, such as electronic banking. But, for almost 30 years, my primary interest has been how fallible human beings can survive possessing nuclear weapons, where even one mistake could be catastrophic. My latest project is a book with the audacious subtitle "Creating True Love at Home & Peace on the Planet." Its soon-to-open website explains: https://anewmap.com.
