Les Earnest is well known to digital cognoscenti for his contributions to artificial intelligence, robotics and the Internet, but few know of his warning that inadvertent erections could start a nuclear war:
In the 1950s I helped design the SAGE [Semi-Automatic Ground Environment] air defense system … when we reviewed the BOMARC [Missile] launch control system, one of our engineers noticed a rather serious defect: if the launch command system was tested, … the “test” switch was then returned to “operate” without individually resetting the control systems in each missile that had been tested, they would all immediately erect and launch! Needless to say, that “feature” was modified rather soon after we mentioned it to Boeing. …
The SAGE system used land lines to transmit launch commands to the missile site and these lines were duplexed for reliability. … However on examination we discovered that if both lines were bad at the same time, the system would … amplify whatever noise was there and interpret it as a stream of random bits. … Fortunately, we were able to show that getting a complete set of acceptable guidance commands within this time was extremely improbable, so this failure mode did not present a nuclear safety threat. The official name of the first BOMARC model was IM-99A, so I wrote a report about this problem titled Inadvertent erection of the IM-99A.
Les concludes that:
SAGE was a gigantic fraud on taxpayers in that it was a “peacetime defense system” that would malfunction under an actual attack, much like France’s Maginot Line did in World War 2. … SAGE thus gave rise to a corrupt military-industrial-political establishment that has produced a string of largely useless command-control and weapons projects such as President Reagan’s phony “Star Wars” defense program and the current ongoing deployment of anti-missile systems that don’t work. But that is another story.
Whether or not one agrees with Les's assessment of missile defense, in the Alice-in-Wonderland world of nuclear strategy some supposed safeguards have the potential to turn around and bite us. A short clip from Stanley Kubrick's classic film Dr. Strangelove makes that point all too well. In this scene, the president has just been informed that a rogue general is in the process of starting World War III:
Muffley [the American president, played by Peter Sellers]: General Turgidson, I find this very difficult to understand. I was under the impression that I was the only one in authority to order the use of nuclear weapons.
Turgidson [an Air Force general, played by George C. Scott]: That’s right sir. You are the only person authorized to do so. And although I hate to judge before all the facts are in, it’s beginning to look like General Ripper exceeded his authority.
Muffley: It certainly does. Far beyond the point I would have imagined possible.
Turgidson: Well perhaps you’re forgetting the provisions of plan R, sir.
Muffley: Plan R?
Turgidson: Plan R is an emergency war plan in which a lower echelon commander may order nuclear retaliation after a sneak attack if the normal chain of command is disrupted. You approved it, sir. You must remember. Surely you must recall, sir, when Senator Buford made that big hassle about our deterrent lacking credibility. The idea was for plan R to be a sort of retaliatory safeguard.
Muffley: A safeguard.
Turgidson: I admit the human element seems to have failed us here. But the idea was to discourage the Russkies from any hope that they could knock out Washington, and yourself, sir, as part of a general sneak attack, and escape retaliation because of lack of proper command and control.
The tradeoff between maintaining a credible deterrent and risking an unauthorized launch deserves far more public attention. To reduce the risk of an accidental nuclear war, we need to weigh more carefully the risks, as well as the rewards, of strategies intended to enhance our national security.
Les Earnest’s full comments on SAGE
An earlier post on this blog about another dangerous “safeguard”