Nuclear Weapons • Cold War

Command and Control

Nuclear Weapons, the Damascus Accident, and the Illusion of Safety

Eric Schlosser · Penguin Press, 2013 · 632 pages

The Book

Command and Control runs on two parallel tracks. The first is a minute-by-minute reconstruction of the September 18–19, 1980, explosion at a Titan II missile silo near Damascus, Arkansas — a disaster triggered when a maintenance technician dropped a heavy socket that punctured a fuel tank holding Aerozine 50, a storable hypergolic propellant. Over the next eight hours, the silo filled with toxic vapor, a series of miscommunications and procedural failures compounded the crisis, and at 3:00 AM the fuel ignited, blowing the 740-ton silo door 200 yards and launching the missile's 9-megaton W53 thermonuclear warhead 100 feet into the air. It landed in a ditch. It did not detonate. One airman — Senior Airman David Livingston — died. Twenty-one others were injured.

The second track is a comprehensive history of American nuclear weapons safety, command and control systems, and the dozens of accidents the military designates "Broken Arrows" — incidents involving the loss, theft, seizure, or accidental detonation risk of nuclear weapons. Schlosser weaves these threads together, alternating chapters between the unfolding Damascus crisis and the broader institutional history, each illuminating the other. The Damascus story provides human urgency; the historical narrative provides systemic context.

Schlosser obtained thousands of pages of previously classified documents through Freedom of Information Act requests, supplemented by hundreds of interviews with missile crew members, weapons designers, Strategic Air Command (SAC) officers, Pentagon officials, and nuclear safety engineers. The resulting book reveals, in forensic detail, how close the United States came to accidental nuclear catastrophe — not once, but repeatedly, across decades, in incidents the public was never told about.

The Author

Eric Schlosser is an investigative journalist best known for Fast Food Nation (2001), which exposed the health, labor, and environmental consequences of the American fast food industry. That book spent two years on the New York Times bestseller list and established Schlosser's method: exhaustive documentary research, deep sourcing within closed institutions, and narrative structure that uses individual human stories to illuminate systemic failures.

Command and Control took six years of research. Schlosser interviewed surviving members of the Damascus missile crew — Propellant Transfer System (PTS) team members who entered the silo as it filled with toxic Aerozine 50 vapor, officers who argued over evacuation decisions, and the airmen who were standing at the blast door when the explosion occurred. He interviewed weapons designers at Los Alamos and Sandia National Laboratories, former SAC commanders, safety engineers who had fought internally for stronger warhead safeguards, and Pentagon officials who had managed nuclear weapons policy across multiple administrations.

The book won the Los Angeles Times Book Prize for History in 2013 and was a finalist for the Pulitzer Prize in History. It was adapted into a 2016 PBS American Experience documentary of the same name, directed by Robert Kenner.

Key Insights

The Damascus Accident

On the evening of September 18, 1980, Airman David Powell was performing routine maintenance inside Titan II silo 374-7 near Damascus, Arkansas. He was removing a pressure cap on the missile's second stage using a heavy socket on a ratchet wrench — not the torque wrench that the procedure required. The socket slipped off the wrench, fell approximately 70 feet, bounced off a thrust mount, and punctured the first-stage fuel tank. The tank held Aerozine 50 — a 50/50 blend of UDMH and hydrazine, the same family of storable hypergolic propellants documented in John D. Clark's Ignition!. On contact with its oxidizer, nitrogen tetroxide (NTO), Aerozine 50 ignites spontaneously. That was, after all, the entire point of hypergolic propellants: instant, reliable ignition without spark systems. The design feature that made the Titan II a credible deterrent — its ability to launch within 60 seconds — was also what made the silo a bomb. Over the next eight hours, fuel leaked from the punctured tank, filling the silo with toxic, flammable vapor. At 3:00 AM on September 19, the fuel-air mixture ignited. The explosion blew the 740-ton reinforced concrete silo door 200 yards and hurled the 9-megaton W53 thermonuclear warhead — the most powerful in the American arsenal — approximately 100 feet into the air. It landed in a roadside ditch 200 feet from the silo. It did not detonate. Senior Airman David Livingston, who was standing near the silo entrance, was killed by the blast. Twenty-one others were injured, several critically.

The Illusion of Safety

The official position of the United States military has always been that nuclear weapons are safe and under positive control at all times. Schlosser systematically dismantles this claim. He documents at least 32 publicly acknowledged "Broken Arrow" incidents and reveals through declassified records that the actual number is far higher. In 1958, a Mark 6 nuclear bomb was accidentally dropped from a B-47 over Mars Bluff, South Carolina; the conventional explosives detonated, destroying a house and injuring six people, though the nuclear core was not inserted. In 1961, a B-52 broke apart over Goldsboro, North Carolina, releasing two Mark 39 hydrogen bombs. One went through nearly its entire arming sequence as it fell — the arming wires were pulled, the timer ran down, the barometric pressure switches fired, and the firing signal was sent on impact. A single low-voltage arm/safe switch that had remained in the "safe" position — among the least reliable components in the safety chain — prevented a 3.8-megaton detonation over the Carolina countryside. In 1966, a B-52 collided with a refueling tanker over Palomares, Spain, scattering four hydrogen bombs across the Spanish coast; two had their conventional explosives detonate, spreading plutonium contamination over 558 acres of farmland. In 1968, a B-52 crashed on sea ice near Thule, Greenland, scattering weapons debris and plutonium across the ice sheet. The pattern Schlosser establishes is unmistakable: the safety record was maintained not by design but by luck.

Always/Never

The core technical dilemma of nuclear weapons safety is captured in two words: always/never. A nuclear weapon must always detonate when authorized command is given — this is reliability. It must never detonate under any other circumstance — this is safety. These two requirements are in fundamental tension. Every engineering decision that makes a weapon harder to detonate accidentally also introduces a potential failure mode that could prevent it from detonating when commanded. A coded lock can malfunction. A safety switch can jam. An environmental sensor can misread conditions. The weapons designers at Sandia and Los Alamos spent decades navigating this tradeoff, and the story of that navigation — the arguments, the compromises, the institutional resistance — is the engineering heart of the book. Always/never is not a problem that can be definitively solved. It can only be managed, and the history Schlosser documents is a record of how often the management failed.
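
The arithmetic behind this tension can be made concrete. The toy model below uses invented, illustrative probabilities (nothing here is sourced from the book): stacking independent safety devices drives accidental-detonation risk down exponentially, but each device adds its own small chance of blocking an authorized firing — and even a tiny per-weapon risk compounds across an arsenal of tens of thousands.

```python
# Toy model of the always/never tradeoff. All probabilities are invented
# for illustration; none come from Schlosser's book.

def weapon_tradeoff(n_safeties, p_bypass=1e-3, p_jam=1e-4):
    """Never: an accident must defeat every independent safety device.
    Always: any single jammed device blocks an authorized detonation."""
    p_accidental = p_bypass ** n_safeties
    p_fail_to_fire = 1 - (1 - p_jam) ** n_safeties
    return p_accidental, p_fail_to_fire

def arsenal_risk(p_per_weapon_year, weapons=70_000, years=40):
    """Probability of at least one accident across the whole fleet."""
    return 1 - (1 - p_per_weapon_year) ** (weapons * years)

for n in (1, 3, 6):
    acc, dud = weapon_tradeoff(n)
    print(f"{n} safeties: P(accidental)={acc:.0e}  P(fail-to-fire)={dud:.0e}")

# Schlosser's point about scale: a one-in-a-million annual risk per weapon,
# multiplied across ~70,000 weapons and four decades, approaches certainty.
print(f"fleet-wide risk: {arsenal_risk(1e-6):.0%}")
```

The fleet-wide number is the striking one: with these assumed figures, a risk that looks negligible per weapon becomes a near-certainty across the arsenal's history.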

Permissive Action Links

Permissive Action Links (PALs) are coded electronic locks built into nuclear weapons to prevent unauthorized detonation. The concept emerged in the late 1950s from weapons designers at Sandia National Laboratories who recognized that physical custody alone was an insufficient safeguard — a stolen weapon, a rogue officer, or a compromised unit could potentially arm and fire a warhead. PALs require the entry of a multi-digit code, transmitted through the chain of command from the President, before the weapon's electrical firing circuit can be completed. Secretary of Defense Robert McNamara ordered their installation across the arsenal beginning in 1962, over fierce resistance from the Air Force and Strategic Air Command. SAC's objection was operational: PALs introduced a component that could prevent a weapon from firing in combat, violating the "always" half of always/never. The resistance was so deep that for years, the PAL codes on Minuteman ICBMs were set to 00000000 — eight zeros — effectively bypassing the locks entirely while maintaining technical compliance with McNamara's order. The codes remained at 00000000 until 1977.
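
The essential logic of a PAL — a coded interlock gating the firing circuit, with a limited number of attempts before permanent lockout — can be sketched in a few lines. Everything here (the code value, the try limit, the class interface) is invented for illustration; real PAL internals are classified.

```python
# Minimal sketch of a PAL-style coded interlock with limited-try lockout.
# All details (code, try limit, interface) are hypothetical.

class PermissiveActionLink:
    def __init__(self, code: str, max_tries: int = 3):
        self._code = code
        self._tries_left = max_tries
        self._locked_out = False
        self.armed = False

    def enter_code(self, attempt: str) -> bool:
        if self._locked_out:
            return False                 # lockout is permanent once triggered
        if attempt == self._code:
            self.armed = True            # firing circuit may now be completed
            return True
        self._tries_left -= 1
        if self._tries_left == 0:
            self._locked_out = True      # limited-try lockout engages
        return False

pal = PermissiveActionLink("4815162342")
assert not pal.enter_code("00000000")    # a Cold War-style default fails here
assert pal.enter_code("4815162342") and pal.armed
```

The lockout is the modern answer to the eight-zeros era: a weapon that cannot be armed by exhaustive guessing, at the cost of one more way the "always" requirement can fail.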

The Human Factor

Again and again in Schlosser's narrative, the proximate cause of safety failures was not engineering deficiency but human error, maintenance shortcuts, procedural violations, and institutional cultures that prioritized combat readiness over safety. SAC under General Curtis LeMay and his successors maintained a culture of perpetual alert that treated safety concerns as obstacles to mission readiness. Maintenance crews were undertrained, overworked, and under pressure to keep weapons systems operational at all times. At Damascus, the socket wrench that punctured the fuel tank was not the approved tool for the job — the correct tool had a torque-limiting mechanism that would have prevented the socket from detaching. But the correct tool was heavier and harder to use on an elevated platform, so crews routinely substituted the lighter, unapproved wrench. The violation was known. It was tolerated. The organization's real priority was keeping the missile on alert, and everything else — including safety procedures designed to prevent exactly what happened — was subordinated to that imperative.

The Sandia Engineers

Some of the book's most compelling figures are the weapons safety engineers at Sandia National Laboratories who fought, often against their own institutional hierarchy, to improve warhead safety. Bob Peurifoy, a Sandia engineer who spent decades pushing for safer weapon designs, emerges as a central character. Peurifoy advocated for insensitive high explosives (IHE) that would not detonate in a fire or crash, strong link/weak link safety systems that used environmental sensing devices to distinguish between normal and abnormal conditions, and enhanced nuclear detonation safety (ENDS) features. He was repeatedly told that the existing weapons were safe enough, that modifications would be too expensive, and that raising safety concerns publicly would undermine confidence in the deterrent. Peurifoy persisted. Many of the safety improvements in the modern arsenal exist because he and a small number of colleagues refused to accept "safe enough" when the consequence of failure was a nuclear detonation on American soil.

Selected Quotes

"Every nuclear weapon in the history of the United States has been involved in an accident."

— Paraphrase of a Sandia engineer's assessment, reflecting the statistical reality of Broken Arrow incidents across the arsenal

"The odds of a nuclear weapon being involved in an accident were small. But the United States had manufactured about seventy thousand of them — and even a slight probability, multiplied by thousands, could add up."

— Eric Schlosser, on the mathematics of nuclear risk

"By the simplest, most straightforward reading of the evidence, the Mk 39 Mod 2 bomb that landed on a farm near Faro, North Carolina — one of the two hydrogen bombs released from a B-52 that broke apart in midair — very nearly detonated. Only one safety mechanism prevented a full-scale nuclear explosion."

— Eric Schlosser, on the 1961 Goldsboro incident

"If you demand a weapon that will always work, you can't also promise that it will never go off by accident. The 'always' and the 'never' are at war with each other."

— A Sandia weapons designer, on the fundamental tension of nuclear safety engineering

"The Air Force had assured the public, for years, that hydrogen bombs did not, and could not, accidentally detonate. And yet leaked documents suggested that the Air Force itself wasn't so sure."

— Eric Schlosser, on the gap between public assurances and internal assessments

"We were just kids. We didn't know what we were doing. We were eighteen, nineteen years old, and they put us in charge of the most powerful weapon ever made."

— A former Titan II missile crew member, recalling his service

Where We Are Now

Schlosser's book was published in 2013, when the American nuclear arsenal was entering a period of relative neglect — aging infrastructure, declining budgets, and a widespread assumption that nuclear weapons were a legacy concern. That assumption has been overturned. The United States is now in the early stages of the largest nuclear weapons modernization program since the Cold War, and the strategic environment is more complex and multipolar than at any point since 1945. Every problem Schlosser documented — the always/never tradeoff, the human factor, the gap between institutional assurances and operational reality — is being replayed at a new scale.

The Nuclear Triad Modernization

The United States maintains its nuclear deterrent through a triad of delivery systems: land-based intercontinental ballistic missiles (ICBMs), submarine-launched ballistic missiles (SLBMs), and strategic bombers. All three legs are being simultaneously replaced for the first time since the original systems were deployed in the 1960s.

| Leg | Current System | Planned Replacement | Status |
|-----|----------------|---------------------|--------|
| ICBMs | LGM-30G Minuteman III (deployed 1970) | LGM-35A Sentinel (Northrop Grumman) | Delayed and over budget; Nunn-McCurdy breach declared in 2024; estimated cost exceeds $140 billion |
| SLBMs / submarines | Ohio-class SSBN (14 boats, Trident II D5 missiles) | Columbia-class SSBN (12 boats, Trident II D5LE missiles) | Lead boat USS District of Columbia under construction; first patrol targeted for the early 2030s |
| Bombers | B-2 Spirit, B-52H Stratofortress | B-21 Raider (Northrop Grumman) | First flight November 2023; built at Plant 42, Palmdale, CA; entering low-rate production |

The total estimated cost of nuclear modernization across all three legs, plus warhead life extension programs and supporting infrastructure, is $1.5 to $2 trillion over 30 years. The Sentinel ICBM program alone has become one of the most troubled defense acquisitions in recent history, triggering a Nunn-McCurdy cost breach — the statutory threshold requiring Congressional notification and program recertification — in 2024, with costs projected to more than double the original estimates.

Arms Control in Collapse

The New START treaty, the last remaining bilateral nuclear arms control agreement between the United States and Russia, was signed in 2010 and limited each side to 1,550 deployed strategic warheads and 700 deployed delivery vehicles. Russia suspended its participation in February 2023, and the treaty expired in February 2026 with no successor agreement in negotiation. For the first time since 1972, there is no treaty-based framework constraining the nuclear arsenals of the two largest nuclear powers, and no mutual verification regime providing transparency into force posture and deployments.

The collapse of arms control is not limited to New START. The Intermediate-Range Nuclear Forces (INF) Treaty was abandoned in 2019. The Open Skies Treaty ended in 2020. The Comprehensive Nuclear-Test-Ban Treaty (CTBT) remains unratified by the United States. The architecture of arms control that constrained nuclear competition for five decades has largely dissolved.

New Nuclear States and Threshold Powers

The nuclear landscape Schlosser described was essentially bipolar: the United States and the Soviet Union, with smaller arsenals held by the United Kingdom, France, and China, and emerging programs in Israel, India, and Pakistan. That landscape is now significantly more complex. North Korea has conducted six nuclear tests and is estimated to possess 40–50 warheads with a growing missile delivery capability, including ICBMs theoretically capable of reaching the continental United States. Iran maintains a nuclear program that has produced enriched uranium at 60% purity — a short technical step from weapons-grade 90% — and is widely assessed to be a nuclear threshold state, capable of producing a weapon within weeks to months of a political decision. China is undergoing a dramatic nuclear buildup, with the Department of Defense estimating an arsenal of over 500 warheads by 2030 and more than 1,000 by 2035, including the construction of approximately 300 new ICBM silos in western China.
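
The claim that 60% enrichment is "a short technical step" from weapons-grade follows directly from the standard separative-work formula of cascade theory. The sketch below uses the textbook value function and assumes a 0.3% tails assay; the numbers illustrate the physics, not any specific program.

```python
# Separative work units (SWU) via the standard cascade-theory value function,
# showing how much of the enrichment effort is already done at 60% purity.
# Tails assay of 0.3% is an assumption for illustration.
from math import log

def V(x):
    """Value function for separative work (standard cascade theory)."""
    return (2 * x - 1) * log(x / (1 - x))

def swu(product_kg, xp, xf, xw=0.003):
    """SWU needed to enrich from feed assay xf to product assay xp."""
    feed = product_kg * (xp - xw) / (xf - xw)
    waste = feed - product_kg
    return product_kg * V(xp) + waste * V(xw) - feed * V(xf)

natural, step, heu = 0.00711, 0.60, 0.90
total = swu(1.0, heu, natural)            # natural uranium -> 90% HEU, per kg
feed_60 = (heu - 0.003) / (step - 0.003)  # kg of 60% feed per kg of 90% HEU
to_60 = swu(feed_60, step, natural)       # work spent reaching 60%

print(f"natural -> 90%: {total:.0f} SWU per kg of HEU")
print(f"fraction of that work already done at 60%: {to_60 / total:.0%}")
```

Under these assumptions, roughly 98% of the separative work needed for weapons-grade material is already complete at 60% enrichment, which is why analysts treat 60% as a threshold rather than a midpoint.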

Hypersonic Weapons

A new class of delivery systems is complicating the already fragile architecture of nuclear stability. Russia's Avangard hypersonic glide vehicle, mounted atop an ICBM, maneuvers at speeds exceeding Mach 20 on unpredictable trajectories, designed specifically to defeat missile defense systems. China's DF-ZF hypersonic glide vehicle serves the same purpose. The United States is developing its own hypersonic weapons across multiple programs. These weapons compress decision timelines: a hypersonic weapon launched from a submarine off the coast could reach inland targets in minutes, far less time than the already-thin decision window for a traditional ICBM. The always/never problem does not change, but the time available to exercise command and control shrinks dramatically.
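
The compression is easy to quantify with back-of-envelope arithmetic. The distances and average speeds below are rough illustrative figures (Mach referenced to the sea-level speed of sound), not published performance data for any specific system.

```python
# Back-of-envelope decision-window arithmetic with illustrative
# distances and average speeds; not data for any real system.

MACH = 0.343  # km/s, sea-level speed of sound

scenarios = {
    "ICBM, intercontinental (~10,000 km, ~7 km/s avg)": 10_000 / 7,
    "SLBM, offshore launch (~3,000 km, ~5 km/s avg)": 3_000 / 5,
    "Hypersonic glide, offshore (~2,000 km at Mach 20)": 2_000 / (20 * MACH),
}

for name, seconds in scenarios.items():
    print(f"{name}: ~{seconds / 60:.0f} min")
```

Even with generous assumptions, the window shrinks from tens of minutes to a handful, and the hypersonic case adds trajectory unpredictability on top of raw speed.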

AI and Nuclear Command

The most consequential new variable in nuclear command and control is artificial intelligence. The debate centers on the role of autonomous and semi-autonomous systems in nuclear decision-making — specifically, whether AI should be integrated into early warning, threat assessment, or launch recommendation systems. Proponents argue that AI can process sensor data faster than human operators, reducing the risk of missed or misinterpreted warnings. Critics point to the same always/never problem Schlosser documents, amplified by machine speed: an AI system that misinterprets a satellite anomaly, a sensor glitch, or an ambiguous radar return could generate a recommendation for nuclear response in seconds, faster than any human can intervene.

The historical record provides precedent for concern. On September 26, 1983, Soviet lieutenant colonel Stanislav Petrov received an alert from the Oko satellite early warning system indicating five incoming American ICBMs. Petrov judged the alert to be a false alarm — partly because the system showed only five missiles, an illogically small first strike — and did not pass the alert up the chain of command. He was correct: sunlight reflecting off high-altitude clouds had triggered the sensors. A machine operating at machine speed, without Petrov's human judgment and contextual reasoning, might have escalated. The question of whether to trust AI with nuclear decisions is the always/never dilemma translated into software: a system must always detect a real attack and never generate a false alarm. These requirements remain in direct opposition, and no algorithm resolves the tension.
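
The software version of always/never is the classic detection-threshold tradeoff. The toy early-warning classifier below, with invented Gaussian parameters for "noise" and "attack" signals, shows why the two error rates cannot be driven to zero together: any threshold low enough to catch every real attack also admits more false alarms.

```python
# Always/never as a detection-threshold tradeoff. The Gaussian
# parameters for noise and attack signals are invented for illustration.
from math import erf, sqrt

def norm_sf(x, mu, sigma):
    """P(X > x) for a normal distribution (survival function)."""
    return 0.5 * (1 - erf((x - mu) / (sigma * sqrt(2))))

NOISE_MU, ATTACK_MU, SIGMA = 0.0, 3.0, 1.0

for threshold in (1.0, 2.0, 3.0):
    p_detect = norm_sf(threshold, ATTACK_MU, SIGMA)  # "always": catch attacks
    p_false = norm_sf(threshold, NOISE_MU, SIGMA)    # "never": reject noise
    print(f"threshold={threshold}: "
          f"P(detect)={p_detect:.3f}, P(false alarm)={p_false:.4f}")
```

Moving the threshold trades one failure mode for the other; no setting eliminates both. Petrov's judgment was, in effect, a human override of exactly this curve.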

Cyber Threats to Nuclear C2

Schlosser, writing in 2013, could not have fully anticipated the scale of cybersecurity threats to nuclear command and control (C2) systems. The nuclear enterprise depends on communications networks, satellite systems, sensor arrays, and digital infrastructure that are all potential targets for cyberattack. A successful intrusion into early warning systems could generate false data mimicking an incoming attack. A disruption of secure communications could sever the chain of command at the moment it is most needed. The very age of some systems — the Minuteman III still relies on infrastructure designed in the 1960s — creates an ambiguous risk profile: legacy systems are less vulnerable to conventional network attacks because they predate modern networking, but they also lack modern security architecture and are maintained by a shrinking pool of personnel who understand their design.

The Doomsday Clock

The Bulletin of the Atomic Scientists' Doomsday Clock, a symbolic measure of existential risk to civilization, was set to 90 seconds to midnight in January 2023 — at the time, the closest it had ever been. It remained at 90 seconds through 2024 and was moved to 89 seconds in January 2025, reflecting the combined pressures of the war in Ukraine and associated nuclear threats from Russia, the collapse of arms control treaties, proliferation risks, and the increasing integration of AI into military systems. For context, the Clock stood at 17 minutes to midnight in 1991, at the end of the Cold War. The current setting reflects a consensus among nuclear security experts that the aggregate risk environment is worse than at any point during the Cold War, including the Cuban Missile Crisis (when the Clock stood at 7 minutes).
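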

Modern Safety: Has Always/Never Been Solved?

The good news, to the extent there is any, is that the specific engineering failures Schlosser documents have been substantially addressed. Modern warheads incorporate insensitive high explosives (IHE) that will not detonate in a fire, crash, or impact — addressing the Palomares and Thule scenarios directly. Strong link/weak link safety systems use environmental sensing devices (ESDs) that distinguish between the unique conditions of authorized delivery (specific acceleration profiles, barometric signatures, thermal environments) and abnormal conditions (crashes, fires, drops). Strong links are designed to survive any abnormal environment without activating; weak links are designed to permanently disable the weapon's firing circuit before abnormal conditions can activate the strong links. Enhanced PALs with multi-digit codes, limited-try lockout mechanisms, and anti-tamper features have replaced the 00000000 codes of the Cold War era.
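
The strong link/weak link discipline can be expressed as a simple predicate over the sensed environment. The sketch below uses an invented environment record and made-up thresholds — actual ESD criteria are classified — but it captures the design intent: the firing circuit is enabled only when the strong link sees the unique signature of authorized delivery and the weak link has not already been destroyed by an abnormal environment.

```python
# Sketch of strong link / weak link logic. The environment record and
# all thresholds are invented; real ESD criteria are classified.
from dataclasses import dataclass

@dataclass
class Environment:
    peak_accel_g: float      # sensed acceleration
    temp_c: float            # sensed temperature
    delivery_profile: bool   # sensors matched the unique release trajectory?

def strong_link_closes(env: Environment) -> bool:
    """Strong link closes only on the unique signals of authorized delivery."""
    return env.delivery_profile and env.peak_accel_g < 50

def weak_link_survives(env: Environment) -> bool:
    """Weak link fails first in fire or crash, disabling the firing set."""
    return env.temp_c < 200 and env.peak_accel_g < 100

def firing_circuit_enabled(env: Environment) -> bool:
    return strong_link_closes(env) and weak_link_survives(env)

crash = Environment(peak_accel_g=400, temp_c=800, delivery_profile=False)
drop = Environment(peak_accel_g=60, temp_c=25, delivery_profile=False)
assert not firing_circuit_enabled(crash)  # weak link gone before strong link
assert not firing_circuit_enabled(drop)   # no authorized-delivery signature
```

The asymmetry is the point: the weak link is engineered to be the component that fails first, so that abnormal environments disable the weapon before they can ever mimic the arming signature.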

But the human factor — the failure mode Schlosser returns to most insistently — remains. In 2007, six nuclear-armed AGM-129 cruise missiles were mistakenly loaded onto a B-52 at Minot Air Force Base and flown to Barksdale Air Force Base in Louisiana. For 36 hours, six live nuclear warheads were unaccounted for in the inventory system. No one noticed. The incident, together with a subsequent mistaken shipment of nuclear missile fuze components to Taiwan, led to the 2008 firing of the Secretary of the Air Force and the Air Force Chief of Staff — the first time both had been removed simultaneously. In early 2014, months after Command and Control was published, an investigation at Malmstrom Air Force Base found that Minuteman III launch officers had been cheating on proficiency exams, with answers shared across the force. The pattern Schlosser identified — institutional pressure to maintain readiness, at the expense of the procedures designed to ensure safety — had not changed.

The Titan II Connection

The propellants that exploded at Damascus — Aerozine 50 (a UDMH/hydrazine blend) as fuel and nitrogen tetroxide as oxidizer — are the same storable hypergolics that John D. Clark documented in Ignition!, another book in this library. Clark's account is the chemistry; Schlosser's is the consequence. The Titan II used these propellants precisely because they were hypergolic: self-igniting on contact, eliminating the need for ignition systems and enabling a launch within 60 seconds of the order. The design that made the Titan II a credible nuclear deterrent — storable, instantly ready, no cryogenic fueling delays — also meant that any breach in the fuel or oxidizer system created the conditions for a catastrophic, uncontrollable fire. Clark wrote about the chemistry with dark humor. Schlosser writes about the same chemicals with the gravity of a man documenting what happens when the chemistry escapes the laboratory and enters a weapons system maintained by teenagers working 24-hour shifts in underground silos.

Verdict

Command and Control is a 632-page argument, built on thousands of declassified documents and hundreds of interviews, that nuclear weapons safety in the United States has been maintained more by luck than by design. Schlosser does not argue against nuclear deterrence itself. He argues that the gap between what the public has been told about nuclear safety and what actually happened — the dropped bombs, the failed safety mechanisms, the fires, the crashes, the lost weapons, the PAL codes set to all zeros — represents a profound institutional failure of transparency and accountability. The engineering story is fascinating: the always/never tradeoff, the strong link/weak link architecture, the decades-long internal battles between safety advocates and operational commanders. But the human story is what gives the book its force. These were not abstract systems. They were maintained by real people, under real institutional pressures, making real mistakes, with consequences that could have been measured in megatons.

In the current era — with the largest nuclear modernization program since the Cold War underway, arms control frameworks collapsing, new nuclear states emerging, hypersonic weapons compressing decision timelines, and AI being integrated into military command systems — Schlosser's core warning is more relevant than it was in 2013. Complex systems fail in complex ways. The consequences of nuclear failure are absolute and irreversible. And the institutions responsible for managing that risk have, historically, been more interested in projecting confidence than in confronting the reality of how close they have come to catastrophe. The book does not offer comfortable reassurance. It offers something more valuable: the documented record of what actually happened, and the engineering and institutional analysis of why.