Cyber Vulnerabilities and Nuclear Weapons Risks

Lauren J. Borja and M. V. Ramana
School of Public Policy and Global Affairs, University of British Columbia

In October 2018, the United States Government Accountability Office (GAO) reported that “mission-critical cyber vulnerabilities” had been found in many weapon systems being developed by the U.S. Department of Defense (DOD). These vulnerabilities allowed testers to “take control of systems and largely operate undetected” [1, p. 21], and could allow hackers to do the same.

The GAO report identified three underlying reasons for this problem. First, computers have proliferated in the designs of almost all weapon systems and enable many of their functions and communications. Second, the DOD has only recently prioritized cybersecurity in its weapon systems; in many cases, cybersecurity was not even a consideration when earlier weapon systems were designed. Finally, after neglecting cybersecurity for many years, the DOD has only a shallow understanding of how to construct secure weapon systems. As a result, the GAO report said, the DOD has fielded a generation of insecure weapon systems, which could jeopardize military networks for years to come.

The report did not name specific weapon systems or describe faults in detail; however, its authors confirmed that their study included the systems that can cause the most destruction: nuclear weapons [2]. Concerns about the cybersecurity of nuclear weapons have also been raised by outside researchers [3]–[6], and the warning comes at a time when many states are planning large modernizations of their nuclear arsenals.

This article will discuss some of these risks. It will start with a description of the basic principles of cybersecurity, their application to the U.S. nuclear arsenal, and how these principles are being challenged by the modernization of nuclear forces. It then describes a key policy choice that increases the risk that a cyberattack on some elements of the nuclear arsenal could result in accidental or inadvertent nuclear weapons use, and ends with some recommendations for improving safety.

General Cybersecurity Principles

Academics studying computer science and the computer industry have identified many principles to create secure computer systems [7]. Ideally, these principles should be incorporated during the design of any digital network that supports critical infrastructure, such as those used to control the electrical grid, send financial information, or, more importantly, to command and control nuclear arsenals. Implementing these principles after the design stage, when the system is being tested or deployed, could lead to costly and incomplete fixes for faults and bugs.

The six basic principles of secure computing are availability, reliability, safety, integrity, confidentiality, and maintainability.

  • Availability describes a system that readily offers the intended function. A system that lacks availability will experience intermittent outages in service, such as a cell phone service that is only available if the user is close to particular cell towers.
  • Reliability is the ability of a system to offer the correct function. Systems that are not reliable will not perform the intended service some of the time, such as a cell phone that sometimes connects to a different number than the user dialed.
  • Safety refers to a system whose operation, intended or otherwise, does not cause harm. A system that fails in a manner that does not cause harm is called fail-safe.
  • Integrity describes a system that prohibits nonauthorized users from altering its functions or data. Electronic banking services are used because people trust that the numbers they can view online reflect the balances in their accounts. The integrity of this system would break down if these numbers could be changed by hackers or rogue bank employees.
  • Confidentiality is a property of a system that does not disclose information to unauthorized parties. If online banking were to allow people other than the owners of the bank account and authorized bank employees to access the user account numbers and balances, it would no longer be confidential.
  • Maintainability refers to a system that can be repaired or updated. Systems that can only be updated during certain periods or under certain circumstances have poor maintainability.
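As a concrete illustration of the integrity principle, the short Python sketch below uses a keyed hash (HMAC), one standard building block for detecting unauthorized alteration of data. The key and messages are invented for this example; real banking systems use far more elaborate protocols.

```python
import hashlib
import hmac

KEY = b"shared-secret-key"  # hypothetical key, for illustration only

def tag(message: bytes) -> bytes:
    # Compute a keyed hash of the message; only holders of KEY can do this.
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    # compare_digest does a constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(tag(message), received_tag)

order = b"balance=1000"
t = tag(order)
assert verify(order, t)                # unmodified message is accepted
assert not verify(b"balance=9999", t)  # tampered message is rejected
```

A hacker or rogue employee who changes the balance without knowing the key cannot produce a matching tag, so the alteration is detected.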

While it is clear that all of these principles are desirable, achieving them simultaneously within a single system is challenging. The Mars Polar Lander, which crashed into the surface of Mars in December 1999, is an example of a system that was reliable but not safe [8]. The crash occurred because the landing software mistook turbulence from the Martian atmosphere for confirmation that the Polar Lander had reached the planet’s surface. The software, performing according to its specifications, turned off the engines slowing the Polar Lander’s descent [9]. Prof. Nancy Leveson of the Massachusetts Institute of Technology, who specializes in building reliable and safe software systems, points out that, in some cases, safety and reliability can be in conflict: “making the system safer may decrease reliability and enhancing reliability may decrease safety” [8, p. 7].
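The failure mode above can be caricatured in a few lines of Python. This is purely illustrative: the function names, the altitude check, and the 5-meter threshold are invented for this sketch and are not NASA’s actual flight logic.

```python
def shutdown_engines_unsafe(touchdown_signal: bool) -> bool:
    # Faithful to a spec that says "cut the engines when a touchdown
    # signal arrives" -- reliable, yet unsafe: a spurious transient
    # during descent also cuts the engines.
    return touchdown_signal

def shutdown_engines_safer(touchdown_signal: bool, altitude_m: float) -> bool:
    # One hypothetical safeguard: ignore touchdown signals until the
    # vehicle is low enough for them to be physically plausible.
    return touchdown_signal and altitude_m < 5.0

# A transient jolt while still 40 m above the surface:
assert shutdown_engines_unsafe(True) is True        # engines cut -> crash
assert shutdown_engines_safer(True, 40.0) is False  # transient ignored
assert shutdown_engines_safer(True, 1.0) is True    # genuine touchdown
```

The safer version is arguably less reliable in a narrow sense, since a failed altimeter could delay engine shutdown, which is exactly the kind of trade-off Leveson describes.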

Cybersecurity Principles and U.S. ICBMs

The United States operates a triad of delivery vehicles for its nuclear weapons. One component of this triad is the “land-based ballistic missile force (ICBMs)” consisting “of 400 land-based ICBMs, each deployed with one warhead” [10, p. i]. These ICBMs are stored inside hardened missile silos, which are controlled by a network of launch control centers. Each center directly controls, on average, about ten missiles, and a secondary launch control center monitors the commands sent by the primary center [11]. Each ICBM contains a missile guidance computer, which is responsible for directing the nuclear-armed missile to its intended target [12]. The missile guidance computer can store multiple target locations according to what is called for by the U.S. nuclear war plan [13]. To launch a nuclear attack, the launch control centers specify both a target location and an execution order by either selecting one of the pre-stored options or manually entering different information [14].

It is not straightforward to apply the principles described above to assess the cybersecurity of the U.S. ICBM force because publicly available information on these systems is greatly limited. One indirect approach is to examine general security measures, and there it is clear that the United States does apply the principles of availability, reliability, safety, integrity, maintainability, and confidentiality to its ICBM force [15]. That said, there are important differences between physical and cyber security measures, and the incorporation of security measures in the physical realm should not be taken as confirmation that these principles hold in the cyber realm.

As the case of the Mars Polar Lander shows, these principles can conflict with one another. A particular problem for nuclear weapons is the need for confidentiality, which can and does affect their safety and reliability. Because of the sheer complexity of the computer systems involved, the hardware and software must be designed by teams of engineers, often including people who are not permitted to handle confidential or secret data. Specifications for how the system should operate must therefore be communicated in an unclassified form. Such specifications are usually incomplete, and the resulting gaps in the information available to software and hardware professionals can lead to problems.

An illustration of the effect this problem can have was the replacement in 2007 of the internal guidance computers inside U.S. land-based nuclear missiles [16]. During testing, the new system gave inaccurate guidance [17]. These inaccuracies were eventually traced to rounding and truncation errors in the software of the guidance computer [18]. The most likely reason for this underlying error is that design specifications were not properly outlined, as was the case with the Mars Polar Lander [19].
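The general hazard of truncation can be shown with a short Python sketch. The 24-bit fixed-point format, the 10 Hz tick rate, and the 100-hour interval below are arbitrary choices for this illustration, not details of the actual guidance software.

```python
# Illustrative only: how truncation error accumulates over repeated updates.
SCALE = 2 ** 24  # a 24-bit fixed-point fraction

def to_fixed_truncated(x: float) -> int:
    # Truncation: the fractional remainder is simply discarded.
    return int(x * SCALE)

# 0.1 has no exact binary representation, so the stored tick length
# is slightly smaller than one tenth of a second.
tick = to_fixed_truncated(0.1) / SCALE
error_per_tick = 0.1 - tick

# Over 100 hours of ticks at 10 Hz (3,600,000 updates), the tiny
# per-tick error grows into a macroscopic clock drift:
drift = error_per_tick * 3_600_000
print(f"{drift:.3f} seconds")  # about 0.13 seconds of accumulated drift
```

An error of a tenth of a second is irrelevant for a wristwatch, but in a guidance computer integrating position from accelerations thousands of times per second, comparable small errors can translate into large miss distances.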

Modernization Brings New Challenges for Cybersecurity

When it comes to the U.S. nuclear arsenal, one particular cybersecurity concern pertains to the system for nuclear command and control, a term given to the infrastructure, procedures, and policies used to direct and control nuclear forces. This system involves a vast network of computers that are interconnected and in constant communication with each other. As a result, there are many points of attack for potential hackers. Other aspects of the nuclear arsenal are also computerized and thus susceptible to cyberattacks. If left unaddressed, some of the weaknesses identified could lead to the accidental launch of nuclear weapons and even inadvertent nuclear war [3]–[6].

Worsening the cybersecurity problem, the United States [20] is modernizing its nuclear arsenal by developing new weapon systems and investing in improvements to its command and control networks. If the modernization of nuclear arsenals follows the precedent established by other military weapon systems, the use of computers and digital electronics in these systems will greatly expand, accentuating cyber risks to nuclear command and control.

Not only does modernization involve the introduction of computers into hitherto uncomputerized parts of the arsenal, it also brings new challenges to older parts of the system that have already incorporated computers and digital systems. Many of the components inside the nuclear command and control system are decades old; one such system that has often been discussed in public is the use of floppy discs to direct nuclear forces [21]. New components will therefore be drastically different from those currently in use. That might seem like an advance, but the change will bring safety and security challenges. For instance, floppy discs are an outdated and inconvenient technology that compromises maintainability, but, because of their relatively small data storage capacity, they are less susceptible to computer malware, which enhances integrity.

A further complication is the ongoing globalization of the supply chain for commercially-available computer components. Today’s computer components are often designed, fabricated, and assembled in many different nations. Even if a country has set up in-house fabrication facilities, it is likely that these will use machinery produced in other countries. The usage of equipment from multiple countries increases the possibility that malicious design features or hardware components might be covertly embedded into devices. Once they have been built in, these covert features are almost impossible to detect [22].

As nuclear systems are modernized, the tendency is to use commercially-available products, because developing new components from scratch will require more testing and resources. An example of the use of commercially-available, also called off-the-shelf, products includes the set of instruments that communicate flight data during missile flight tests. A U.S. Air Force budget justification document said that the replacement effort for this component “will maximize the use of off-the-shelf components to meet mission requirements” [23, p. 7].

The cybersecurity of the nuclear arsenal is also challenged by the lifecycle of modern computer components, which require more frequent updates and replacement. For computer systems used in everyday life, these updates are an inconvenience, but most people realize that undertaking these routine updates lowers their chances of being infected by malware or viruses. Weapon systems are different, and incorporating such updates in a timely fashion might be difficult because of various constraints. For example, U.S. nuclear missiles are expected to be in continuous use and, as a result, missile maintenance requires prior approval by higher authority [24]. Nuclear command and control systems are likely to have similarly constrained maintenance schedules. As a result, known vulnerabilities will likely persist much longer in weapon systems than in other systems that can be updated more frequently, and will continue to compromise security for as long as they remain unpatched.

The WannaCry ransomware attack, which crippled the British National Health Services, illustrated the dangers of infrequent software updates. The attack used a vulnerability that affected systems that had not yet installed the most recent Microsoft patch [25]. Replacing parts in the U.S. land-based missile system often stretches over many years, from design to fielding. Because of this, there is a chance that components might have become obsolete by the time they are fielded.

Concerns Due to Policy Choices

U.S. nuclear weapons policy exacerbates existing cybersecurity vulnerabilities as well as those created by modernization. If the U.S. military were to receive information from its many deployed sensors indicating an incoming nuclear attack, stated policy allows the President to launch nuclear weapons against the country believed to be attacking. This launch could take place within a few minutes, before the arrival of the attacking missiles. The hope is that countries will be less likely to launch a preemptive strike because retaliation is, in theory, always guaranteed, which would decrease the value of a hypothetical first strike.

Because its nuclear command and control system must comply with the short decision-making times necessary for such rapid launch, the system is configured for quick use. This way of deploying missiles increases the risk of accidental or inadvertent use. For example, storing nuclear warheads inside missiles that are fueled with combustible materials leads to the potential for accidental explosions [26]. In 1980, an accidental leak of liquid propellant from a U.S. ICBM led to an explosion that ejected parts of the missile, including the nuclear warhead, from its reinforced silo. Thankfully, the nuclear warhead did not detonate [27].

There are also other precursors to potential nuclear war, sometimes due to computers from an earlier era. On June 3, 1980 at 2:26 am, a computer screen in a command post of the U.S. Strategic Air Command began indicating incoming Soviet-launched ballistic missiles. Within a few seconds, more missiles appeared [28]. Bomber pilots were notified to start their engines and await orders on the tarmac. U.S. ICBMs were prepared for launch as well. Fortunately, people realized that something was amiss—the number of incoming missiles fluctuated wildly with no clear pattern of attack. A threat assessment conference dismissed the signals as spurious, and the bomber and missile crews were told that the alert had ended. An investigation revealed that a computer chip failure had led to the erroneous readings [29]. However, if the spuriousness of the warning had not been realized within the roughly 30 minutes it takes for a missile to fly from the Soviet Union to the United States, the U.S. President might have decided to launch missiles at the Soviet Union.

A Modern Nuclear Accident

As more digital components are built into the nuclear command and control network and the weapons themselves, the potential for cyber-related problems will increase. New problems will also present themselves as more components become digital or older components are modernized. Computer components introduce more uncertainty into the already complex task of controlling and directing nuclear forces. When combined with the short timescales for decision making put into place by launch-on-warning policies, problems introduced by digital components could lead to unpredictable accidents with catastrophic consequences [30], [31]. Knowing that modern technology makes small errors more likely, it is important to take steps to ensure that these malfunctions do not lead to catastrophe.

Without access to classified data, it is difficult to prescribe specific steps that can increase the overall safety of the system. However, one can argue in general that measures that enhance safety, even if they come at the cost of availability, are desirable from the viewpoint of reducing the chances of accidental or inadvertent nuclear weapons use. One example is to store the nuclear warheads separately from the ballistic missiles that carry them. Such measures would lower the risk of inadvertent or accidental nuclear war.

Lowered risk, of course, doesn’t mean no risk. In the longer term, the only way to eliminate such a risk is to eliminate nuclear weapons altogether.



[1] GAO, “Weapon Systems Cybersecurity: DoD Just Beginning to Grapple with Scale of Vulnerabilities,” United States Government Accountability Office, Washington, D.C., GAO-19-128, Oct. 2018.

[2] “New U.S. Weapons Systems Are a Hackers’ Bonanza, Investigators Find,” The New York Times. [Online]. Available: [Accessed: 12-Oct-2018].

[3] B. G. Blair, “Global Zero on Nuclear Risk Reduction: De-Alerting and Stabilizing the World’s Nuclear Force Postures,” Global Zero, Apr. 2015.

[4] A. Futter, “Hacking the Bomb: Cyber Threats and Nuclear Weapons,” Washington D.C.: Georgetown University Press, 2018, p. 208.

[5] B. Unal and P. Lewis, “Cybersecurity of Nuclear Weapons Systems: Threats, Vulnerabilities and Consequences,” The Royal Institute of International Affairs Chatham House, London, Research Paper, 2018.

[6] P. O. Stoutland and S. Pitts-Kiefer, “Nuclear Weapons in the New Cyber Age,” Nuclear Threat Initiative, Sep. 2018.

[7] A. Avizienis, J.-C. Laprie, B. Randell, and C. Landwehr, “Basic concepts and taxonomy of dependable and secure computing,” IEEE Trans. Dependable Secure Comput., vol. 1, no. 1, pp. 11–33, Jan. 2004.

[8] N. Leveson, Engineering a Safer World. MIT Press, 2011.

[9] “NASA Reveals Probable Cause of Mars Polar Lander and Deep Space-2 Mission Failures.” [Online]. Available: [Accessed: 25-Nov-2018].

[10] CRS, “U.S. Strategic Nuclear Forces: Background, Developments, and Issues,” Congressional Research Service, Washington, D. C., Nov. 2018.

[11] B. Blair, Strategic Command and Control, 1st edition. Washington, D.C: Brookings Institution Press, 1985.

[12] T. S. ICBM Prime team, “Minuteman Weapons System History and Description.” Intercontinental Ballistic Missile (ICBM) System Program Office (SPO), Jul-2001.

[13] B. G. Blair, “Mad Fiction,” Nonproliferation Rev., vol. 21, no. 2, pp. 239–250, Apr. 2014.

[14] Minuteman Systems Engineering, “WS-133A-M Upgrade Wing III and V: Integrated Minuteman Command and Control Systems,” Boeing Aerospace Company, Seattle, WA, D2-19944–1, Feb. 1978.

[15] L. J. Borja, “The Price of Modernization: Cyber Vulnerabilities in the Nuclear Arsenal,” in preparation.

[16] “Team Malmstrom first to accomplish missile guidance system replacement,” Malmstrom Air Force Base. [Online]. Available: [Accessed: 18-Nov-2018].

[17] “Minuteman III Guidance and Propulsion Replacement Programs (GRP),” The Office of the Director, Operational Test and Evaluation (DOT&E), Annual Report, Jan. 2001.

[18] “Minuteman III Guidance and Propulsion Replacement Programs,” The Office of the Director, Operational Test and Evaluation (DOT&E), Annual Report, Jan. 2003.

[19] N. Leveson, Engineering a Safer World. MIT Press, 2011.

[20] H. M. Kristensen and R. S. Norris, “United States nuclear forces, 2018,” Bull. At. Sci., vol. 74, no. 2, pp. 120–131, Mar. 2018.

[21] “Information Technology: Federal Agencies Need to Address Aging Legacy Systems,” US Government Accountability Office, GAO-16-46, May 2016.

[22] S. Bhasin and F. Regazzoni, “A survey on hardware trojan detection techniques,” in 2015 IEEE International Symposium on Circuits and Systems (ISCAS), 2015, pp. 2021–2024.

[23] “ICBM - EMD FY2013,” Air Force, Exhibit R-2, RDT&E Budget Item Justification 0604851F, Feb. 2012.

[24] Commander of the Air Force Global Strike Command, “Intercontinental Ballistic Missile (ICBM) Operational Test and Evaluation (OT&E),” Air Force, Test and Evaluation 99–102, Mar. 2011.

[25] C. Graham, “NHS cyber attack: Everything you need to know about ‘biggest ransomware’ offensive in history,” The Telegraph, 13-May-2017.

[26] Z. Mian, M. V. Ramana, and R. Rajaraman, “Plutonium dispersal and health hazards from nuclear weapon accidents,” Curr. Sci., vol. 80, no. 10, pp. 1275–1284, 2001.

[27] E. Schlosser, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, Reprint edition. New York: Penguin Books, 2014.

[28] D. Hoffman, “The Fear Factor,” Foreign Policy, 08-Jul-2010.

[29] “False Alarms in the Nuclear Age — NOVA | PBS.” [Online]. Available: [Accessed: 27-Aug-2018].

[30] C. Perrow, Normal Accidents: Living with High-Risk Technologies, Revised ed. Princeton, NJ: Princeton University Press, 1999.

[31] S. D. Sagan, The Limits of Safety: Organizations, Accidents, and Nuclear Weapons, 1st ed. Princeton University Press, 1995.

These contributions have not been peer-refereed. They represent solely the view(s) of the author(s) and not necessarily the view of APS.