AI Nuclear

7 bookmarks
Thinking Machines Will Change Future Warfare
Until now, deterrence has been about humans trying to dissuade other humans from doing something. But what if the thinking is done by AI and autonomous systems? A wargame explored what happens to deterrence when decisions can be made at machine speeds and when states can put fewer human lives at risk.
·rand.org·
Does automation bias decision-making?
Computerized system monitors and decision aids are increasingly common additions to critical decision-making contexts such as intensive care units, nu…
·sciencedirect.com·
Artificial Intelligence and Nuclear Command, Control, & Communications: The Risks of Integration - EA Forum
(This report is viewable as a Google Doc here.) …
Possible interventions include funding new experimental wargaming efforts, funding research at key think tanks, and increasing international AI governance efforts.
Fortunately, both the U.S. and U.K. have made formal declarations that humans will always retain political control and remain in the decision-making loop when nuclear weapons are concerned.[10]
New and more powerful ML systems could be used to improve the speed and quality of assessment completed by NC3.
The other core reason for focusing on early warning and decision-support systems within NC3 is their susceptibility to, and influence on, the possibility of inadvertent use.
The machine had made the wrong call, and Petrov’s skepticism and critical thinking together likely contributed to preventing Soviet retaliation.
In 1980, NORAD “changed its rules and standards regarding the evidence needed to support a launch on warning.”
In one sense these stories demonstrate that even in the face of complex and flawed systems, organizational safety measures can prevent inadvertent use. However, they also speak to the frightening ease with which we arrive at the potential brink of nuclear use when even a small mistake is made.
Given their history with false positives, anything that impacts early warning and decision-support systems requires the utmost scrutiny, even if the change stands to potentially improve the safety of such systems.
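(Note to self, not from the report: the reason false positives dominate here is base-rate arithmetic. A minimal sketch with made-up numbers, showing why an alarm for a very rare event is still most likely a false alarm even when the sensor itself looks reliable:)

```python
# Illustrative only: toy Bayes calculation. All numbers are assumptions,
# not figures from the linked report.

def posterior_attack(prior, hit_rate, false_alarm_rate):
    """P(attack | alarm) via Bayes' rule."""
    p_alarm = hit_rate * prior + false_alarm_rate * (1 - prior)
    return hit_rate * prior / p_alarm

prior = 1e-5             # assumed prior probability of a real attack on a given day
hit_rate = 0.99          # assumed P(alarm | real attack)
false_alarm_rate = 1e-3  # assumed P(alarm | no attack)

print(posterior_attack(prior, hit_rate, false_alarm_rate))  # ~0.01
```

(With these assumed numbers the posterior is only about 1 percent, which gestures at why Petrov-style skepticism and tighter evidence standards for launch on warning matter so much.)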
Furthermore, current NC3 systems are aging, and the last major update was during the 1980s.[2]
Not only will modernized NC3 incorporate ML, but there is a real risk of rushed integration with a higher risk tolerance than is normally accepted.
Humans will always be in the loop, but isn't this just a beefed-up version of the original problem?
·forum.effectivealtruism.org·
Rethinking Nuclear Deterrence in the Age of Artificial Intelligence - Modern War Institute
Editor’s note: The following is based on an article by the author recently published in Defense & Security Analysis, entitled “Deterrence in the Age of Artificial Intelligence & Autonomy: A […]
AI-augmented cyber tools’ machine speed could enable an attacker to exploit a narrow window of opportunity to penetrate an adversary’s cyber defenses or use advanced persistent threat tools.
swarms of robotic systems fused with AI and machine-learning techniques may presage a powerful interplay of increased range, accuracy, mass, coordination, intelligence, and speed in a future conflict.
susceptibility to invest in sunk costs, skewed risk judgment, heuristics, and groupthink
·mwi.usma.edu·