Three key misconceptions in the debate about AI and existential risk
July 15, 2024
Jack Kelly, AI Research Fellow at PSR | Bulletin of the Atomic Scientists
Arguing about the relative importance of existential threats ignores the fundamental truth that any credible existential threat is one too many and that all must be addressed in parallel. If you accept that AI might pose an existential threat, then addressing that threat should be a societal priority, even if you are more concerned about another issue.
More Campaign Updates
Why Isn’t the Nuclear Threat a 2024 Campaign Issue?
Bob Dodge (center) with Back from the Brink supporters at the March to Abolish Nuclear Weapons.
Robert Dodge, MD, PSR Board Member | Common Dreams...
The Extortionist’s Doctrine
Elaine Scarry, PhD, PSR Board Member | Boston Review
Framed wholly as defensive and preventative (and from day to day, largely successful in deflecting our...
AI in Nuclear Weapons
Artificial Intelligence (AI) is a disruptive and powerful technology that is being rapidly integrated into U.S. military infrastructure. Through an examination of some of the...