LessWrong (Curated & Popular)

"Reevaluating AGI Ruin in 2026" by lc

This episode reevaluates Eliezer Yudkowsky's 2022 essay "AGI Ruin: A List of Lethalities," which outlines 43 reasons why the creation of artificial general intelligence could lead to human extinction. It also considers Paul Christiano's response, "Where I Agree and Disagree with Eliezer."
