Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization
Eliezer Yudkowsky, a researcher and writer on superintelligent AI, discusses the dangers of AI and its potential impact on human civilization, covering topics such as GPT-4, AGI alignment, and the timeline for AGI.