AI safety guru: “Everyone will die” if we build superintelligence

Eliezer Yudkowsky makes the case for an international treaty to shut down AI development before it ends humanity.