Ilya Sutskever, OpenAI's ex-chief scientist, planned a doomsday bunker for the day machines become smarter than man

Sutskever believed his AI researchers would need protection once they ultimately achieved their goal of creating artificial general intelligence, or AGI.