OpenAI scientists wanted “a doomsday bunker” before AGI surpasses human intelligence and threatens humanity
Former OpenAI chief scientist Ilya Sutskever reportedly expressed concerns about AGI, proposing “a doomsday bunker” for researchers to shelter in during an unprecedented disaster.