Former OpenAI chief scientist Ilya Sutskever reportedly expressed concerns about AGI, proposing that the company build "a doomsday bunker" where researchers could take refuge in the event of an unprecedented disaster.
OpenAI scientists wanted "a doomsday bunker" before AGI surpasses human intelligence and threatens humanity