'Humanity's remaining timeline? It looks more like five years than 50': meet the neo-lu...
A few weeks back, in January, the largest-ever survey of AI researchers found that 16% of them believed their work would lead to the extinction of humankind.
“That’s a one-in-six chance of catastrophe,” says Alistair Stewart, a former British soldier turned master’s student. “That’s Russian-roulette odds.”
What would the others have us do? Stewart wants a moratorium on the development of AIs until we understand them better, until those Russian-roulette odds improve. Yudkowsky would have us freeze everything today, this instant. “You could say that nobody’s allowed to train something more powerful than GPT-4,” he suggests. “Humanity could decide not to die and it would not be that hard.”