>Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.
Fair enough, but to that end you should probably encourage mucking about with present open-source AI software. It's a long way off from extinction-risk AGI, but it would let us start figuring out how to deal with this stuff.