Re the Center for AI Safety's statement:

>Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.

It's fair enough, but toward that end you should probably encourage experimenting with present-day open source AI software. It's a long way off from extinction-risk AGI, but it would let us start figuring out how to deal with this stuff.
