
Oh come on, the entire bill was against open source models, it’s pure business. “AI safety”, at least of the X-risk variety, is a non-issue.


> “AI safety”, at least of the X-risk variety, is a non-issue.

I have no earthly idea why people feel so confident making statements like this.

At the current rate of progress, you should have absolutely massive error bars on what capabilities will look like in 3, 5, or 10 years.
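To make the error-bar point concrete with some toy arithmetic (the annual improvement rates below are hypothetical, chosen purely for illustration): even modest disagreement about the yearly rate of progress compounds into wildly different outcomes at the 3-, 5-, and 10-year marks.

    # Hypothetical annual capability multipliers -- illustrative only.
    for annual_gain in (1.2, 1.5, 2.0):
        projections = {years: round(annual_gain ** years, 1) for years in (3, 5, 10)}
        print(f"{annual_gain}x/year -> {projections}")
    # 1.2x/year compounds to ~6x over 10 years; 2.0x/year to 1024x.
    # The spread between those endpoints is the "massive error bars".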


I am not sure we will be able to build something smarter than ourselves, but I sure hope for it. It is becoming increasingly obvious that we as a civilization are not that smart, that there are strict limits on what we can achieve with our biology, and it would be great if at least our creations could surpass those limits.


Sure, but we should heavily focus on doing it safely.

We can already build machines, using similar techniques, that are superhuman at narrow capabilities like chess and as good as the best humans in some narrow disciplines of math. I think it is not unreasonable to expect this to generalize.


Nuclear weapons, at least in the quantities currently stockpiled, are not an existential risk even to industrial civilization, never mind the human species. To claim that in 10 years AI will be more dangerous and consequential than the weapons that ushered in the Atomic Age is quite a leap.


Viruses are just sequences of RNA/DNA, and we are already seeing that transformers have extreme proficiency at sequence modeling.

In roughly 10 years we went from AlexNet to GPT-2 to o1. If future capabilities make it so that any semi-state actor with a lab can build a deadly virus (and this is only one of MANY entirely plausible scenarios), then we will likely have matched the destructive potential of the atomic age. And that’s just the stuff I can anticipate.
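Purely to illustrate the sequence-modeling point above, here is a minimal sketch of a character-level Transformer treating a DNA string as next-token prediction over the four nucleotide symbols. Everything in it (model size, the toy sequence, the single training step) is invented for the example; it bears no resemblance to any real genomic model, it just shows why the architecture transfers so directly.

    # Toy next-base prediction over A/C/G/T with a causal Transformer (PyTorch).
    import torch
    import torch.nn as nn

    VOCAB = {c: i for i, c in enumerate("ACGT")}

    class TinyDnaLM(nn.Module):
        def __init__(self, d_model=64, nhead=4, nlayers=2, max_len=128):
            super().__init__()
            self.embed = nn.Embedding(len(VOCAB), d_model)
            self.pos = nn.Embedding(max_len, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, nlayers)
            self.head = nn.Linear(d_model, len(VOCAB))  # next-base logits

        def forward(self, tokens):  # tokens: (batch, seq)
            seq = tokens.shape[1]
            x = self.embed(tokens) + self.pos(torch.arange(seq, device=tokens.device))
            mask = nn.Transformer.generate_square_subsequent_mask(seq).to(tokens.device)
            return self.head(self.encoder(x, mask=mask))  # (batch, seq, 4)

    # One gradient step: predict each base from the bases before it.
    seq = "ACGTACGGTACC"
    ids = torch.tensor([[VOCAB[c] for c in seq]])
    model = TinyDnaLM()
    logits = model(ids[:, :-1])
    loss = nn.functional.cross_entropy(logits.reshape(-1, 4), ids[:, 1:].reshape(-1))
    loss.backward()
    print(f"toy next-base loss: {loss.item():.3f}")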


To make a deadly virus, you need just a degree in bioengineering and about $1M of equipment; the information is readily available. If you are on a budget, you can source the equipment nearly for free at lab clearance sales and the like. It is 2024. Children do their first genetic engineering projects in _high school_. You can buy a CRISPR kit _right now_ for about $150.

That ship sailed long ago.


That is not even remotely true for a novel viral sequence that would be very dangerous.

And it’s funny that we’ve gone from “nothing is more dangerous than the atomic bomb” to “anyone with a bio degree and $1 million is more dangerous than the atomic bomb”.


> That is not even remotely true for a novel viral sequence that would be very dangerous.

We don't need fully novel sequences. The original Spanish Flu strain’s sequence is published and well known. And you can always modify existing sequences in well-characterized ways (e.g. as in this infamous paper: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC114026/). And yes, it is more difficult to build an atomic bomb, as the materials are thankfully expensive and hard to obtain.


I find it hard to believe that Google, Microsoft and OpenAI would oppose a bill against open source models.



