
To some degree you're correct -- LLMs can be viewed as the kind of "sufficiently advanced" compiler we've always dreamed of. Our dreams didn't include the compiler lying to us though, so we have not achieved utopia.

LLMs are more like DNA transcription, where some percentage of the time a random mutation gets injected into the transcript, causing either an evolutionary advantage or a terminal disease.

The whole AI industry right now is trying to figure out how to always get the good mutation, and I don't think it can be controlled that way. It will probably turn out that on a long enough timescale, left unattended, LLMs are guaranteed to give your codebase cancer.


