
My suspicion is that they (HN) are very concerned this technology is pushing hard into their domain expertise and feel threatened (and rightfully so).




While it will suck when that happens (and inevitably it will), that time is not now. I'm not one to say LLMs are useless, but they aren't everything they're being marketed as.

Or they might know better than you. A painful idea.

Painful? What's painful about someone having a different opinion? I think that's healthy.


