
> HN is weird because at one point this was where I found the most technical folks, and now for LLMs I find them at Reddit.

Is this an effort to chastise the viewpoint advanced? Because his viewpoint makes sense to me: I can run biggish models on my 128GB MacBook, but not huge ones; even 2-bit quantized versions eat too many resources.
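
To put rough numbers on the memory claim (my own back-of-the-envelope math, not the parent's): a ~400B-parameter model quantized to 2 bits per weight still needs on the order of 100 GB just for the weights, before the KV cache and the OS take their share, so it doesn't comfortably fit on a 128GB machine, while a 70B model at 4 bits does.

    # Rough sketch (illustrative numbers): weight memory for a model
    # at a given quantization level, ignoring KV cache and overhead.
    def weight_gb(params_billion: float, bits_per_weight: float) -> float:
        total_bytes = params_billion * 1e9 * bits_per_weight / 8
        return total_bytes / 1e9

    print(weight_gb(405, 2))  # ~101 GB -- tight on a 128GB Mac once cache/OS are counted
    print(weight_gb(70, 4))   # ~35 GB  -- the "biggish" range that runs comfortably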

So I run a combination of local and remote models depending on various factors (cost, sensitivity of the information, convenience/whether I'm at home, how much battery is left, etc.), roughly as sketched below.
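
A minimal sketch of that kind of routing, assuming the factors listed above; the function name, arguments, and thresholds are mine and purely illustrative:

    # Illustrative only: pick a backend from the factors mentioned above.
    def pick_backend(sensitive: bool, at_home: bool, battery_pct: int, budget_ok: bool) -> str:
        if sensitive:
            return "local"    # private data never leaves the machine
        if not at_home or battery_pct < 20:
            return "remote"   # no beefy box nearby, or save the battery
        return "remote" if budget_ok else "local"

    print(pick_backend(sensitive=False, at_home=True, battery_pct=80, budget_ok=True))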

Yes, bigger models are better, but often smaller is good enough.


