
I believe in dead salmon, they do.




Thank you for the giggle; I misread this as a statement of faith and a non sequitur.

I had an fMRI and also believe in dead salmon now. It's a common side effect, but it's worth it for the diagnostic data they get.

Yeah, it really needed the comma on the left side of the parenthesis.

They cause hallucinations in dead salmon? I find that hard to believe.


I'm not 100% sure I'd call that a hallucination, but it's close enough and interesting enough that I'm happy to stand corrected.

When improper use of a statistical model generates bogus inferences in generative AI, we call the result a "hallucination"...
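To make that concrete, here's a toy simulation of the dead-salmon effect (the voxel and scan counts below are invented, not the study's actual numbers): test enough pure-noise voxels at an uncorrected threshold and some of them will "light up" anyway.

    # Toy demo of the multiple-comparisons problem: thousands of
    # pure-noise "voxels", no real signal anywhere (a dead fish).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_voxels, n_scans = 130_000, 30   # made-up but plausible sizes
    noise = rng.standard_normal((n_voxels, n_scans))

    # One-sample t-test per voxel against a true mean of zero.
    t, p = stats.ttest_1samp(noise, 0.0, axis=1)

    alpha = 0.001                     # typical uncorrected threshold
    print("'active' voxels, uncorrected:", int((p < alpha).sum()))
    # Expect ~ n_voxels * alpha = 130 false positives from pure noise.
    # A Bonferroni-corrected threshold wipes them out:
    print("survive Bonferroni:", int((p < alpha / n_voxels).sum()))

That's the whole punchline of the salmon poster: with no correction, even a dead fish shows "activation"; with correction, nothing does.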

It should have been called confabulation; hallucination is not the correct analog. Tech bros simply used the first word they thought of, and it unfortunately stuck.

"Undesirable output" might be more accurate, since there is absolutely no difference in the process that creates a useful output vs. a "hallucination" other than the utility of the resulting data.
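To put that in code, here's a toy decoding step (the vocabulary and logits are invented for illustration). The softmax-and-sample mechanism is identical whether the token it emits happens to be true or not; nothing in the loop knows about truth.

    import numpy as np

    rng = np.random.default_rng(42)
    vocab = ["Paris", "Lyon", "Atlantis"]   # hypothetical next tokens
    logits = np.array([2.0, 1.0, 0.5])      # whatever the model computed

    def sample(logits, temperature=1.0):
        # The same softmax + categorical draw every step; no notion
        # of "truth" enters the computation anywhere.
        z = logits / temperature
        probs = np.exp(z - z.max())
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    for _ in range(5):
        print(vocab[sample(logits)])
    # Mostly "Paris", occasionally "Atlantis": one mechanism, and only
    # our judgment of the output labels a sample a "hallucination".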

I had a partially formed insight along these lines: that LLMs exist in this latent space of information that has so little external grounding. A sort of dreamspace. I wonder if embodying them in robots will anchor them to some kind of ground-truth source?


Loss of consciousness seems equally unlikely.

True, though an easier mistake to make, I imagine.


