
Related:

Teen in love with chatbot killed himself – can the chatbot be held responsible?

https://news.ycombinator.com/item?id=45726556



How is it possible to develop emotional connection to anyone with a goldfish memory? I'd be surprised if context size there is more than 50K tokens.


Most AI companions have memory systems. They're still quite simplistic, but it's not "goldfish memory", and recall isn't limited to the context window.

From what I understand, some inputs from the user will trigger a tool call that searches the memory database, and other times the system retrieves a memory at random.
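A minimal sketch of that retrieval pattern, assuming a keyword search stands in for the tool call (all names here, like `MemoryStore` and `recall`, are illustrative, not any real companion's API):

```python
import random
from typing import Optional

class MemoryStore:
    """Hypothetical companion memory: keyword recall, occasionally random."""

    def __init__(self, random_recall_rate: float = 0.1, seed: Optional[int] = None):
        self.memories: list[str] = []
        self.random_recall_rate = random_recall_rate
        self.rng = random.Random(seed)

    def remember(self, fact: str) -> None:
        self.memories.append(fact)

    def recall(self, user_message: str) -> Optional[str]:
        if not self.memories:
            return None
        # Occasionally surface a random memory, as described above.
        if self.rng.random() < self.random_recall_rate:
            return self.rng.choice(self.memories)
        # Otherwise, a crude word-overlap search standing in for the
        # tool call that queries the memory database.
        words = set(user_message.lower().split())
        scored = [(len(words & set(m.lower().split())), m) for m in self.memories]
        score, best = max(scored, key=lambda s: s[0])
        return best if score > 0 else None
```

Real systems use embeddings rather than word overlap, but the shape is the same: user input in, a handful of stored "facts" out, prepended to the prompt.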

With that said, I think people started falling in love with LLMs before memory systems and they probably even fell in love with chatbots before LLMs.

I believe that the simple, unfortunate truth is that love is not at all a rational process, and your body doesn't need much input to produce those feelings.


People were developing an emotional connection to the ELIZA program in the late 1960s [0], the code for which would have contained fewer characters than the discussion on this page.

0. https://www.theguardian.com/technology/2023/jul/25/joseph-we...


People got attached to a bot that only asked questions.
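The whole trick fits in a few lines. A toy ELIZA-style responder in that spirit (the patterns and pronoun swaps below are illustrative, not Weizenbaum's original script):

```python
import re

# Swap pronouns so the user's words can be reflected back at them.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

# (pattern, question template) pairs; the bot never answers, only asks.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(reflect(m.group(1)))
    return "Why do you say that?"
```

`respond("I feel lonely")` turns the statement straight back into "Why do you feel lonely?" — no model, no memory, just pattern matching, and people still confided in it.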


People are lonely and desire connection.


A woman married a bridge in 2013.



