Yes, I'm experiencing this in the form of 'suggestion' buttons.
It's so clearly trained on my own replies that it parrots stuff I've said, but it tends to get the sense of the words literally backwards. If I used it, I would be telling my fans the opposite of what I actually meant, or making various other catastrophically not-even-wrong assertions. It really, really is not figuring out what it's being fed. Sometimes I let people in on what the AI is suggesting I say.
It's not actually wresting the controls from my hands and talking to my fans AS me… yet.
> It's so clearly trained on my own replies that it parrots stuff I've said, but it tends to get the sense of the words literally backwards and wrong.
"Yeah back then I was weirded out by their now standard response prediction thing, it was always just one or two words off exactly what I wanted to reply (usually missing/adding a negation, or substituting a word with an antonym)"[1]
Feels like that ended with an AI escape, when I was hoping the punchline would be that all the apparently happy, well-adjusted, coherent users were in fact AIs trained in the pre-apocalypse, and the insane, sick, repetitive people were the real users living in the post-apocalypse.