I recently went to a social interest group meeting. Such meetings begin with a presentation, which the organizer generally confirms in advance based on a text abstract submitted by the prospective presenter. This time, the presentation turned out to be staggeringly bad, consisting mostly of filler words and interjections, and the presenter was struggling to express even the basic idea. I could not fathom how that could happen, considering the idea was presumably expressed in the abstract. I had to write one myself before, and it felt annoying and difficult to write down what seemed all ready to be said in my head, but as a result I knew what I wanted to say and how I could say it once in front of a dozen people.
At some point, the organizer (to help the situation) quoted from the abstract that had been submitted, and it was apparently decently written. After that, I could not help thinking that the likelihood the abstract was written by an LLM is very high. In that case, the presenter certainly reviewed it, but crucially did not write it—the thought process that makes the idea part of your active vocabulary, and makes you capable of expressing it, would not have taken place.
To reiterate, I don’t know whether that is what happened, but even if no LLM was involved in this instance (perhaps it was just a particularly severe case of stage fright, despite the event being very small and the IRL vibe extremely casual), it would be beyond silly to assume this will not be happening going forward.
I used to think it would be beneficial if an LLM could handle certain boring signaling communication for people who are very bad at it, acting as a sort of equalizer. A model writes the text, you approve it, and you gain access somewhere without having, say, the written-language flair of someone who went to a prestigious school, or having to spend time on something that seems unnecessary.
I am changing my mind now. Sure, the case I have described is one of the more extreme ones, but it made me think about how signals are signals for a reason[0], and when some signals go away the communication field does not become equalized. Instead, other signals and barriers take their place: money, IRL meetings, invitations, some privacy-violating invasive check of humanity, and so on; or the communication that used to rely on those signals simply stops happening. When the Web and tech in general removed a whole lot of constraints on communication, we could still rely on those signals, but that is apparently coming to an end.
Writing a birthday card is another endangered signal. The impression from the movies, where such gestures become very cheap when they come from some rich CEO who certainly has a personal assistant for this sort of thing, now applies to everyone (including people who would never touch an LLM with a ten-foot pole). Once we all know that a birthday card can be reduced to “I have read and approved this message”, we, as social beings, won’t stop needing the psychological impact of such positive gestures—we will simply stop receiving it.
I am not sure I see all of the above as positive (even if in the latter case I am slightly optimistic that some viable substitute for those signals could be found within personal relationships).
[0] Even when that reason is dubious, like discrimination by social criteria, it is not going to go away. Tech is not going to solve that human problem, aside from perhaps a very fleeting handful of months when some techies gain an edge while everyone else is still catching on. New barriers will be erected, and the core problem will be left unaddressed.