I agree that methods of influencing public opinion have always existed, going back at least to town criers in medieval city squares. But with the arrival of the internet, and later social media, the practice grew to a much larger scale.
The appearance of cheap or even free AI has made it all-encompassing. People don't reread, don't double-check. Blind trust is the main threat.
I started to realize the scale of the shitshow when my friend (an educated and not stupid person) suddenly started using em dashes in personal messages.
Before that, he ignored all punctuation.
This turned into a rant of its own again, so apologies in advance:
These things can all be said about cable television's influence as well: the standardization of accents across America, and the lack of fact-checking against whatever the news says. 9/11 and its consequences and influence on American beliefs are a good example. This sort of influence happens often and is not new to the internet, television, or radio. Even changes in how we handle capitalization in English are a result of the printing press.
I've seen people unknowingly change the way they talk or communicate. This time it's just GPT that's the influence instead of government A, media station B, or book C. This is, and always has been, the case; it isn't even a new phenomenon. Every 10 to 20 years it happens en masse again. "lol" is ubiquitous; my mom and grandma use emojis. Depending on how long you've been around, you've seen a number of these events fundamentally shift all of society to a new way of thinking. Anyone born after the event has no idea that people didn't feel that way before it. I don't find that how it affects people is related to intelligence. How quickly someone is influenced has more to do with their degree of stubbornness or cynicism.
Blind trust is and always has been a thing. In fact, it's arguably a very strong survival mechanism. The strong leader/elder knows how to survive better than the rest, so follow their lead and don't question it. After all, there are hundreds of ways to die immediately by doing something stupid, and keeping yourself alive is hard.
If anything, the main issue with GPT is that it can hallucinate some really, really wrong things and advise people to engage in harmful behavior. LLMs don't have a billion years of evolutionary experience telling them what will or won't kill a living thing. It isn't that people are being unduly influenced by it; the problem is that it has no survival instinct, and humans interact with text as if they were talking to something alive. If another living person told you, with authority, to mix two cleaning agents in your kitchen to get a stain out, you might do it. In reality, any human who would do so would perish and could no longer give that advice. The machine, though, hallucinated the idea and can propagate it without evolutionary consequence to itself.
Fundamentally, that is the issue. Em dashes aren't really impacting the way we communicate negatively. Blind trust is an inbuilt part of human nature; we aren't going to be able to change that. The issue is the LLM itself without safety protocols in place for its safe handling. Honestly, I think it's roughly on par with tightly scheduled drugs: things that manipulate fundamental biological systems outside of normal parameters in ways that can negatively impact people. Think along the lines of morphine. People get addicted to opiates and die from them, spiraling out of control as their bodies fail to cope with artificial manipulation. Yet morphine is also a drug that has prevented a lot of pain and saved lives through the prevention of shock. In a similar manner, LLMs influence the mind in a way that exploits systems not meant to handle such inputs, yet they have the potential to unlock creativity and productivity in individuals beyond what they could normally achieve. They can also cause people to spiral and ruin their lives.
An apt, if funny, comparison is the divide over the portrayal of demons in Frieren. The author states that they've essentially evolved mimicry and just use language as a tool to eat humans. They're exploiting a fundamental biological behavior, much as a virus does. Some people get upset that they're portrayed that way. An LLM functions in a similar manner. It's been designed to artificially mimic conversational behavior, but fundamentally it is just doing matrix mathematics in the background. The consequences to it for a wrong answer are minimal, even if they result in the ultimate negative outcome for a user (death). One should be careful not to equate the ability to mimic speech with having human motivations. Nor should one attribute what is a fundamental human problem to a new technological development.
Blind trust is built into humans. To a degree, it's how we learn. Blind trust is why you believe that the color blue and the word for blue refer to the same thing: you had blind trust in an authority figure (usually a parent). To this day, you haven't really challenged most blind beliefs, and doing so would be a waste of your time. People are social creatures with social influences, and that comes with all the packaging around it. The majority of what we know or learn rests on blind trust.
As for this specifically:

"I started to realize the scale of the shitshow when my friend (an educated and not stupid person) suddenly started using em dashes in personal messages."
I don't think you've actually realized the scale of the shitshow yet, my friend. This has been going on to this degree since ancient times. That's the true scale of the shitshow: it is human nature. Yes, even smart people (oftentimes more so, because you need a lot of blind trust to reach high levels of comprehension in certain fields of study). Always. It's fundamental to life itself (in a true biological sense, not a literary one).