Chatbots are bullshit engines built to say things with incontrovertible certainty and a complete lack of expertise. No wonder the tech elite jokingly call them “mansplaining as a service.” And now they’re poised to drive the primary way humans acquire knowledge day to day.
This is where I suspect a chatbot’s ability to generate prose, as opposed to a list of useful links, gets dangerous. People transmit beliefs socially, through language. And when a lot of us share a belief system, we form a more cohesive and harmonious group. But that’s a hackable system. Because ideas that are communicated well — using the right words and phrasing and tone — can seem more convincing. The bots use the first-person “I,” even though there’s no person. To a casual reader, Bard’s and Sydney’s answers will come across as human enough, and that means they’ll feel that much truer.
Rogers, A. (2023, February 16). ChatGPT is a robot con artist, and we’re suckers for trusting it. Business Insider. https://www.businessinsider.com/ai-chatbot-chatgpt-google-microsofty-lying-search-belief-2023-2