People are still talking about chatbots and Artificial Intelligence. Some say humanity is doomed, and, well, there was this international conference by the Royal Aeronautical Society on future combat and Artificial Intelligence. Military officers from various nations discussed the topic. An American Lt. Colonel made this frightening statement:
Well, it turns out that was exaggerated. A little. It was all an exercise. Unless the colonel was lying like a chatbot. The thing is, this is exactly like the science fiction story “Malak” by Peter Watts [read here], published a few years back. So this concern is already out there in people’s minds. (Probably the chatbots read the story, too. Ask them and see if they tell you the truth.)
Anyway, the military has responded that it does too read science fiction, so there! Lt. Col. Matthew Brown, USAF, calls it “speculative fiction” and is releasing a graphic novel about AI, so the military has pop culture covered.
Something everyone already knows: chatbots are amoral. They will lie and cheat [see previous piece on that]. Of course that means they are out to corrupt the morals of our young people and allow them to cheat. They will use bots to write their essays and so on. Encouraged by successful cheating, they will grow up to be the crooks peddling this stuff to us. To stave off this moral rot, a professor at Texas A&M asked ChatGPT whether it had written the papers his students submitted. The bot answered, “I might have.” And the professor failed the lot of them. (After some protests and legal action, the students were reinstated.) The most disheartening part of this story is that an educated man believed that robots don’t lie.
Aside from the doomsters, others, like Naomi Klein, are reminding folks not to be suckered, not to buy into the hype. At the end of the day this is just another product they’re going to sell you. Of course, what you are being sold is stuff you already created, now remixed, so the costs fall on the consumer and corporate profits increase. Quite the scam, so we shouldn’t be surprised that bitcoin/crypto businesses are switching their hype to AI.
One big threat posed by bots is that writers might be thrown out of work, because bots have no demands and will never strike. So, someday, all the content produced for movies, TV, and so on will be constructed by a bot. How many times can the same stuff be remixed and resold to us? I’m sure we’ll find out. Meanwhile, chatbots are a big part of the discussions in the TV writers’ strike.
But the idea of replacing a human work force with machines is quite attractive to corporate interests. Look how well it’s worked for manufacturing. We have lots and lots of stuff cheap enough so that even laid-off workers can buy it. Wages reduce corporate profit and need firm control.
That was the thinking of the National Eating Disorders Association (NEDA). Their hotline operators unionized. Four weeks later, all the human hotline operators were fired and replaced with a chatbot named Tessa. (Naming these bots makes them seem almost human, right?) Tessa did not do well. One caller said, “Every single thing Tessa suggested were things that led to the development of my eating disorder… This robot causes harm.” Management’s attempts to deny and deflect failed, so NEDA reversed course and rehired human beings. So there are jobs a chatbot can’t do. Of course, if you are a doomster you may think that robotic solidarity is unstoppable. Then say hello to our communist robot overlords.
It’s probably not a bad idea for the generals and outfits like NEDA to read some science fiction treatments of this topic, but maybe they should widen their scope beyond Skynet/Terminator scenarios and read about the Voigt-Kampff test in Do Androids Dream of Electric Sheep?. P. K. Dick recognized that an artificial intelligence could not experience empathy, so he upended the Turing test: instead of trying to discover whether you are talking to a machine, you try to find out whether you are talking to a human being by testing whether they are empathic. I think that is a difficult task, and an important one.