World Leaders Keep Asking Chatbots For Advice

Photo: Hollandse Hoogte/Shutterstock

On December 3, 2024, South Korean President Yoon Suk Yeol declared martial law, hurling his country into an ongoing political crisis that has already resulted in his impeachment and arrest. In their effort to reconstruct the events of the day, investigators uncovered a strange detail: Hours before the declaration, the president’s head of security was asking ChatGPT about coups. From Korean paper The Hankyoreh, in translation:

According to the police and prosecution on the 18th, the police’s application for an arrest warrant submitted to the Seoul Western District Prosecutors’ Office reportedly included content showing that Chief Lee searched for “martial law,” “martial law declaration,” and “dissolution of the National Assembly” at around 8:20 p.m. on December 3rd of last year…. [W]hen Chief Lee searched for the word, the State Council members had not yet arrived at the Presidential Office.

This is farcical, sure, and the coup didn’t work out. It also doubles as a deranged ad for the service. Is your boss declaring martial law? Is he communicating poorly and failing to provide clear direction about what happens next? Need someone, or something, to talk to about a sensitive subject? Not getting what you need from a Google search for “martial law Korea”? Try ChatGPT!

Meanwhile, in the United Kingdom, a freedom of information request filed by a reporter for New Scientist produced a comparatively tame — but also sort of embarrassing — related story:

The UK’s technology secretary, Peter Kyle, has asked ChatGPT for advice on why the adoption of artificial intelligence is so slow in the UK business community – and which podcasts he should appear on… ChatGPT returned a 10-point list of problems hindering adoption, including sections on “Limited Awareness and Understanding”, “Regulatory and Ethical Concerns” and “Lack of Government or Institutional Support”. The chatbot advised Kyle: “While the UK government has launched initiatives to encourage AI adoption, many SMBs are unaware of these programs or find them difficult to navigate. Limited access to funding or incentives to de-risk AI investment can also deter adoption.”

Here we have a different sort of farce: a guy whose job involves ingesting and repeating a bunch of anodyne conventional wisdom, getting what he needs from a sort of conventional-wisdom machine. As in the case of the martial law chat, the use of AI here is mostly notable for its novelty. People in power, with access to a wide range of rare resources, also Google like the rest of us, and now some of them use chatbots, too.

Still, these stories tell us things. One is about chatbot products themselves, which, however else their users understand them, are commercial web services that record what their users do. No less than chats with a person, or search logs, they produce evidence. We can expect ChatGPT and similar products to make more cameos in world events, but also in criminal and civil court. The stories also suggest a particular sort of chatbot user who doesn’t show up so often in conversations about AI adoption, which tend to focus on phenomena like homework help and/or cheating, office productivity and programming, and routine Google replacement.

The Kyle example in particular brings to mind a conversation I had early last year with a media executive — the sort of person who is used to asking things of people professionally gathered around him — who gushed about AI and how it had changed his life. Asked how, he said, with a hint of self-deprecation, that it was simple: He could ask it about anything, it would spit out a nice clean paragraph, and then he could walk into any room and pretend to know what he was talking about.

For most of us, having access to a chatbot that can effectively, or at least plausibly, answer a wide range of questions is strange and new. For a certain class of powerful people, chatbots don’t provide something novel but rather a pocketable version of something familiar: a constantly available — and maybe flattering, and maybe obsequious — assistant. Like a human assistant, a chatbot can make you look good, or maybe get you out of a bind (but, so far, not through a coup). Also like a human assistant, it keeps notes. You know, just in case.