

The Geneva Convention, artificial intelligence and the future of warfare


‘Doom mongering makes people glassy eyed…(but) those who dismiss catastrophe are, I believe, discounting the objective facts before us. After all, we are not talking about the proliferation of motorbikes and washing machines here’ – Mustafa Suleyman, co-founder of DeepMind and author of The Coming Wave: Technology, Power, and the Twenty-first Century’s Greatest Dilemma.

AI has rapidly mutated into use cases that seemed impossible only a few years ago, and the need to constrain it has gained urgency while we gather our wits and decide on matters of ethics and safety. As the scenario unfolds, a new and unsettling thought is taking root.

The thought concerns what the militaries of various countries are doing with AI and the fact that they care about very different use-cases from the rest of us.

Before probing this unsettling thought, we should probably wave a nostalgic goodbye to the hard-won and then much-ignored Geneva Conventions of the past 160 years. The first Convention was signed in 1864, with attending nations all committing to the principle of humane treatment of sick and wounded soldiers on the battlefield. Three more treaties followed: first, shipwrecked armed forces were added (1906), then prisoners of war (1929) and, finally, civilians in war zones (1949), the last in response to the atrocities of World War 2. A few more protocols have since been added, but the 1949 document is the core.

Today’s geopolitical landscape is a sad testament to this entirely noble, now largely unsuccessful, initiative. Even the most superficial sweep of recent and current conflicts in Africa, the Middle East, East Timor, Ukraine, not to mention the brutal crushing of internal dissent in North Korea and by China at home and abroad, makes depressing reading. The Geneva Convention is disregarded for the most part (there are no consequences for breaching it, other than tut-tutting by the UN), leaving those countries that do try to follow the rules looking like outliers, even suckers. 

We seem to regard this horror as regrettable at most. We either avert our eyes, or gaze upon the carnage, despair a bit and then go about our business. And, in some cases, cruelty even becomes acceptable, as in a study (quoted here) which concluded that one in four South Africans think that rape is sometimes justified as a weapon of war. 

Which brings us to AI. 

Much of the commentary around the daily innovations in the field of AI is fuelled by publicly available information. And there is much of that. It is a huge and chaotic market square ringing with loud-hailers, braggadocio and intellectual finery of all kinds on show. Announcements about algorithms, research papers, company valuations, start-ups, billionaires newly minted, novel devices, clever applications and, of course, much soothsaying about things-to-come, both good and bad. 

But in military and defence establishments it is a different story. Work is going on at a furious pace, quietly, in near-hermetic secrecy and with almost unlimited funding, in the US and China as much as in Russia, Korea, Israel, India, Iran, Saudi Arabia and Europe. And I would submit that the Geneva Convention is not much discussed in the labs where new weapons of war are dreamt up. Only two questions can be at stake for defence establishments worldwide: how will our country survive if we lose this race, and how may an adversary be overpowered or eliminated – quickly, cheaply and completely – using this new tool?


Whispers are now emerging: rumours of great advances in low-cost autonomous offensive weaponry. For example, cheap AI “drone swarms” – thousands of $1,000 drones acting in concert to evade countermeasures. Targeted AI-designed bioweapons capable of disabling or killing millions in days, the antidote available only to their country of origin. Terminator-like robot soldiers, trundling through city streets demolishing buildings at a fraction of the cost and with more accuracy than human soldiers. AI-created software bots designed to shut down water supplies, electricity, telecoms and supply chains, bringing a country to its knees within weeks.

If all this sounds familiar, it should. These sorts of weapons are the well-worn tropes of 1,000 science fiction TV shows, books and movies. So here is the unsettling thought. Whatever is being cooked up in the satanic mills of weapons design is likely to be beyond anything any of us can imagine. The capabilities of AI, focused by a government anxious about its own vulnerability, must surely produce dark experiments which will result in unprecedented misery if unleashed. 

There is, to be fair, considerable pushback against this view. Analysts talk brightly about imbuing AI weapons with moral imperatives to minimise human suffering, about surgically targeting adversaries only, about deploying clever AI-controlled defensive barriers. However, the entire momentum of autonomous weapons development necessarily involves a reduction of human judgement in their operation and, in the worst scenarios, there is a possibility (some will say a certainty) that an AI weapon will one day diverge from its original intent, triggered by a bug in the code, a hack, or (worse still) by surreptitiously setting its own goals.


Consider this, from retired Air Force Lieutenant-General David Deptula: “It’s important to remember that the enemy gets a vote. Even if we stopped autonomy research and other military AI development, the Chinese and to a lesser degree Russians will certainly continue their own AI research. Both countries have shown little interest in pursuing future arms control agreements.”

Or this, from retired US Air Force General Charles Wald: “I don’t believe the US is going to go down the path of allowing things… where you don’t have human control. But I’m not sure somebody else might not do that.”

In the zero-sum game of warfare it is this fear that drives both strategy and tactics. And you can be sure there will be no dog-eared copies of the Geneva Convention being trawled for moral guidance. DM

Steven Boykey Sidley is a professor of practice at JBS, University of Johannesburg. His new book It’s Mine: How the Crypto Industry is Redefining Ownership is published by Maverick 451 in South Africa and Legend Times Group in the UK/EU, available now.

