There is a website called thispersondoesnotexist.com. Check it out. Every face you see was constructed by artificial intelligence, and they are indistinguishable from real faces. In fact, what you are seeing is one of the last bastions of truth being creepily disassembled.
The technologies behind this are galloping ahead gleefully. Fake audio clips of famous people talking in their real voices surfaced two years ago from Adobe (the software, called VoCo, was quickly withdrawn) – the technology is now old hat. There are already convincing videos of people saying things that they never said – mouth and tongue movements constructed after the fact. Just take a look at this for instance, for a jolt of horror.
Stare, then, into the dark maw of the future of truth.
While there are still experts (and technologies) that can ferret out falsities, particularly in fake videos, it is a battle that is being lost. As machine learning and related new technologies accelerate, the “just-noticeable-difference” threshold will disappear, and it will be, quite simply, impossible to tell which photograph, which video, which audio clip is the real thing.
Imagine for a moment that you are a person of malicious intent and lots of money (none among this readership, right?). Today, using existing technology, you can create a photograph of your target in an infinite variety of compromising or career-ruining circumstances. A defence of “I was never there”, or “that wasn’t me”, is unlikely to remove a dark smudge of doubt that will hang over the poor victim forever, even in the best of circumstances.
It is all a matter of incentives, and the largest of those incentives dangles in front of high-stakes politics, especially politicians for whom the good old precepts of honesty are quaint concepts. Take the worst of them – Putin, Duterte, Trump, Orban, Maduro, Assad – does one for a moment think that, if presented with a possibility of mortally wounding an opponent, they wouldn’t take it? The rewards could be spectacular, the risks almost negligible. Remember how many Americans were convinced that Hillary Clinton was running a child sex ring out of a pizza parlour, simply from a few tweets? Now imagine the multiplying power of a judiciously constructed photo or voice clip alongside them.
Consider our own backyard. While there is dishonest spin in play across the entire political spectrum, the careful voter can usually cut through the chaff and bluster. But then there are the blinding lies. Such as Julius Malema’s oft-repeated slur of opponents as “agents” or “spies”. He did this years ago to a British journalist, and now (more viciously and with far greater and more frightening consequence) to Karima Brown. No evidence necessary – he just had to say it and his followers believed it.
Now imagine that the slur against Brown was accompanied by a photo placing her at the house of some vicious racist or even in the lobby of a government intelligence office. Simple to do, with the right technology and dark intent. Brown and her supporters would justly cry foul, but the damage to her would be even greater than it is now, and probably ruinous and irrevocable.
Does one doubt that deepfaking will be employed in the service of the useful political lie in South Africa? And as the technology advances, will it not simply become the Orwellian world of metatruths to grease the wheels of power-seeking?
I recently took a course on hacking. It was exhaustive, carefully constructed and well presented. What was astonishing was that the hacking tools were beautifully made – graphical user interfaces, handy guides, examples galore, mostly free of charge. With names like Evilgate and Metasploit and Nexpose and Veil. Anyone with even a light techie background can master them, and I guarantee that they will be able to wreak havoc in almost any corporation or government department in SA. Google and Facebook are pretty well protected, but our proudest public companies and institutions, which mistakenly think that firewalls and passwords protect them, not so much.
So it will be with deepfakes, where similarly easy-to-use tools will soon proliferate. Upload a photo of a person, upload a photo of a crowd. Within seconds that person will be in the crowd, undetectable as fake. Upload a video of a politician talking, tell the system what you want the person to say, and within minutes the doctored video will be on your hard drive. All drag-and-drop stuff. Child’s play.
So is there a light at the end of this dark tunnel? Yes, possibly. It lies in the technology of blockchain, which is simply a database whose entries are verified by multiple parties at the source and cannot be changed by anyone once lodged. One can imagine such technology being used to “certify” photos and videos and audio clips before evildoers have a chance to repurpose them. But it will need to be a global initiative, and we all know how that is going in other fields, such as climate change.
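The tamper-evidence property described above – entries that cannot be changed once lodged – comes from chaining cryptographic hashes, so that altering any old record breaks every link after it. Here is a minimal, illustrative Python sketch of that idea; the `Ledger` class and its names are hypothetical, not any real certification service, and a real blockchain would add distributed verification by multiple parties on top of this:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of a media file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

class Ledger:
    """Toy append-only ledger: each entry's chain hash covers the
    previous entry's chain hash, so altering any lodged record
    invalidates the whole chain from that point on."""

    def __init__(self):
        self.entries = []  # list of (media_hash, chain_hash)

    def certify(self, media: bytes) -> str:
        prev = self.entries[-1][1] if self.entries else "genesis"
        media_hash = fingerprint(media)
        chain_hash = hashlib.sha256((prev + media_hash).encode()).hexdigest()
        self.entries.append((media_hash, chain_hash))
        return chain_hash

    def verify(self) -> bool:
        prev = "genesis"
        for media_hash, chain_hash in self.entries:
            expected = hashlib.sha256((prev + media_hash).encode()).hexdigest()
            if expected != chain_hash:
                return False  # someone changed a lodged entry
            prev = chain_hash
        return True

ledger = Ledger()
ledger.certify(b"original-photo-bytes")
ledger.certify(b"original-video-bytes")
assert ledger.verify()

# Swapping in doctored media after the fact is detectable:
ledger.entries[0] = (fingerprint(b"doctored-photo-bytes"), ledger.entries[0][1])
assert not ledger.verify()
```

A photo or video "certified" this way could later be checked against the ledger: if its fingerprint no longer matches the lodged entry, it has been repurposed since.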
So, until then, be increasingly careful about believing anything you see or hear on the internet, unless it is from an organisation whose legacy engenders trust.
Which, sadly, leaves us sealed in our bubbles and adrift in suspicion of those on the other side. DM