AI – a terrifying story about your face and a company called Clearview
Clearview's AI facial recognition technology has enabled a dystopian nightmare: the company has scraped the internet for 30 billion photos of faces and associated personal information, which it sells to police departments, security services and other government agencies.
While many people have been eyeing ChatGPT and other generative AIs with prickling anxiety, a much scarier story has been unfolding for quite a while, largely outside the headlines. It has to do with a little-known company called Clearview, started in 2017 by Vietnamese-Australian Hoan Ton-That and right-winger Richard Schwartz, and it is the stuff of a dystopian Hollywood film plot.
Whose ending has not yet been written.
Clearview sort of stumbled into a business model when they sought to become the Google of facial image recognition and search. They scraped as much of the public internet as they could, eventually amassing an estimated 30 billion photos of faces, along with as much additional information about each person as they could find: name, age, education, purchasing history, political leanings, names of friends, whatever. So, if Clearview is provided with a random photo of an individual face, the system, trained using AI facial recognition techniques, can immediately match it to other images of the same person, as well as to the associated sundry information, even if the photo was taken in bad light or from a different angle.
(Scraping means that you do not need access to a back-end database; the code simply grabs information that is publicly displayed on a page, as when you load Facebook and search for someone’s name. That information is onscreen, public and can be legally captured.)
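To make the principle concrete: a scraper just parses the same HTML that any visitor's browser receives and pulls out whatever it is looking for, such as image URLs. This is a toy illustration of that idea, not Clearview's actual code; the page and the URLs in it are invented.

```python
from html.parser import HTMLParser

# A made-up public-profile page, standing in for HTML fetched from the open web.
SAMPLE_PAGE = """
<html><body>
  <h1>Jane Doe</h1>
  <img src="https://example.com/photos/jane1.jpg" alt="profile">
  <img src="https://example.com/photos/jane2.jpg" alt="holiday">
</body></html>
"""

class ImageScraper(HTMLParser):
    """Collects every <img src=...> URL encountered in the page."""
    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "img":
            for name, value in attrs:
                if name == "src":
                    self.image_urls.append(value)

scraper = ImageScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.image_urls)
```

Nothing here requires hacking or special access; the only difference between this sketch and an industrial operation is scale, with bots fetching billions of pages instead of one.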
You and I are almost certainly in Clearview’s database.
Where did they scrape their images from? Instagram, Twitter, Venmo, Facebook, Flickr, YouTube and other sources, using bots and other fancy tech. They were very good at it, no hacking required. In 2020 they had 2 billion faces; by 2023 they had 30 billion, greatly helped by the fact that we tend to post our photos everywhere online (think birthday shots, travel pics or profile pictures). Not to mention our friends posting their pictures of us on their sites and pages.
Now consider this.
You are on holiday in China. A street camera snaps your photo (there are hundreds of millions of them in China). The photo is matched against the images in their database, using AI. The system instantaneously matches your face to the face of someone who attended a pro-Uyghur rally, because there is a photograph of that rally, snapped by a stranger and posted to Flickr, and you happened to be in the background. A few minutes later you are picked up by Chinese security and taken in for questioning.
Or you walk into a retail store with in-house cameras. Your image is uploaded and matched with other photos of you, one of which is attached to a melancholy post about your recent divorce. An unctuous salesman immediately materialises at your side with a sympathetic smile, saying — you look kind of down, can I recommend this shiny thing to cheer you up?
Or you are a lawyer involved in a big lawsuit against, say, your local soccer league. You go to see a game on Saturday with your kids, as you always do, and a camera at the entrance snaps your face. Security is notified that a lawyer working on the lawsuit against the league has just entered the stadium. You are accosted by security personnel and escorted out. (A version of this last scenario actually happened.)
Or you apply for a job. The recruiter cross-references your image (picked up from a Google search) to a Venmo transaction contributing to the Democratic Party (Venmo has your profile picture). The recruiter is a MAGA Republican. You never get the interview.
All of this is enabled by AI-fuelled face recognition technologies that long predate ChatGPT and Large Language Models. There is not even any particularly fancy tech required; the founders of Clearview initially used open-source tools to get started, although they have reportedly since vastly improved their face search algorithm, achieving over 98% accuracy. It is also worth remembering that both Google and Facebook have had this technology for years and have chosen not to deploy it (Facebook withdrew its facial recognition feature in 2021).
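For a sense of how such matching works in general (this is not Clearview's algorithm, whose details are proprietary), modern face recognition systems convert each face into a numerical "embedding" vector and then compare vectors by similarity, so that two photos of the same person land close together even under different lighting or angles. Below is a minimal sketch using invented four-dimensional vectors; real systems derive vectors of hundreds of dimensions from a neural network.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 means identical direction, 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical face embeddings; a real system would compute these
# with a neural network from the pixels of each photo.
database = {
    "person_a": [0.9, 0.1, 0.3, 0.2],
    "person_b": [0.1, 0.8, 0.2, 0.7],
}
query = [0.85, 0.15, 0.35, 0.25]  # embedding of the probe photo

# The best match is simply the database entry most similar to the probe.
best_match = max(database, key=lambda name: cosine_similarity(query, database[name]))
print(best_match)  # person_a
```

This is why a blurry holiday snap can still be matched: the comparison happens in embedding space, not pixel by pixel, and nearby vectors mean the same face.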
So who are they selling to? Originally, they wanted to sell to the retail sector and to right-wing political action groups to help them target political enemies (both founders are aligned with the extreme right wing). But Clearview ran into a spot of bother with the law in some places, specifically in a few American states that have biometric protection statutes (like Bipa in Illinois).
Their market is now supposedly only police departments, security and other government agencies. However, Clearview is a private company and it is unclear who else they may be selling to. It is certainly true that the technology has been a boon to criminal justice systems (thousands of police departments have bought it) with multiple convictions directly tied to the use of Clearview, including child abuse cases.
Sadly, it doesn’t take a great deal of imagination to see how this new AI development can also be abused, particularly in those places where the state has little respect for the privacy of its citizens. And Clearview has been happily selling their technology to such places.
Turn the other cheek?
Here’s the horror-movie endgame of this tech. You will not be able to walk anywhere in public (a park, a restaurant, a beach, a shop) without risking the possibility that any random stranger could snap your face with their phone and immediately find out everything about you, dating back decades. If this AI technology remains unconstrained and unregulated, your present and your past may become public property.
Never mind the awful things that may happen at some future date with ChatGPT or Grok or Bard or LLMs or generative AI. This is happening now. And if this potential violation of your privacy doesn’t chill your bones, nothing will.
(For those interested in the complete and discomforting story of Clearview, I recommend the new book Your Face Belongs to Us, by journalist Kashmir Hill, who also discusses it in an interview with The Verge.) DM
Steven Boykey Sidley is a professor of practice at JBS, University of Johannesburg. His new book It’s Mine: How the Crypto Industry is Redefining Ownership is published by Maverick451 in SA and Legend Times Group in UK/EU, available now.