The delegates are impressed. They’ve travelled to Shanghai, China, to learn how to improve South Africa’s policing. As civilian members of the parliamentary portfolio committee on police, they are responsible for oversight of the South African Police Service. It’s October 2017, and they are on a study tour to exchange knowledge with their Chinese counterparts.
As part of the tour, they find themselves in the Shanghai Municipal Public Security Bureau’s 24-hour Command Centre, taking a look at the city’s high-tech CCTV camera system. Chief of the Command Centre, Mr Chen Zen, tells them that Shanghai’s 24 million citizens are being watched by 31,000 cameras, which serve to assist its 50,000 police officers. And, he says, since the system’s been installed, there’s been compelling evidence that surveillance has become a major deterrent of crime, making it easy to prevent, detect and prosecute criminal behaviour. This, in turn, has led to a significant decrease in violent crime, as well as a drop in reckless driving.
The SA delegates are in agreement: for a long time, they tell the Chinese, they’ve been urging SAPS to invest heavily in technology to fight crime.
But China is more than a few steps ahead of SA when it comes to camera surveillance. In the city of Shenzhen, about 1,500km to the southwest of Shanghai, traffic police are using facial recognition technology to identify jaywalkers. Facial recognition is a biometric measurement, much like a fingerprint or an iris scan. Software can compare a picture of a person’s face, taken by a CCTV camera, to mug shots of millions of people in a database. That should allow you to identify an individual if their data is on the database – theoretically.
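In practice, that comparison is usually done not on raw pixels but on a compact numerical “template” extracted from each face, with a similarity score judged against a threshold. The sketch below is a minimal, hypothetical illustration of the matching step only – the names, templates and threshold are invented, and real systems derive templates with trained neural networks:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (lists of numbers): 1.0 is
    identical direction, 0.0 is unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.8):
    """Return the name of the closest enrolled face, or None if no
    similarity clears the threshold (i.e. the person is not 'known')."""
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of enrolled templates (purely illustrative)
db = {"alice": [1.0, 0.0], "bob": [0.0, 1.0]}
```

The “theoretically” in the text lives in that threshold: set it too low and strangers match someone on the database; set it too high and genuine targets go unrecognised.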
The Shenzhen traffic department has installed a system that takes a photograph of the face of anyone who crosses a zebra crossing when the little man is not green. If you are in the city for more than 30 days, you must be registered with the traffic authorities. This means that it is compulsory to provide them with your identifying information and a photograph of your face. This is stored in a database.
If the jaywalker’s face matches a face in the database, an offence is registered against that person’s name. The system keeps count of an individual’s illegal crossings. To deter the behaviour, a blurred photo of the jaywalker, their surname and partial ID number, appear on a large LED screen at the intersection where the transgression occurred. Jaywalkers can then be fined.
The Shenzhen traffic authorities have also set up a website on which jaywalkers are thus named and shamed. The company that provides the technology to the police, Intellifusion, is also working towards setting up a system whereby the jaywalker will be notified of the offence via text or a mobile app.
Back in South Africa, and almost one year later, SAPS is yet to invest heavily in advanced CCTV camera systems as suggested by the parliamentary committee.
But the same cannot be said for South Africa’s major cities. The last 20 years have seen the establishment and growth of CCTV surveillance systems in various large cities and towns, including – but not limited to – Johannesburg, Pretoria, Cape Town, Nelson Mandela Bay, and eThekwini.
One of the factors behind the proliferation of CCTV systems is a booming local private security industry. Every year, the South African journal, Hi-Tech Security Solutions, publishes a 100+ page handbook on CCTV surveillance, detailing the latest products available to government and the private sector. Retailers and manufacturers from around the world vie for lucrative contracts. (The City of Joburg, for instance, paid R50-million to have its CCTV system installed in 2008. The City said earlier this year that it costs about R1-million to maintain the system monthly, although industry sources put the true figure at R2-million to R3-million a month.)
Private companies push several product features, apart from facial recognition. For instance, you can purchase a camera with thermal capabilities; it can sense heat, be it from a living thing or a warm car engine. This allows you to track movement and find objects even if it is completely dark. Other examples include movement prediction and body language analysis. The ultimate aim of all this is to prevent crime by predicting what might happen: if, for instance, the camera “recognises” a certain person moving past the same point more often than normal, displaying certain behaviours or body movements, the system alerts the operator, who in turn can then notify a police response team to attend to the scene even before a crime is committed.
One entity that has bought into this hi-tech approach is the City of Johannesburg. In July 2018, the City reportedly embarked on the first phase of a massive upgrade to the current CCTV system, which consists of 450 cameras in its crime-ridden CBD and close surrounds. The new cameras to be added to the network are said to be capable of facial recognition and motion prediction. The first phase would see 50 cameras installed, taking the City’s total tally of cameras to 500. The installation is to be outsourced to a third party by the Metropolitan Trading Company, which manages the project on behalf of the Johannesburg Metropolitan Police Department.
The new cameras will be able to complete a 360-degree rotation, and “cover a distance of up to a kilometre with consistency and clear visuals”, according to Sipho Phakathi, the MTC’s field service technician in charge of the City’s “intelligence Video Analytic Platform Project”.
BusinessTech reported in July 2018 that Phakathi said the “highly advanced” cameras would be set up in “high-crime hotspots” with high concentrations of traffic and people. Phakathi said facial recognition capabilities would allow for the camera to register when someone moves through the same spot one too many times. The camera will then “recognise” the pattern and alert the control room.
But the technology could lead to the targeting of thousands of innocent people, and false arrests.
To understand how this could happen, Daily Maverick spoke to three industry experts, all of whom wished to remain anonymous. Between the three of them, they have over four decades of experience in the business, and share an intimate knowledge of urban and state visual surveillance in South Africa.
Facial recognition, explains the first expert, is far from infallible, and will have to undergo considerable further development before it can be utilised to pick a face out of a big, bustling crowd with 100% accuracy.
“We’ve tested extensively. At no point can we pick up a person, in the street, which is a known criminal. The database (of faces) needs to be built.”
But it’s not just any database. Our source explains that much more than a mug shot is required; you would need to take photographs from various angles because you won’t necessarily be facing the camera when you are walking down the street, window shopping, or working your way through a busy crowd. The angle is not the only variable that the software will have to take into account; lighting is crucial. Simply put, if it’s too dark, the camera will either fail to register you, or mistake you for someone else. The result could be a false arrest. You could also end up on a database of suspicious persons.
Asked if he thought that Joburg’s facial recognition software would be accurate, our second industry insider responded: “I highly doubt it.”
He explained that the camera would have to be at street level (as is the case in China), because once a person is five to eight metres away, the recording quality is too poor for effective recognition. From a distance of 500m to 600m, the resolution simply becomes too low, and there is too much turbulence. Turbulence occurs because a camera is never completely still: the farther the focal point, the more any slight movement of the camera affects the clarity of the picture.
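The distance limit comes down to simple geometry: the farther away the subject, the fewer pixels their face occupies on the sensor. A rough back-of-the-envelope calculation, using hypothetical camera parameters (a 1920-pixel sensor and a 60-degree field of view are assumptions, not figures from any of the sources):

```python
import math

def pixels_on_face(horizontal_pixels, fov_degrees, distance_m, face_width_m=0.16):
    """Rough estimate of how many pixels a face spans horizontally.
    The scene width at the target distance follows from the field of
    view; the face occupies its proportional share of that width."""
    scene_width = 2 * distance_m * math.tan(math.radians(fov_degrees) / 2)
    return horizontal_pixels * face_width_m / scene_width
```

With these assumed numbers, a face at 8m spans roughly 30 pixels, while at 500m it covers less than a single pixel – consistent with the experts’ point that street-level placement is essential.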
A third source, who works for a major security company, told Daily Maverick: “It’s not that easy. You need a really good database (of faces). But we’ve never seen that it works.”
The only time when facial recognition is relatively accurate, says the first source, is within a controlled environment. In a mine, for instance, when a worker needs to access a certain area, the person would face the camera squarely, within one metre, keeping still, and the lighting conditions would be suitable.
The technology could have very negative consequences for the public.
In Great Britain, facial recognition technology has led to false arrests and the targeting of thousands of innocent people. In May 2018, British lobby group Big Brother Watch obtained information about the use of facial recognition from 35 UK police departments by making lawful requests for information. (In SA, such requests can be made through the Promotion of Access to Information Act, which allows South Africans to request information from a state body or private entity in order to exercise or protect constitutional rights.)
According to the report, on average, automated facial recognition wrongly identified innocent people in 95% of cases. In the case of the South Wales Police, accuracy stood at 9%. In total, the South Wales Police stored biometric photos of 2,451 people wrongly identified as persons of interest or criminals, and 31 people who had been incorrectly flagged as criminals had to prove their real identity to police.
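The 95% figure is less surprising than it sounds. When genuine targets are rare in a scanned crowd, even a small false-positive rate means false alerts swamp true ones. A short illustration of the arithmetic, with hypothetical numbers rather than Big Brother Watch’s actual data:

```python
def false_match_share(crowd_size, watchlist_hits, false_positive_rate):
    """Among all alerts raised, what fraction are false matches?
    Optimistically assumes every genuine target present is caught."""
    innocents = crowd_size - watchlist_hits
    false_alerts = innocents * false_positive_rate
    true_alerts = watchlist_hits
    return false_alerts / (false_alerts + true_alerts)

# Hypothetical: 100,000 faces scanned, 5 genuine watchlist matches in the
# crowd, and a false-positive rate of just 0.1% - yet roughly 95% of all
# alerts are still wrong, because innocents vastly outnumber targets.
share = false_match_share(100_000, 5, 0.001)
```

This base-rate effect is why a system that sounds accurate on paper can still mostly flag innocent people in practice.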
Apart from lacking accuracy, facial recognition software is also less likely to recognise you if your skin is darker. In other words, it’s racially biased. In July this year, the American Civil Liberties Union warned that this could exacerbate racial discrimination. The organisation reported on a test they had conducted with Amazon’s facial recognition tool, Rekognition. At least one police department in the United States had started using Rekognition at the time of the ACLU report, according to the organisation.
The ACLU test used the software to compare photographs of the faces of United States Congress members against 25,000 mug shots of people arrested for a crime. The software mistakenly identified 28 members of Congress as people who had been arrested; in other words, it could not tell the difference between those 28 Congress members’ faces and those of arrestees. And although the misidentified members included men and women of all ages and various racial backgrounds, the errors skewed along racial lines.
“Nearly 40 percent of Rekognition’s false matches in our test were of people of colour, even though they make up only 20 percent of Congress,” the report stated.
One explanation for this bias is that facial recognition technology was primarily developed by and tested on white males in the US, the UK, and Germany. Following the ACLU report, Google’s director of cloud computing, Diane Greene, told the BBC that the technology has “inherent biases” and requires more “diversity” to be accurate.
But, even if facial recognition were 100% accurate, it could become a very dangerous double-edged sword, with the side cutting down privacy and freedom being considerably sharper.
In 2017, Stanford University researchers succeeded in using facial detection technology to determine the sexual orientation of people from the pictures they posted on a dating site. From over 35,000 images, the technology was able to accurately distinguish between gay and straight men 81% of the time. For women, the figure was 74%. The researchers warned that since companies and governments are increasingly using this type of technology to “detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women”.
In 2016, researchers from McMaster University in Canada and Shanghai Jiao Tong University in China said that their facial recognition system could distinguish between the faces of convicted criminals and innocent people, based on their facial features. Their methodology came under fire from their peers, but their research marks a step back towards the old and highly discriminatory practice of physiognomy – studying someone’s physical and facial traits to determine their character.
And this point brings the conversation back to the new technology that Johannesburg is introducing. Because at the heart of it lies the concept of predictive policing – the use of data analysis to predict whether or not someone will commit a crime, and then taking precautions to prevent that crime.
Apart from facial recognition, pattern analysis of any sort of movement is aimed at such prediction. It is possible to use the CCTV system to detect when there is suspicious movement. For instance, people suddenly flocking to a street corner could indicate that a fight is breaking out there. The system can be programmed to alert the operator whenever a large crowd gathers in what is usually a walkway. The operator could then relay the message that police officers should be deployed to that spot to prevent any calamities.
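Stripped of the marketing language, such a crowd alert reduces to a threshold on headcounts per monitored zone. A hypothetical sketch (the zone names, sighting format and threshold are all invented for illustration):

```python
from collections import Counter

def crowd_alerts(detections, threshold=20):
    """detections: list of (zone_id, person_id) sightings in the current
    time window. Returns the zones whose headcount meets or exceeds the
    threshold - i.e. where the operator should be alerted."""
    counts = Counter(zone for zone, _ in detections)
    return sorted(zone for zone, n in counts.items() if n >= threshold)

# 25 people clustered on one corner, 3 on a normally busy walkway
detections = [("corner_a", i) for i in range(25)] + \
             [("walkway_b", i) for i in range(3)]
```

The hard part, as the sources note below, is not the counting but choosing a threshold that separates a fight from an ordinary South African street corner.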
Similarly, there is gait analytics – measuring the tempo and length of a person’s stride to predict what they may do. Programming software to identify someone who is running, or who is walking through the area at an unusual pace, could signal a suspicious person, and again one could deploy police following an alert from the system.
But again, in South Africa, this type of technology won’t work, according to our industry sources. South Africa’s traffic and pedestrian flow are simply too irregular and unpredictable. People could gather on a street corner at any moment for any reason, or similarly break out into a jog. Jaywalking is practically a national pastime. As for gait analytics, says the second industry source, it still has a while to go before it can be used in a public setting.
Which brings one to the human factor. To prevent the type of bias and inaccuracy that comes with hi-tech CCTV solutions, human operators require proper training.
Daily Maverick spoke to a behavioural specialist to get some insight. Essentially, security personnel can be trained to pick up on crime indicators. These are behavioural aspects and body language that could be signs that a crime has the potential to occur, or is in progress.
“If we have a look at biases and indicators of behaviour – they shouldn’t coincide. You’re looking for specific behaviours. For instance, somebody who is about to commit a crime will usually look around differently from the way a normal person would.”
The trick is to “benchmark” (for a specific area or within a specific context) the behaviours which are routine. This can then be objectively compared to behaviours that are out of place.
Up to now, Johannesburg has relied on human operators to conduct surveillance. However, according to two industry sources, Johannesburg has recently lost its experienced and trained operators. In 2017, the City did not renew its contract with Omega Risk Solutions – the company that had run Joburg’s operations since the CCTV system’s inception in 2008. A source who asked to remain anonymous because he has direct ties with the City said that the operators who were relieved of their duties had extensive training, and most had seven to nine years of experience.
It’s unclear what to expect from Johannesburg’s new surveillance cameras. However, the consequences of employing a CCTV system using questionable technology, combined with an operations room that has lost decades of experience in one fell swoop, could prove severe.
Daily Maverick sent the City of Johannesburg detailed questions regarding the cost of the new systems, the problems inherent to facial recognition technology, the safeguards they would have in place to ensure footage and data was not leaked from their operations room, as well as an inquiry about complaints from the public that the current system is not functioning properly.
This was the response from Wayne Minnaar, Johannesburg Metro Police Senior Superintendent:
“The CCTV contributes significantly towards the safety in the Johannesburg CBD, therefore detail of its functioning is confidential and information may not be divulged for security reasons.” DM
Heidi Swart is a journalist who has extensively investigated surveillance and intelligence services in South Africa. This story was commissioned by the Media Policy and Democracy Project, an initiative of the University of Johannesburg’s department of journalism, film and TV and Unisa’s department of communication science.