Crime: an unproudly South African scourge. Who doesn’t want to walk around our streets without fear of being mugged or worse?
‘Smart’ policing has become the next big thing in street-crime fighting. This policing model uses information and communication technologies (ICTs) to enrich policing efforts. The South African Police Service (SAPS), some metropolitan municipalities and private companies are teaming up to use data-driven technologies such as closed-circuit television (CCTV) cameras in the fight against crime.
Increasingly, these cameras are being loaded with ‘smart’ capabilities such as Automatic Number Plate Recognition (ANPR) and facial recognition software.
ANPR allows the police to match vehicle number plates picked up on CCTV to a vehicle’s owner if the vehicle has been used in a crime. Facial recognition allows them to identify a person from a facial database if he or she has been up to no good in public spaces. These sound like excellent crime-fighting tools. Or not.
Not only are their impacts on crime levels unclear – raising public policy questions about costs versus benefits – but they can lead to unregulated state surveillance, and even political surveillance.
In fact, CCTV the world over has been so disappointingly bad at bringing down crime levels that UK academic Clive Norris has even referred to its continued spread as ‘the success of failure’.
Everyone is familiar with stories of CCTV successes, and the media tend to focus on these successes. But academics who have moved beyond the anecdotes and conducted impact assessments have presented a very different picture. Largely, research points to CCTV cameras not being very effective in bringing down crime.
For instance, some academics have found that CCTV systems displace crime rather than deter it. Where CCTV has contributed to crime fighting, the benefits have been localised and often not statistically significant: cameras are most effective at bringing down crime in confined spaces, such as parking lots.
For instance, a Civilian Secretariat for Police study of CCTV usage in the Gauteng town of Benoni found that crime was being displaced to parts of the city that were not under surveillance. Furthermore, the admissibility of CCTV footage into evidence remained unclear.
Take the case of one of the most CCTV-saturated cities in the world, namely London. There, CCTV has not contributed significantly to prosecutions, which has reduced its usefulness as an investigatory tool.
In fact, in 2016 Westminster Council announced its intention to decommission its CCTV camera network, arguing that the costs far outweighed the benefits, despite the fact that the borough is a tourism magnet.
South Africa does not have a tradition of conducting impact assessments of CCTV, and the existing academic studies tend to focus on policy and legislative issues. This means that the public is forced to rely on the state’s claims about effectiveness.
However, state agencies typically talk up the positive impacts for public relations purposes, reflecting a technologically determinist, quasi-religious faith in CCTV to solve social problems. Their faith is hardly surprising: rolling out CCTV cameras makes them look like they are doing something about crime, even if they aren’t.
But sometimes the state’s own information undermines its claims. In the case of Khayelitsha in Cape Town, the police made use of CCTV evidence a mere five times over a ten-year period. In 2015, the police were criticised again for making only 107 arrests after 2,640 criminal incidents were caught on camera in the broader Cape Town area. These statistics suggest that the under-utilisation of CCTV was not confined to Khayelitsha, and that policing needed more basic ‘fixes’.
Needless to say, other kinds of crime are not recorded by street cameras: white-collar crime like corruption, for example, and domestic crime. In other words, investment in CCTV networks helps to perpetuate the stereotype that the most serious crimes are street crimes perpetrated by strangers (often poor and working-class ones at that).
At the same time, the state has underinvested in ‘smart’ technologies that could detect the crimes of the wealthy. In fact, the state has run down one of the most powerful surveillance tools to fight organised crime, namely targeted lawful communications interception.
There is a consistent trend around the world of smart policing technologies overpromising on their contributions to crime fighting and under-delivering. This is because their uptake is often producer-driven rather than consumer-led.
In other words, technology companies oversell their products, and police agencies and local governments are receptive customers looking for silver bullets for crime problems. In the absence of any public counter-discourse, these agencies tend to fall for the sales pitch.
Researchers at Cambridge University developed ANPR as an anti-terror tool. Yet even in its home country, its usefulness in crime fighting has been questioned when relied on excessively and exclusively. Initially, its optical character recognition (OCR) capabilities were unreliable, leading to the misidentification of vehicles, but the technology has gradually improved, reducing the danger of false matches.
But ANPR has not been restricted to crime fighting; it has been used for political surveillance purposes, too. In 2009 in the UK, ANPR was used to identify a vehicle at a peaceful protest, resulting in its occupants being profiled as domestic extremists and stopped by police. This incident led to UK technology writer James Bridle referring to ANPR as the country’s ‘next generation surveillance export’.
Facial recognition software has also become controversial in that it has used personal data gathered for one purpose for another purpose, and not necessarily with the data subject’s consent (a basic requirement of data protection law).
For instance, in 2011, the British Columbia Privacy Commissioner criticised the Insurance Corporation of British Columbia and the police for using the Corporation’s driver’s licence database to identify people involved in a riot at a hockey match. People who had provided their information to the Corporation for one purpose had not agreed to it being used for another.
In fact, this is one of the most criticised features of facial recognition: if it is confined to matches using criminal ‘mugshot’ databases, then fewer privacy issues may arise. But the fact that databases of ordinary, innocent civilians may be searched raises serious questions about their privacy rights. In effect, innocent citizens are being subjected to what academics at Georgetown Law’s Center on Privacy & Technology have described as a virtual, perpetual line-up.
Facial recognition carries with it the risks of false positives and false negatives: innocent people may be identified falsely, while guilty people may not be identified at all. A police study has also suggested that the technology may have in-built racial and age biases. This is because the algorithms used for facial recognition have more difficulty identifying black and younger people accurately.
These problems increase the potential for falsely accusing people of a crime based on race or age, leading to racial or age profiling. If state agencies are acquiring this software for law enforcement purposes, then they should include accuracy as a key requirement in any tender documents.
Most controversially, some countries (such as Australia) are exploring using facial recognition in CCTV cameras in public spaces, to scan and identify ordinary passers-by in real time. This is a truly frightening prospect, as it means that pedestrians can have no expectation of privacy in public spaces whatsoever.
‘Smart’ CCTV can be used to violate a person’s locational privacy, or the right of a person to move about freely, without having his or her movements tracked.
This right is becoming increasingly important, as information about a person’s movements can reveal a great deal about his or her personal, social and political activities. Governments of a more authoritarian bent can misuse this information to establish a person’s movements, political involvement and associations.
Your movements constitute personal information, which you have a right to exercise control over in terms of the Protection of Personal Information Act (POPI). However, the police, spy agencies and private security companies are likely to argue that the Act does not cover criminal justice and national security issues.
But they are wrong. The Act applies to these areas if it can be shown that existing privacy protections are inadequate, and smart CCTV roll-out is practically unregulated at the moment.
Consider these two photographs. I took the first in Montreal and the second in Melville, Johannesburg. Spot the difference.
Privacy regulators generally insist that public CCTV cameras are signposted, whether they are operated by state or private actors, and that these signs include information about the data controller responsible for operating them and its contact details. This allows people to contact the controller to enforce their privacy rights if needed. In central London, signs warn people that ANPR is in operation. Yet in South Africa, signage on CCTV cameras is inadequate to non-existent.
But here’s the thing. The privacy regulator provided for in POPI is still being set up, which means that, for the spy agencies, it is open season on our personal data.
Surely, you may argue, people cannot have a reasonable expectation of privacy when they enter public spaces? Wrong. This often-heard argument misses the point that public surveillance devices are becoming cheaper, more widespread and more invasive.
Gone are the days when you needed an expensive helicopter to track a protest; now you can just send up a drone. These new capabilities increase the potential for them to be used for anti-democratic surveillance purposes.
Cape Town and Johannesburg have been rolling out smart CCTV cameras fitted with ANPR, and Johannesburg has also expressed interest in incorporating facial recognition into its camera systems.
Yet in Cape Town, CCTV roll-out has tended to ‘follow the money’, with a more well-resourced system in the CBD contributing to gentrification, while the townships remain sparsely covered by CCTV cameras. ANPR has been used successfully to track down drivers with outstanding warrants, though.
In the case of Johannesburg, I learnt during my research that the city intended to establish a facial database for facial recognition purposes, but there was no indication that the database would be confined to criminal suspects. The City could not provide any CCTV policy covering, for instance, data retention, minimisation and destruction issues.
Neither city appeared to have conducted privacy impact assessments of their CCTV plans. Signage did not meet basic data protection standards, as though keeping CCTV cameras and their capabilities secret was a virtue. Johannesburg was also thinking of tapping into the police’s database of facial photographs for facial recognition purposes.
The Department of Home Affairs is also planning to augment its biometric ‘smart’ ID card system by collecting iris prints and facial photographs. This means that, in time to come, it will have a massive stash of everyone’s biometric data, and it has ambitions to make this the government database of choice to provide a single view of the citizen. Already, the police have live, real-time access to this database. Rightfully, there should be a policy that forbids them from accessing people’s biometrics for the purposes of political profiling.
One thing is clear. The state’s technological innovations have run far, far ahead of the policy, and the resulting dangers for democracy are both real and present. Ramaphosa needs to ensure that the Information Regulator that is meant to look into all of this, is independent and well-resourced. He needs to ensure that it starts its work as a matter of urgency. It cannot be allowed to become another patchy champion of our privacy rights, like the Inspector General of Intelligence.
The public needs to stay ‘woke’ on the new forms of state surveillance. Otherwise the next criminal suspect caught on camera and falsely accused of a crime while coming home from a protest might be you. DM
Jane Duncan is a professor in the Department of Journalism, Film and Television at the University of Johannesburg. Her new book is called ‘Stopping the spies: constructing and resisting the surveillance state in South Africa’ (forthcoming from Wits University Press).