Transparency and accountability are fundamental for the use of algorithms in the public sector


Gabriella Razzano is the Executive Director of OpenUp and an Atlantic Senior Fellow in Social and Economic Equity. She also acts as a legal consultant and researcher focused on issues of transparency, open data, technology and the law.

The use of algorithms in the African public sector is not as uncommon as you might think. What must be realised, in order to ensure a happy future for public sector algorithm use, is that algorithmic decision-making is not in fact neutral.

A recent Daily Maverick article covered the use of algorithms by the South African Revenue Service (SARS) in what it romantically calls its “risk engine”, but such use occurs across departments and even tiers of government.

My ongoing research with colleagues has examined the use of algorithms in Nigeria under the National Cash Transfer Programme, in Kenya’s Affordable Housing Programme and even in South Africa’s Drakenstein Municipality as a mechanism for managing its indigent register. This work has been enlightening.

SARS is clearly enamoured with its systems.

There are immense opportunities in the considered use of algorithms in the public sector for driving social development through greater efficiencies and cost savings, and even, indirectly, by spurring innovation in the public service and its data practices.

Sometimes it is the act of automation itself – moving the decision even just a bit further from a human decision-maker – that is in fact the benefit. As SARS praised: “Our risk engine is agnostic to who the taxpayer is.” And in Drakenstein, an implementer of the technology noted that moving the decision a little further from the politics was in fact a boon.

Yet what must be realised, in order to ensure a happy future for public sector algorithm use, is that algorithmic decision-making is not in fact neutral – and may not be for a number of reasons.

There are biases that can be embedded in the formulation of the algorithms themselves. More importantly in the South African context, there are biases – or even just dramatic inaccuracies – in the underlying data that is run through these algorithms.

On 8 March we celebrated #BreakTheBias, and there has been tremendous recent work highlighting how gender bias in actual datasets has negatively affected not just the design of technology, but even the design of policy.

SARS and other public sector algorithm users are using data as the underlying component – but as the famous adage in technology reminds us: “Garbage in, garbage out.” Maintaining data quality must be a focus for the public sector (something the emerging data protection framework from the Protection of Personal Information Act (Popia) will only help with in part).

It would be interesting, too, to better understand the exact nature of the algorithms being used. Yet, famously, such algorithms are frequently described as “black box” technologies, whose inner workings we are unable to unpack or peer into. In the context of the public service in particular, though, this “reality” is simply unacceptable.

Just like the myth of the neutral algorithm, the myth of the impenetrable algorithm can be rebutted with a simple requirement: transparency by design. The public sector must be accountable in its actions, even when those actions are performed by a machine on its behalf.

But why does accountability matter so much? Because the risk of exclusion from important public services and assets is just so very great. And the specific vulnerabilities of those excluded from, or included in, the benefits of the algorithmic decision-making systems we explored in Kenya, Nigeria and South Africa were pronounced. We all engage with the public sector – with all our vulnerabilities.

What mechanisms will SARS and other public institutions put in place to facilitate transparency? The simple truth is that, if we are expected to engage with public sector technologies, we need to trust them. Engendering trust in a system requires access to information and data about that system: about the political and economic dynamics involved (for example, procurement), about the outcomes and processes it uses, and even about the data (and its quality) which underpin it.

This is why data subject rights, like those in Popia, are an important component of creating a trusted environment for the implementation of algorithmic decision-making technologies.

So, somewhat boringly perhaps, it comes down to the fundamentals of good governance: more transparency, more accountability. 

From there we move to better data governance practices, and to improving the quality of public data. And then we can begin to think about more specific issues of algorithmic and, ultimately, artificial intelligence governance.

SARS sees the opportunity present in algorithms and technology – but public actors have some big questions to answer about their practices, and they should be answering them as they begin to engage ever more creatively with technology’s potential for advancing the public service. DM
