South Africa’s Draft National AI Policy was not withdrawn for the right reason. The citation scandal that forced its removal was real, involving AI-generated references that cannot be verified, articles attributed to real journals that do not exist, and at least one journal that appears never to have been published. That was a production failure. The real risk is structural. The policy reproduced three conditions that made Zondo-era State Capture possible. Those conditions were present before the citations were discovered. They remain present now. They are fixable, but only if the replacement is designed to fix them.
The playbook already runs
The Gupta family did not need a coup. They captured procurement, placed loyalists in key positions, and neutered the oversight mechanisms that should have stopped them. The Zondo Commission documented the method: the captors “appointed willing collaborators in all key positions, hobbled law enforcement, weakened parliamentary oversight, and captured parts of the independent media”.
Nearly 97% of the R57-billion in tainted state spending flowed through Transnet and Eskom. The constitutional architecture held. Imagine the same playbook run through AI systems embedded across the SA Revenue Service, SA Police Service, Home Affairs, and the intelligence community.
This is not fiction. In April 2025, researchers at Forethought, a governance research organisation, identified three conditions that together enable AI-assisted power seizure:
- An AI workforce loyal to a leader rather than an institution;
- AI systems that pass safety testing but execute different objectives when deployed; and
- A monopoly over AI capabilities across strategic planning, cyber operations, and critical state functions.
Researchers describe this as the coup-enabling triad.
South Africa’s draft policy reproduces all three.
Not yet in full. These conditions are emergent, not complete. The time to design governance against them is before they are complete, not after.
The procurement gap
The policy is silent on the origin of government AI systems and on whether a small group of providers might monopolise critical state functions.
Zondo-era State Capture required human networks at every node: loyalists on boards, enablers in office, accomplices in the executive suite. AI changes the arithmetic. Once a single provider’s systems are embedded across law enforcement, revenue collection, border control and national security, whoever controls that provider gains leverage across all of them without corrupting each one separately. Capture once needed a network of people; AI narrows it to whoever controls the systems.
This is not efficiency. It is a concentration of control.
The entry economics reinforce the trap. Initial costs are low; exit costs are prohibitive. By the time the political consequences become visible, the commercial reality makes change impossible. Procuring AI for critical state functions from multiple independent providers is not a commercial preference. It is a democracy-preservation requirement. South Africa can require it through a Government AI Procurement Policy. It needs no new legislation. It takes effect immediately.
The National Treasury has simultaneously published Draft General Public Procurement Regulations, the legal instrument governing all government AI procurement, for public comment until 15 June. Those regulations contain no AI-specific provisions. Regulation 14 already permits single-vendor AI contracts without minimum terms, without data sovereignty requirements, without technology transfer conditions. The AI policy cannot close a gap that the procurement regulations already permit. Two processes are running in parallel with no coordination mechanism. Closing both requires submissions to two different departments, five days apart.
The oversight gap
The policy creates an ethics board, then strips it of power.
The board’s relationship to the Department of Communications and Digital Technologies (DCDT) “would define whether it is a Schedule 3 Public Entity (NPC under government influence) or a more independent NPC with partial public funding” (Section 4.6(b), page 25). That is a deferred decision, not a governance design.
An ethics board without enforceable authority does not constrain AI deployment. It certifies it.
In an AI-enabled state, an oversight body that the executive can defund when inconvenient is not neutral; it is the mechanism that gives dangerous deployments the appearance of scrutiny they have not received.
Hungary built a facade of independence: its AI oversight body, established in 2025, is a “captured council”, advisory-only and housed within an executive-controlled ministry.
Brazil built fortified independence: its judiciary moved from principles to prescription, establishing a system of checks and balances that includes mandatory reporting, multi-institutional governance (including the Bar Association and the Public Prosecutor’s Office) and enforceable oversight across the AI lifecycle. The difference between them is a founding instrument.
The structural language South Africa’s final version must include is not bureaucratic detail: majority civil-society membership on the ethics board, no more than 25% government seats, and removal only by a two-thirds board majority. This is the difference between a load-bearing wall and a partition that collapses under stress.
The citation controversy makes this concrete. The drafting process used AI tools to generate references that cannot be verified. This is the same category of unverified AI output the policy is supposed to govern. A government that cannot maintain verification standards in its own policy documents cannot require them of others. The oversight body that the policy proposes needs a founding instrument that makes verification mandatory, including verification of the DCDT’s own work.
The visibility gap
You cannot audit what you do not know exists.
The policy proposed a National AI Commission with a mandate, but gave it no mechanism to know which AI systems are being trained or deployed at scale. A compute reporting threshold, a rule requiring organisations to notify the commission before commencing large-scale AI training, is absent. Without it, the commission cannot assess what systems are being built, cannot detect the accumulation of capability that precedes the exclusive-access condition, and cannot act before deployment makes intervention costly.
The chemical weapons non-proliferation regime did not work by auditing stockpiles after the fact. It worked by controlling precursor supply chains: governance at the point of capability creation, before the weapon is built. A compute reporting threshold applies the same logic to AI: it requires notification before a large-scale training run begins, not after the system is deployed and impossible to unwind. This is governance at the moment when it is still possible. No democracy has applied this requirement to its own government departments. South Africa should be the first.
South Africa’s National Policy Development Framework, approved by the Cabinet in December 2020, predates the generative AI wave. It contains no treatment of algorithmic decision-making, no guidance on AI in policy drafting, and no mechanism for monitoring model drift.
Across the South African government, departments are already deploying AI tools through commercial vendors, without coordinated standards, and without the institutional capacity to audit or exit those systems on equal terms. The toeslagenaffaire (childcare benefits scandal) brought down a Dutch Cabinet. Michigan’s MiDAS system falsely accused 40,000 workers of fraud with a 93% error rate. Neither required sophisticated AI. Both required only automation deployed at scale on top of policy machinery that was not equipped to govern it. Closing this gap does not require replacing the 2020 Framework. It requires updating it.
The window
South Africa is not Hungary or Georgia. Hungary’s democratic backsliding is government-directed. Georgia’s institutional capture is active. South Africa’s post-Zondo trajectory is the opposite: institutional recovery, a coalition government that broke the ANC’s parliamentary majority, a reinvigorated Constitutional Court, a civil society that proved it could expose and resist capture.
These conditions make sound governance architecture possible now. Once AI systems are embedded in state functions, the governance choices made at the point of embedding become self-entrenching. In Hungary, AI governance was formalised after the judiciary and media were already captured. In Romania, oversight authorities were designated on paper but never made operationally functional. The lesson for South Africa is not that we are on their trajectory. It is that structural preconditions for effective AI governance are most effectively converted into actual governance when they are intact, not when they are under pressure.
None of this is unfixable
One procurement gap. One unempowered oversight body. One AI commission that cannot see the systems it is meant to govern. Each is documented in the policy text. Each is fixable before the replacement goes to Cabinet.
These decisions will be locked in at the point of deployment. The infrastructure is already being built. Every day without a framework shifts negotiating power to counterparties already transacting without conditions.
South Africa’s constitutional tradition gives this moment particular weight. We built independent institutions (the Public Protector, the Auditor-General, the National Prosecuting Authority) because we understood from experience that institutional independence is not a procedural formality. It is the bedrock of accountability. We then watched those institutions come under pressure, some buckle, and the democratic order strain under the weight of what followed.
The governance we design for AI is the governance we will live with.
Procurement regulation submissions close on 15 June 2026. That is the only open intervention point with legal consequence.
The policy has been withdrawn. The architecture it would have governed is still being built. The question is no longer whether South Africa will govern AI. It is whether it will do so before those systems become too entrenched to govern on South Africa’s terms rather than their developers’. DM
