

This article is an Opinion, which presents the writer’s personal point of view. The views expressed are those of the author/authors and do not necessarily represent the views of Daily Maverick.

One hand gives justice and the other takes it away — AI in the legal system

In the South African justice system, AI can both empower unrepresented people and challenge legal integrity by generating fabricated cases. Courts face a dilemma balancing access and professionalism.

Artificial intelligence (AI) is doing two opposite things to the South African justice system at the same time.

In one courtroom, it is advancing access to justice by helping an ordinary person (who cannot afford a lawyer) to make an argument good enough to win (this happened in Makunga v Barlequins Beleggings (Pty) Ltd t/a Indigo Spur (WCC) (unreported case no 19733/2017, 1 December 2023) (Bishop AJ)).

In another, it is threatening the administration of justice by generating fictional court cases that lawyers are placing before judges as if they were real (this happened in Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others [2025] ZAKZPHC 2, and Northbound Processing (Pty) Ltd v The South African Diamond and Precious Metals Regulator (2025/072038) [2025] ZAGPJHC 661).

Makunga had no lawyer. His attorneys had walked away from his case in 2020, leaving him to represent himself in the Western Cape high court. He filed his own written legal arguments (what lawyers call heads of argument), and those arguments cited case law extensively. When he was asked how he had prepared them, he said he had relied on Google. The lawyer in the matter expressed disbelief. So did others in the profession.

The judge, Bishop AJ, noted that he had seen worse arguments from trained advocates, and added something that has since attracted considerable attention: “Lawyers need to watch out for artificial intelligence. One day soon, the computers are coming for our jobs.”

What the Makunga case actually showed (though it is something the judge did not spell out) is that technology gave an unrepresented person the tools to participate in a high court hearing on something close to equal footing.

Makunga won his point, and technology helped him to do that. Anyone who has ever tried to navigate a legal system without money for a lawyer will realise that this is not a small thing.

A different story

Yet technology, in different hands, tells a different story entirely. In the KwaZulu-Natal Division of the High Court, a legal team filed a supplementary notice of appeal that cited several cases, not all of which exist.

The judge tested ChatGPT herself and found it “blatantly incorrect” on the very cases at issue. The lawyers were referred to the body that regulates the legal profession. In the Gauteng Division of the High Court, the same problem surfaced in an urgent application where fictitious cases, cited as real, were submitted in heads of argument. The judge again referred the lawyers for investigation. No excuse was accepted. Neither the urgency of the application nor the sincerity of the apology was sufficient.

The Mavundla and Northbound cases were not honest mistakes in the way that misreading a case or misquoting a judgment might be. The cases simply did not exist. AI had invented them, and they had been placed before courts as genuine authority without anyone checking whether they were real.

The South African court in Northbound adopted the English position that lawyers have a professional duty to verify what AI produces before it goes anywhere near a court. That duty is not mitigated by the pressure of time, good intentions, or the fact that the error was unintentional.

Here is the problem. Makunga appears to have used technology to help him think and write. He checked his work, or at least nothing in his work was found to be fabricated. The lawyers in the other cases used AI to generate citations and then filed them without verification.

Two realities

What the courts have not yet worked out is how to respond to both realities at once. If the answer to fictitious AI-generated cases is to ban or heavily restrict the use of AI (or advanced technologies) in legal practice or the justice system, the cost is borne disproportionately by people such as Makunga: people without lawyers, without legal training, without money for research databases, for whom technology may be the only practical means of preparing an argument.

A blanket prohibition protects the integrity of court records, but it does so partly by keeping unrepresented litigants out of the conversation.

What is clear is that the courts cannot carry this indefinitely. Each time a judge uncovers a fabricated citation, the court must investigate, adjourn, issue costs orders, and refer lawyers to the regulatory body. That is time and resources taken from other litigants, in a justice system that is already under pressure. This has a direct bearing on the administration of justice and on people's ability to receive it.

Bishop AJ’s remark about the computers coming for lawyers’ jobs was half-joking. But it pointed at something real.

AI is already in the justice system. The question is not whether it belongs there. It is whether the legal profession will decide, before the next invented case reaches a judge, whose interests it is there to serve. DM

Michele van Eck is Associate Professor, School of Law, University of the Witwatersrand.
