Ever played two truths and a lie?
This morning, while I boiled the kettle, I scrolled through my newsfeed and learnt that sharks in the Bahamas are taking drugs, that purple foods can prevent microplastics from entering your blood, and that “selfish chromosomes” destroy rival sperm. Three big learnings, all in the minute it took the water to boil!
Unfortunately, only two of them were true.
Which was the lie? It would take some effort to find out.
Perhaps that is not surprising, given a recent study warning that “fake research is now spreading faster than real science”. But even without blatant science fraud, information no longer reaches the public filtered through the safety mechanisms we have come to rely on: news editors who subscribe to codes of good practice, and skilled journalists who are experts at getting to the heart of a matter despite complexity and nuance. (Some 500 journalists were recently laid off at the Washington Post, and many more have been lost in recent years, locally and globally.)
No, it floods in through AI-driven algorithmic feeds where truth and fake news travel a little too closely together for my liking, or for public safety.
The problem is no longer access to information, but the collapse of rigour.
By the time the kettle clicks off we have absorbed a dozen assertions, reacted to half of them and verified none. All this “AI slop” is not harmless. Beyond the memes, it is dense with biased claims and often commercially motivated narratives that seep into our subconscious and gradually inform our behaviours.
It reminds me of “infodemiology”, a field the World Health Organization popularised during Covid to study how “too much content” in disease outbreaks leads to harmful health outcomes, including misinformation amplified and distorted by AI.
What became clear then is now evident in any field: an abundance of content does not equal an abundance of truth.
As researchers and trained knowledge workers, we find the active spread of misinformation deeply concerning. But we need not be passive observers. We are custodians of credible knowledge, and we hold the reins in making sure it finds its way to where it’s needed.
However, in an AI-driven landscape, we must acknowledge the technology’s drawbacks before we benefit from its efficiencies – particularly in the African context.
Ever heard of algorithmic apartheid?
AI systems are largely developed and deployed by for-profit giants, few of which boast African ownership or sensitivity to our beautiful histories and cultures of knowledge.
Recent high-profile resignations from major AI companies over ethical concerns, including the testing of advertising within chatbots, reveal that corporations with huge marketing budgets rule our gateways to knowledge. Hard to swallow.
The Human Sciences Research Council recently warned that “in South Africa – where 10% of the population owns [86%] of the wealth – AI could further entrench this imbalance”. Another study on Artificial Intelligence: A New Global System, a Genocidal Project or the Revival of the Tokoloshe Culture? cautions that AI may erode African philosophical values like ubuntu.
“Ubuntu emphasise(s) shared humanity, while AI systems are often driven by individualism and profit maximisation. Replacing community-based workers in healthcare or social services with automated systems may boost efficiency, but it risks weakening the relational bonds of trust and reciprocity that are central to African cultural life.”
And AI systems trained on Western-biased datasets will tend to exclude or misrepresent African languages and lived realities. These biases can manifest as what some scholars call “algorithmic apartheid”, where automated systems reinforce marginalisation through predictive policing, welfare allocation or credit scoring.
Clearly, getting involved in how scientific information flows through our society is no longer optional – it is integral to scientific practice and good stewardship of African innovation. So, what does that responsibility look like?
Five ways for scientists to cut through the AI sludge
1 Prioritise clarity without oversimplification
These days, complex arguments compete for attention as 200-character posts. Researchers need to know their audiences and make knowledge snappy and relevant to them. But we must avoid the flattening of nuance – that is, get the point across quickly and sharply, without losing robust details. Jive Media’s The Art of Research training programme teaches this technique to researchers every day, helping define their audiences so they know who they’re talking to, and how they like to be engaged.
2 Institutions must actively support communication
Universities and research councils have a duty to equip scientists with communication training, funding and strategic partnerships that help them make their efforts heard. Structured programmes such as FameLab – a robust (yet fun) comms training programme for young researchers – prove how effective science communication can be when researchers are enabled to speak compellingly and accurately about their work.
3 Embrace two-way engagement
Knowledge is only truly “useful” when it enters the world of humans and is tested, wrestled with and refined. AI is not doing this. It is simply reprocessing information and matching what it comes up with to what already exists.
So, participatory engagement through workshops, community-based studies, and showing up wherever science engages society can give human researchers the upper hand, and improve public trust. So can sharing outcomes from these spaces: the products of real human interactions, not AI-reprocessed words.
4 Be very, very visible
An interesting study reported in the Guardian noted that “a mere 0.1% of users share 80% of fake news. Twelve accounts – known as the ‘disinformation dozen’ – created most of the vaccine misinformation on Facebook during the pandemic. These few hyperactive users produced enough content to create the false perception that many people were vaccine hesitant.”
The more power we give to a few, the more divided our knowledge systems become. So, scientists must become more visible, not less. We need more accurate voices out there to combat, balance and shape the discourse. Opportunities for visibility are everywhere – from your own social media platforms and traditional news sources, to platforms like Science Spaza that provide an innovative way to share your research with the next generation.
5 Join the front lines of emerging disciplines
A recent study published in Nature described a “tool revolution”, noting that new fields of research arise in response to the invention of new tools: when particle detectors arrived, the field of high-energy physics was born. That same scholarly engagement is now needed to address the ever-evolving tools of AI and social media, and their impacts.
If we don’t shape these powerful systems, others will – often without commitment to rigour, equity or cultural integrity. So, let’s get stuck in.
Credibility can no longer be assumed.
It must be actively constructed, and continually defended.
An African seat on the AI roller coaster is essential not only to aid global innovation, but to ensure we are represented accurately and empowered equitably within digital systems, with our cultural, ethical and moral values upheld.
We’ve been through too much as a community not to take AI seriously – both its opportunities and its risks.
And as scientists, we are best positioned to share real solutions – ones that don’t follow the “developed” route of destroying people and the planet as we go.
There is an attentive audience waiting – one constantly opening up their phones and browsers to new ideas and solutions. In this era of infodemics, scientific excellence must become the signal that rises above the noise.
P.S. Sharks really are ingesting drugs in the Bahamas. And selfish chromosomes really do destroy rival sperm – the University of Utah proved it. DM
Robert Inglis, co-founder and director of communications agency Jive Media Africa, explores how researchers can help to keep credibility and rigour alive in an AI-driven world where misinformation often wins out over excellence in the race for our attention.