The latest annual report of the Internet Watch Foundation (IWF) paints a stark picture. In 2024 alone, the IWF identified and removed more than 275,000 webpages containing child sexual abuse material, the highest number since records began.
Alarming, too, is that more than 90% of the child sexual abuse webpages identified showed children aged 13 and younger — children whose digital lives had barely begun before they were exploited.
This is not a distant problem. Africa’s digital transformation is connecting millions of young people every year, many of them accessing the internet for the first time on a mobile phone. With this progress comes new exposure to risks — not just from harmful interactions with strangers, but from the proliferation of abuse material that travels across borders at the speed of a click.
The most effective ways to disrupt this cycle are designing safe platforms, using hash matching to detect known child sexual abuse imagery, and building robust reporting mechanisms that are trusted, accessible and integrated into the platforms people use every day.
Reporting suspected child sexual abuse imagery means harmful material can be identified quickly and dealt with before it spreads further. Even people who stumble across such imagery by accident can play an important role in detecting criminal content.
Removal of this content at scale through international cooperation is key to disrupting criminal operations and ensuring that those who would profit from the abuse of children have nowhere safe to hide. By working closely with organisations like the IWF, platforms can keep their own web spaces safe.
This work also has an important role to play in the healing process for survivors. Removing child sexual abuse imagery lets victims know that what was once shared without consent can be taken down, reducing long-term trauma.
The knowledge that images or videos of them being abused and at their most vulnerable are in circulation online can have a seriously detrimental effect on those who have suffered abuse.
We need to see companies, regulators and civil society align to make the reporting of such pages visible and easy for users. In far too many African markets today, children and parents simply do not know where to turn for help.
This is why MTN has partnered with the IWF to embed reporting portals across its footprint, enabling millions of customers to raise the alarm in a safe and confidential way. But one operator is not enough.
To create a fundamental shift, more mobile operators, internet service providers (ISPs) and tech companies must join forces with the IWF and similar organisations. The logic is simple: the wider the reporting net, the harder it is for abusive material to spread online.
Initiatives like MTN and MTV Base’s Room of Safety demonstrate how awareness-raising campaigns can spark conversations among young people and caregivers. But awareness must be coupled with action: robust and scaled reporting systems that give those conversations practical impact.
For Africa, the stakes are high. The Global System for Mobile Communications Association (GSMA) forecasts that by 2030, half a billion children under 18 will be online across the continent. Without robust reporting systems, the same connectivity that unlocks opportunity could also magnify risk.
Urgent steps must be taken to ensure that the safety of young people interacting with technology is treated not as optional, but as an absolute imperative.
We need to scale reporting portals across telcos, platforms and ISPs, making them visible and accessible in every market.
Another necessary intervention is embedding obligations in policy and regulation, so that content moderation and reporting are not optional but a standard requirement for digital service providers.
We must also invest in awareness and digital literacy, so that young people, parents and educators know how to report harmful content and understand the consequences of failing to do so.
It is also important to align with global initiatives such as WeProtect Global Alliance and IWF’s own international reporting networks to ensure African voices are part of global solutions.
The spread of child sexual abuse material is not just a regulatory issue or a corporate risk. It is a profound human crisis that inflicts long-term scars on children and societies. It is also a problem we can solve together — with the right partnerships, strong reporting systems, and a commitment to putting children’s safety at the heart of Africa’s digital future.
The lesson is clear: connectivity without protection is not progress. Only by combining universal access with universal safety nets — including robust reporting — can we ensure that Africa’s digital transformation empowers children rather than exposes them. DM

