
RIGHT TO PRIVACY OP-ED

SA must take steps to protect vulnerable children and ensure their safety in the global digital space

(Photo: Unsplash / Annie Spratt)

Online platforms pose a range of risks to a child’s privacy, including the processing of personal data, online surveillance, and creating extensive digital profiles that record a child’s entire online presence, as well as exposure to inappropriate content.

The widespread availability of the internet and other digital technologies has greatly expanded the opportunities for children to communicate, share, learn and develop in ways that were almost impossible before. The internet has also provided a vast amount of information and resources for teaching and learning in a manner that enhances children’s social and cognitive development.

However, for all the opportunities digital technologies have provided, the proliferation of online platforms has made it easier for predators to target children, leading to a surge in reported cases of children being victimised. This trend highlights the urgent need to protect vulnerable children and ensure their safety in the digital space.

According to Unicef’s 2022 Disrupting Harm household survey for South Africa, 58% of the 1,639 children aged nine to 17 who were surveyed reported going online at least once a day; this was regardless of gender or whether they lived in urban or rural areas. 

The survey also revealed that a concerning number of internet-using children in South Africa had been subjected to sexual exploitation and abuse. Specifically, 9% of the surveyed children reported being offered money or gifts in exchange for inappropriate images or videos, 9% were approached for in-person sexual encounters, and 7% reported that their sexual images had been shared without their consent.

In addition, 7% of the children reported being threatened or coerced into inappropriate activities, further highlighting the grave risks online predators pose.

Due to their lack of awareness about potential online dangers and their limited skills to protect themselves, children are vulnerable to threats that can compromise their privacy, safety and security. Such threats can have negative impacts on their mental and physical well-being. With an increasing number of children gaining access to the internet, it is crucial to prioritise their safety and ensure that they can fully leverage the opportunities and benefits the online world offers.

This calls for urgent action to protect children from online threats and provide them with a secure online environment that fosters healthy development.

Online platforms pose a range of risks to a child’s privacy, including the processing of personal data, online surveillance, and creating extensive digital profiles that record a child’s entire online presence.

In addition, the use of biometric data, as well as existing risks such as online stalking and harassment, can further compromise a child’s privacy. Moreover, children are also at risk of exposure to inappropriate content, which can have severe and lasting effects on their emotional and psychological well-being.

Many online platforms and services that currently collect children’s data may not fully comply with relevant data protection and privacy laws. Additionally, some companies may not properly verify parental consent before collecting children’s data, or may collect data beyond what is necessary for the service provided.

Furthermore, even when data protection and privacy law violations are identified and monetary fines imposed, these penalties may not be significant enough to deter companies from engaging in such practices.

This can be due to several factors. First, some companies may be so large and profitable that the fines imposed on them are relatively small compared with their revenue. For these companies, the potential benefits of engaging in data protection violations outweigh the risks of being caught and fined. To some extent, such penalties may be just another cost of doing business.

Second, it can be difficult to accurately quantify the harm caused by data protection violations, particularly regarding children. The long-term effects of such violations of children’s privacy and security may not become apparent until years later, making it challenging to determine the appropriate compensation or fine.

Enforcement mechanisms

These challenges notwithstanding, we want to stress that effective enforcement mechanisms should still be established to ensure compliance with guidelines and regulations, while considering the cultural, social and economic factors that may affect the effectiveness of these regulations.

The UK has shown a good example of how to protect children’s rights online. Rules have been created to guide developers of websites and apps in designing platforms to be safe for children. These rules became known as the Children’s Code and came into use in September 2021. The code has 15 detailed rules that ensure that websites and apps do not do anything that might be harmful to children.

In South Africa, the government could consider developing a similar set of guidelines or regulations that apply to online services used by children under 18. This could involve consultation with stakeholders, civil society organisations and children themselves, with the goal of ensuring that the guidelines are appropriate for the South African context and take the needs and perspectives of local communities into account.


The guidelines or regulations could set out clear standards for online services to follow, such as providing age-appropriate privacy information, minimising data collection and sharing, and implementing default settings that prioritise the safety and well-being of children.

Age-appropriate design frameworks can contribute significantly to creating a safe and suitable digital environment for children and young people. This is especially crucial as technology becomes more prevalent in their lives. By incorporating such frameworks into digital transformation strategies, the government can ensure that digital services are designed with young users’ needs and capabilities in mind.


When designing and developing digital services for children and young people, designers and developers should prioritise the child’s best interests and ensure that the design of online services considers the intended audience’s age, maturity and capabilities. Online services must safeguard children’s privacy and personal data and comply with applicable data protection legislation.

Useful AI

In today’s interconnected world, where information flows across borders seamlessly, it is essential to address these challenges and establish a harmonised global approach for age-appropriate digital services. A harmonised global approach would provide a framework for countries to align their regulations, standards and enforcement mechanisms.

The rise of the Fourth Industrial Revolution (4IR) and the technologies around it, such as artificial intelligence (AI), can potentially improve the effectiveness of child protection efforts.


One key area where AI can be particularly useful is in identifying children at risk of abuse, neglect or exploitation. For example, machine learning algorithms can analyse large datasets of social media posts, online chats or other digital communications to detect patterns that may indicate potential abuse or exploitation.
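To make the idea concrete, the sketch below shows the basic shape of such a flagging system in Python. It is a deliberately simplified illustration: the phrase patterns, threshold and function names are invented for this example, and a real system would replace the keyword list with a trained machine-learning classifier operating on far richer signals, with every flag routed to a human moderator.

```python
# Toy illustration of risk-pattern flagging in chat messages.
# The patterns below are invented for this sketch; a production
# system would use a trained classifier plus human review.
import re

# Hypothetical grooming-risk patterns (illustrative only)
RISK_PATTERNS = [
    r"\bsend (me )?(a )?(photo|picture|pic)s?\b",
    r"\bdon'?t tell (your )?(parents|mom|dad|anyone)\b",
    r"\bour (little )?secret\b",
]

def risk_score(message: str) -> int:
    """Count how many risk patterns a message matches."""
    text = message.lower()
    return sum(bool(re.search(p, text)) for p in RISK_PATTERNS)

def flag_for_review(message: str, threshold: int = 1) -> bool:
    """Flag a message for human moderator review, never for automatic action."""
    return risk_score(message) >= threshold
```

The key design point, which carries over to the machine-learning version, is that the system only surfaces messages for human review; it does not act on children’s communications automatically.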

AI can moderate user-generated content to ensure that it is appropriate and does not contain harmful material, such as hate speech or cyberbullying. It can also personalise online services for children and young people based on age, interests and preferences, while ensuring that the content and features are age-appropriate and safe.

Deep learning models can be trained to recognise inappropriate content in images and videos, enabling automated detection and filtering. This technology has the potential to prevent children from accessing harmful or age-inappropriate content online.

Online platforms and apps can use the IEEE 2089-2021 Standard for an Age Appropriate Digital Services Framework, published by the Institute of Electrical and Electronics Engineers, to ensure that they collect and use children’s data only in ways that are age-appropriate and compliant with applicable laws and regulations. They can also use the standard to publish clear and transparent privacy policies for children, and to provide parents with a way to contact the organisation with any questions or concerns.

Using deep learning technology to help implement the IEEE 2089-2021 standard has the potential to create a safer and more enjoyable digital world for children.

It is important to note that incorporating AI into age-appropriate design frameworks must be done responsibly and ethically, considering potential biases and the need for transparency and accountability.

AI systems should also be regularly monitored and evaluated to ensure that they effectively promote the safety and privacy of children and young people online.

South African policymakers, researchers and educators could study frameworks developed by other countries, including goals, strategies and implementation plans. By analysing these frameworks, South Africa could identify best practices that have been successful.

For example, they could look at how other countries have addressed issues such as cyberbullying, online grooming and harmful content, and identify approaches that could be adapted to the South African context.

South Africa could also investigate how to integrate 4IR technologies into age-appropriate frameworks and standards. DM

Mduduzi Mbiza is a Research Associate, University of Johannesburg. Professor Saurabh Sinha is an electronic engineer and Deputy Vice-Chancellor: Research and Internationalisation, University of Johannesburg. The authors write in their personal capacity.
