Maverick Citizen


The onus to protect the 2024 elections is on social media giants, experts say

Several civil society organisations have called on social media companies to act against the spread of violence, disinformation and hate speech to protect the 2024 elections. (Photo: Wikipedia)

The 2024 general elections are gearing up to be a watershed moment in South African politics, so much so that many have dubbed 2024 the new 1994. With so much at stake, several civil society organisations have gathered to call on social media companies to act against the spread of violence, disinformation and hate speech to protect the polls.

The Legal Resources Centre (LRC) and Global Witness, an international campaign organisation focused on protecting human rights and the environment, are taking social media giants to task over the threat their platforms pose to democracy.

In 2024, South Africans will head to the polls to vote in the general elections, which many anticipate will be the most significant since 1994.

To this end, the LRC took to Constitution Hill in Johannesburg on Friday, 15 September – International Democracy Day – to launch its Year of Democracy Campaign, to hold social media giants accountable for ensuring free and fair elections. A panel of experts discussed the threats and opportunities created by different social media, a conversation moderated by Media Monitoring Africa (MMA), a nonprofit organisation that acts as a media watchdog in Africa.

The LRC and Global Witness believe social media companies should take more responsibility for the content posted on their platforms, which can undermine the smooth running of elections and thus democracy as a whole.

Social media’s potential to undermine democracy was seen in the US when more than 2,000 people stormed the Capitol Building in Washington on 6 January 2021. While the siege on the Capitol was sparked by a call by former president Donald Trump for protest after he lost the elections, social media posts quickly became the driving force behind the attempted insurrection.

The far right used social media to organise themselves, posting about which streets to use to avoid the police and the best tools to carry to pry open the building’s doors.

Naomi Hurst, the team lead on digital threats to democracy at Global Witness, said: “With the elections fast approaching, we think the onus is on big tech companies themselves to safeguard elections wherever they are in the world.”

Candidate attorney Kristen Abrahams sounded the alarm on the threat social media poses, especially when used with nefarious intentions. “When social media is put into certain hands, it becomes a tool for undermining democracy and democratic principles. We’ve had many examples in South Africa in particular. We don’t have to look far to see how social media is being used in a negative way.” 

July 2021 unrest

South Africans don’t need to look further than the unrest that erupted in KwaZulu-Natal and spread to Gauteng in July 2021 for evidence of how social media can be used to cause real-world harm. Dubbed the “social media-fuelled unrest”, the widespread looting, vandalism and violence were sparked by the arrest of former president Jacob Zuma and snowballed into a week-long national crisis. 

In February 2022, an expert panel appointed by President Cyril Ramaphosa to probe the unrest found that social media played a significant role in mobilising public violence.


The 154-page report said that in the build-up and during the unrest, X (formerly Twitter), WhatsApp, Facebook and other forms of media provided a fast, inexpensive and efficient way of spreading news and galvanising the public to unrest.

Hurst and Abrahams fear that with so much at stake in the 2024 elections, there is a significant possibility that social media will be used to create similar chaos across the country.

Content moderation policy pitfalls on social media

In light of the significance of the 2024 elections, the LRC and Global Witness conducted an investigation to test how platforms such as Facebook, TikTok, YouTube and X moderate content in the Global South, particularly South Africa.

The organisations tested the platforms’ ability to detect outrageous content that blatantly violated their policies on posting content that is hateful, spreads disinformation or has the potential to incite violence. Thirty-eight adverts were created in English, Afrikaans, isiZulu and isiXhosa, and included xenophobic hate speech targeting refugees and migrants in the country. The adverts were then submitted to the platforms and scheduled for publication.

The study found that YouTube and TikTok approved all 38 adverts, and none was flagged for using inflammatory language. On Facebook, only one advert was flagged, in English and Afrikaans.

Hurst said: “We think that the problems that arise from social media and the warping effect it has on our collective information ecosystem stem from big tech’s business model that they profit from engagement, and we know from academic research that the most engaging content is that which stokes fear and anger. So, big tech’s profit motive is toxic for democracy.”

While the test in South Africa was targeted at xenophobic sentiment, Global Witness and LRC are concerned about the implications for the elections of a lack of adequate content moderation.

2024 elections

Looters on Spine Road behind Pavilion Mall in Durban during the widespread unrest on 12 July 2021. (Photo: Gallo Images / Darren Stewart)

Global Witness handed the results of the investigation to the social media companies tested. These are the responses:

Meta (Facebook) said: “These ads violate our policies and have been removed. Despite our ongoing investments, we know that there will be examples of things we miss or we take down in error, as both machines and people make mistakes. That’s why ads can be reviewed multiple times, including once they go live.”


TikTok said hate has no place on the platform, that its policies prohibit hate speech and that ad content passes through multiple levels of verification before it’s approved.

Google did not respond.

Rekgotsofetse Chikane, a lecturer at the Wits School of Governance, pointed to the shortcomings of South Africa’s policy to regulate social media. He said the country usually falls into the trap of copying policy from regions like the EU, which have vastly different cultural and political landscapes.

“So, a really good example of this would be South Africa’s Protection of Personal Information Act, right? [The act] is a fascinating piece of legislation because if I put it through a Turnitin score as an academic paper, that Turnitin score would probably be like an 80% hit of the EU’s GDPR policy on data.”

Chikane said the government may fall prey to doing something similar when developing a policy to regulate social media companies, which may fall short of addressing concerns in the South African context.

The challenge to big tech

During the launch, Media Monitoring Africa revealed that it is working with the Electoral Commission of South Africa (IEC) to address disinformation in the build-up to the elections. It has lobbied social media platforms to sign on to a framework of cooperation, which Facebook parent company Meta and Google have already signed. While the framework is not legally binding, it has a clause about disinformation and misinformation online, through which the companies acknowledge the potential threat disinformation poses to the public’s ability to meaningfully participate in the electoral process, including the ability to remain publicly informed.

Read more in Daily Maverick: Civil society gears up for South Africa’s 2024 polls: Vote. Participate. Activate

By signing the framework, the tech giants agreed to work with the IEC to stem the tide of disinformation and the impact it may have on the elections.

Global Witness and the LRC have joined the new Global Coalition for Tech Justice and have challenged social media companies to equitably invest resources in safeguarding the integrity of the 2024 polls.

While several campaigns and public calls are in place to push social media companies to take more responsibility for their impact on democracies, Chikane acknowledged that there is no incentive for big tech companies to be responsive to public calls. He said that one way to force social media companies to act is to hurt their bottom line and target their profits, which in and of itself is undemocratic. DM

