Unfriended: The truth behind Facebook’s non-appearance before Parliament


Phumzile van Damme is a former MP, a misinformation expert and a communications strategist.

Facebook has issued a statement denying that it had refused to appear before the committee. It had only ‘postponed’ the meeting because the committee had not invited other tech giants. This is a blatant untruth.

In the aftermath of the storming of the US Capitol on 6 January, much was written about the role of social media platforms Facebook, Twitter and YouTube in amplifying misinformation about the US election results and the violence that ensued.

I was struck by the words of Sarah Miller, the director of the American Economic Liberties project and a member of Joe Biden’s transition team, in an interview with the BBC later that month. 

She said, “Facebook is broadly seen as the most prominent villain among all the tech monopolists.” 

A tweet by Biden’s deputy head of communications, Bill Russo, emphasised the point: “If you thought disinformation on Facebook was a problem during our election, just wait until you see how it is shredding the fabric of our democracy in the days after.”

Two months later, Facebook CEO Mark Zuckerberg appeared before the US House subcommittees on Communications and Technology and on Consumer Protection, alongside the CEOs of Twitter and Google. There was rare consensus among legislators on both sides of the aisle that social media companies needed far greater regulation. Amending Section 230 of the US Communications Decency Act gained traction. The section gives platforms the ability to moderate the content posted on them and insulates them from liability for what users share or post. Without Section 230’s protection, companies could be held liable for anything that their users post.

The words of Mike Doyle, the chair of the House subcommittee on Communications and Technology, reflected the views of experts in the new and constantly evolving field of combating misinformation:

“You can take this content down. You can reduce the vision. You can fix this. But you choose not to. You have the means. But time after time, you are picking engagement and profit over the health and safety of users.”

And it is not just in Washington DC that Facebook is under fire.

In Europe, the changes to the privacy policy of Facebook-owned WhatsApp have been met with major resistance. Germany has banned Facebook from processing WhatsApp users’ personal data and is seeking an EU-wide ban.

It was within this climate, one of acrimonious relationships with governments and legislators across the world, that I asked my fellow members of the Communications and Digital Technologies Committee to invite Facebook to Parliament. This would be an ahead-of-the-curve opening of the door to constructive engagement, in the hope of encouraging Facebook to present a South Africa-focused plan to tackle misinformation and to explain the steps it would take to limit the harvesting of private data.

I found it particularly important, and intended to raise the point, that Facebook provide content moderators covering all 11 official languages in the lead-up to this year’s local elections. Facebook largely relies on artificial intelligence (AI) to moderate content. AI is often unable to detect nuance in posts, and certainly not in South Africa’s languages other than English.

Within an hour of the committee agreeing to invite Facebook, I received a phone call from a man who informed me that he was from a company lobbying for Facebook. I was annoyed, and told him that I did not appreciate Facebook thinking it could hire lobbyists to spin on its behalf. If Facebook had anything to say to me, it should contact me directly. He assured me Facebook would present itself to the committee and would be in touch.

The weeks that followed were tense. I feared Facebook would pull out.

In the intervening time, I was interviewed by Nick Cowen about my reasons for requesting the meeting with Facebook. I mentioned that I would like to meet Facebook executives to reassure them that the aim of the meeting was to build a constructive relationship, not to conduct an inquiry.

A few days later I received a terse email refusing any engagement, from Nomonde Gongxeka-Seopa, Facebook Southern Africa’s head of public policy. Gongxeka-Seopa had previously served on the Independent Communications Authority of South Africa Council, and I had grilled her on several occasions. I knew the Facebook ship was sailing.

A few days before the meeting with Facebook, set for 25 May, I received a phone call from the chairperson of the committee, Boyce Maneli, informing me that Facebook had pulled out. Facebook’s concern, he informed me, was that the committee had not invited other tech giants to provide similar briefings. My heart sank. But, he told me, Google Africa had responded positively to the committee’s request and had even submitted its presentation, and this had been communicated to Facebook to ease its worries.

The following day, despite being informed of Google’s attendance, Facebook still refused to appear before the committee.

This week, Facebook issued a statement denying that it had refused to appear before the committee. It had only “postponed” the meeting because the committee had not invited other tech giants. This is a blatant untruth. Facebook had been informed of Google’s attendance and despite this knowledge it still withdrew.

My sources told me the withdrawal was on instruction from headquarters, given the worldwide resistance to Facebook’s updated WhatsApp privacy policy and, at home, the South African Information Regulator’s consideration of litigation against Facebook. It feared having to answer tough questions.

Facebook’s backhand to Parliament, and by implication the people of South Africa and the greater continent, is an act of unprovoked self-mutilation, to use Prince Mashele’s famous description of a gun-aimed-at-feet situation. Perhaps with the looming international backlash it would face for its contempt of the South African Parliament in mind, Facebook issued a press statement “clarifying its position”. A day late and a dollar short.

The decision to withdraw feeds into the narrative that Facebook considers itself above reproach, and will add impetus to calls for its regulation. I am perhaps becoming one of the few voices in the misinformation field still standing in opposition to government regulation of social media platforms. I err on the side of caution, against giving the government any power to regulate social media, as this could open the door to the limitation of freedom of speech.

I will admit some idealistic naivety in my belief that self-regulation can still work if improved. It would require platforms to tweak social media algorithms to prevent the amplification of harmful misinformation, and to better protect users’ private data.

I stand at the precipice, willing to consider suggestions of regulation based on international best practice.

I intend to continue to lend my voice and expertise to efforts to combat misinformation on the global level as well as at home. In South Africa, the effects of Bell Pottinger live on, and I do not intend to stand on the sidelines. A team is being assembled to tackle misinformation during the election to ensure a free flow of ideas without manipulation of the public discourse.

As Apple CEO Tim Cook put it, “We should not look away from the bigger picture. In a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement, the longer, the better, and all with the goal of collecting as much data as possible.”

Google’s openness should serve as an example to other Big Tech companies in South Africa. It is far more desirable to operate hand in hand rather than in a litigious and acrimonious space. DM


Comments

  • Well said! Regulating social platforms will lead to a slippery slope into communism-style control, and yet the impact of misinformation / blatant fake news cannot be ignored. Please keep up the good fight Phumzile.

  • Facebook are coming under enormous pressure. Considering the role they play in negatively affecting people’s mental health and how misinformation “educates” people, they need to be held accountable.

  • The unsocial side of social media is a real problem in that it allows a few to spoil its potential important role in communications. These few use social media to boost their flimsy egos as they are unwilling to meet those they abuse, face to face. I am not even referring to the darker side of social media and the internet. Good regulation is beneficial if it can be enforced – sadly our country’s enforcement agencies are generally not up to scratch. So personally I reduce social media to the bare minimum, mainly with family. Perhaps freezing these giants out by not using them will make a difference!

  • The platforms make money from advertising, just like publishers do. Hold them to the same legal liability standard.

    Too hard, too much work, will cost too much? Shame, I thought these were fourth industrial revolution high technology companies…

  • I found the article frank and concise. Less politicking speak & just statements of facts & factors. However, I believe there has to be a middle road whereby there’s self-regulation coupled with government oversight of that self-regulation including the powers to force things to happen much quicker in this context.

  • Please stay on this urgent matter, Ms van Damme. And remember, we need people like you in this beloved but wobbly country.

  • Should we be taking it as natural and inevitable that the ‘social media’ should be run for profit? Perhaps we can start imagining what it would be like if they were regarded as a ‘commons’.

  • This meeting should be pursued. Too many of our residents do not understand the technology, the use of their personal data, sanctioned or not. We are becoming more exposed to these giant conglomerates where there is a complete lack of competition, and the PR machine spins tales that are blatantly untrue. The one thing we have also forgotten is the advent of POPIA on 1 July 2021. Far more intense than GDPR and again, left too late or neglected by the majority of organisations. Is this yet another case of we simply don’t believe that the regulator will be able to enforce his regulation and so we carry on regardless?

  • It is virtually impossible to “regulate” private opinion. This is the space in which media manipulation [read Bell Pottinger] happens. The disease is built-in. It is in the private-opinion space where the line between fact and opinion gets blurred. It is also the space that can be intentionally abused to manipulate the gullible, as we saw with the Analytica debacle, the Russian bot-meddling in various elections, the Bell Pottinger saga, etc etc. These are certainly not accidents or unfortunate incidents or aberrations, and they must be called what they are: blurring a clear definition of fact for the intentional purpose of brazen political manipulation.
