Maverick Life

THE CONVERSATION

Algorithms are moulding and shaping our politics. Here’s how to avoid being gamed

Protestors march against South African president Jacob Zuma in Cape Town, South Africa, 7 August 2017. The protestors, from the 'Unite Behind' coalition of civil society organisations and religious leaders, were calling for President Zuma to step down ahead of the motion of no confidence voted on in parliament on 8 August 2017. EPA/NIC BOTHMA

The political content in our personal feeds not only represents the world and politics to us. It creates new, sometimes “alternative”, realities.

In 2016, evidence began to mount that then-South African president Jacob Zuma and a family of Indian-born businessmen, the Guptas, were responsible for widespread “state capture”. It was alleged that the Gupta family influenced Zuma’s political appointments and benefited unfairly from lucrative tenders.

The Guptas began to look for a way to divert attention away from them. They enlisted the help of British public relations firm Bell Pottinger, which drew on the country’s existing racial and economic tensions to develop a social media campaign centred on the role of “white monopoly capital” in continuing “economic apartheid”.

The campaign was driven by the power of algorithms. The company created more than 100 fake Twitter accounts run by bot software – computer programs designed to perform tasks ranging from the simple to the complex; in this case, simulating human behaviour by liking and retweeting tweets.

This weaponisation of communications is not limited to South Africa. Examples from elsewhere in Africa abound, including Russia currying favour in Burkina Faso via Facebook and coordinated Twitter campaigns by factions representing opposing Kenyan politicians. It’s seen beyond the continent, too – in March 2023, researchers identified a network of thousands of fake Twitter accounts created to support former US president Donald Trump.

Legal scholar Antoinette Rouvroy calls this “algorithmic governmentality”. It’s the reduction of government to algorithmic processes as if society is a problem of big data sets rather than one of how collective life is (or should be) arranged and managed by the individuals in that society.

In a recent paper, I coined the term “algopopulism”: algorithmically aided politics. The political content in our personal feeds not only represents the world and politics to us. It creates new, sometimes “alternative”, realities. It changes how we encounter and understand politics and even how we understand reality itself.

One reason algopopulism spreads so effectively is that it’s very difficult to know exactly how our perceptions are being shaped. This is deliberate. Algorithms are designed in a sophisticated way to override human reasoning.

So, what can you do to protect yourself from being “gamed” by algorithmic processes? The answers, I suggest, lie in understanding a bit more about the digital shift that’s brought us to this point and the ideas of a British statistician, Thomas Bayes, who was born more than 300 years ago.

How the shift happened

Five recent developments in the technology space have led to algorithmic governmentality: considerable improvements in hardware; generous, flexible storage via the cloud; the explosion of data and data accumulation; the development of deep convolutional neural networks and sophisticated algorithms to sort through the extracted data; and the development of fast, cheap networks to transfer data.

Together, these developments have transformed data science into something more than a mere technological tool. It has become a method for using data not only to predict how you engage with digital media but to preempt your actions and thoughts.

This is not to say that all digital technology is harmful. Rather, I want to point out one of its greatest risks: we are all susceptible to having our thoughts shaped by algorithms, sometimes in ways that can have real-world effects, such as when they affect democratic elections.

Bayesian statistics

That’s where Thomas Bayes comes in. Bayes was an English statistician; Bayesian statistics, the dominant paradigm in machine learning, is named after him.

Before the rise of Bayesian methods, computational processes relied on frequentist statistics. Most people have encountered this method in one way or another – for example, in estimating how probable it is that a coin will land heads rather than tails. This approach starts from the assumption that the coin is fair and hasn’t been tampered with. This is called a null hypothesis.
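To make the frequentist picture concrete, here is a minimal sketch (my own illustration, not from the article) that computes the two-sided p-value for observing 60 heads in 100 flips under the null hypothesis of a fair coin:

```python
from math import comb

# Null hypothesis: the coin is fair (p = 0.5).
# We observe 60 heads in 100 flips and ask: if the null
# hypothesis were true, how surprising would this result be?
n, heads, p = 100, 60, 0.5

def binom_pmf(k, n, p):
    # Probability of exactly k heads in n flips of a coin with bias p
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Two-sided p-value: probability of a result at least as far from
# the expected 50 heads as the one observed (<= 40 or >= 60 heads).
p_value = sum(binom_pmf(k, n, p) for k in range(n + 1)
              if abs(k - n * p) >= abs(heads - n * p))

print(round(p_value, 3))  # ~0.057: not quite enough to reject fairness at the 5% level
```

The frequentist conclusion is a statement about the data given the assumed hypothesis; the hypothesis itself is never assigned a probability.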

Bayesian statistics does not require a null hypothesis; it changes the kinds of questions asked about probability entirely. Rather than assuming a coin is fair and measuring the probability of heads or tails, it asks us to consider whether the system for measuring probability is fair. Instead of assuming the truth of a null hypothesis, Bayesian inference starts with a measure of subjective belief, which it updates as more evidence – or data – is gathered in real time.
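The same coin can be treated in Bayesian terms. The sketch below (an illustrative Beta-Bernoulli update, not taken from the article) starts with a flat prior over the coin’s bias and revises that belief after every flip:

```python
# Beta-Bernoulli update: start with a prior belief about the coin's
# bias and update it flip by flip. Beta(1, 1) is a flat prior
# ("no idea"); each head adds 1 to alpha, each tail adds 1 to beta.
alpha, beta = 1.0, 1.0  # prior: uniform over all possible biases

flips = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # 1 = heads, 0 = tails
for f in flips:
    alpha += f
    beta += 1 - f

# Posterior mean: the updated belief about P(heads)
posterior_mean = alpha / (alpha + beta)
print(round(posterior_mean, 2))  # 0.75 after 8 heads in 2 + 8 = 10 flips
```

Here the output is a belief about the coin itself, continuously revised as data arrives – exactly the updating behaviour the article describes.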

How does this play out via algorithms? Let’s say you heard a rumour that the world is flat and you do a Google search for articles that affirm this view. Based on this search, the measure of subjective belief the algorithms have to work with is “the world is flat”. Gradually, the algorithms will curate your feed to show you articles that confirm this belief unless you have purposefully searched for opposing views too.

That’s because Bayesian approaches use prior distributions, knowledge or beliefs as a starting point of probability. Unless you change your prior distributions, the algorithm will continue providing evidence to confirm your initial measure of subjective belief.

But how can you know to change your priors if your priors are being confirmed by your search results all the time? This is the dilemma of algopopulism: Bayesian probability allows algorithms to create sophisticated filter bubbles that are difficult to discount because all your search results are based on your previous searches.
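As a rough illustration of this feedback loop, the toy simulation below (entirely hypothetical – the recommender, the two viewpoints and the click model are invented for illustration) shows how a single seeded search can lock a naive Bayesian recommender onto one side:

```python
import random

random.seed(0)

# Toy recommender: keeps a Beta prior over how likely you are to
# prefer "flat earth" content over the opposing view, and serves
# whichever side the current posterior favours.
alpha, beta = 1.0, 1.0  # start neutral

def serve():
    # Serve the viewpoint the model currently believes you prefer
    return "flat" if alpha / (alpha + beta) >= 0.5 else "round"

# One initial "flat earth" search is recorded as evidence,
# seeding the loop ever so slightly toward that view.
alpha += 1

shown = []
for _ in range(20):
    item = serve()
    shown.append(item)
    if random.random() < 0.7:  # users mostly click what they are served
        if item == "flat":
            alpha += 1
        else:
            beta += 1

print(shown.count("flat"))  # 20: every item shown confirms the seeded belief
```

Because opposing content is never served, the prior can only ever be confirmed – a crude version of the filter bubble described above.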

So, there is no longer a uniform version of reality presented to a specific population, like there was when TV news was broadcast to everyone in a nation at the same time. Instead, we each have a version of reality. Some of this overlaps with what others see and hear and some doesn’t.

Engaging differently online

Understanding this can change how you search online and engage with knowledge.

To avoid filter bubbles, always search for opposing views. If you haven’t done this from the start, do a search on a private browser and compare the results you get. More importantly, check your personal investment. What do you get out of taking a specific stance on a subject? For example, does it make you feel part of something meaningful because you lack real-life social bonds? Finally, endeavour to choose reliable sources. Be aware of a source’s bias from the start and avoid anonymously published content.

In these ways, we can all be custodians of our individual and collective behaviour. DM/ML 

This story was first published in The Conversation. 

Chantelle Gray is a Professor in the School of Philosophy at North-West University.

