

This article is an Opinion, which presents the writer’s personal point of view. The views expressed are those of the author/authors and do not necessarily represent the views of Daily Maverick.

When speed becomes the enemy of truth: why societies are losing a new information war

In an age where information floods our senses at lightning speed, societies face a critical challenge: the information rate problem. In a nutshell, the velocity of misinformation today disrupts institutions, erodes trust and undermines democracy. The lesson is that understanding and managing this velocity is vital for the survival of truth.

For most of human history, information travelled slowly. It was slow enough to be filtered by institutions; slow enough to be debated, contested, absorbed; and slow enough to allow meaning to form before reaction was required.

That slowness was a feature of historical progress. As societies generated more knowledge, innovations evolved in step to control how and where information was distributed. The town crier, the printing press, the newspaper editor, the nightly broadcast bulletin – each acted as a throttle on the rate at which information entered society.

These innovations were not guarantors of truth. Rather, each acted – in its time – as a governor on speed.

Today, it is not uncommon to hear that societies, especially democracies, are threatened by both misinformation (false, inaccurate or misleading information that is spread regardless of whether there is intent to deceive) and disinformation (false or misleading information deliberately created and spread with the specific intent to deceive, manipulate or cause harm).

It is not, however, the mere fact of encountering false information that creates serious social and political challenges. The problem is its speed. I call it the information rate problem: when the rate of information exceeds the rate at which humans and institutions can metabolise it, societies falter.
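The information rate problem can be pictured as a simple queue. The sketch below is purely illustrative – the function name and parameters are hypothetical – but it captures the core claim: when claims arrive faster than an institution can verify them, the unverified backlog grows without bound.

```python
# Toy model: an "institution" verifies up to `capacity` claims per time
# step, while `arrival` new claims enter per step. (All names and
# numbers are illustrative, not a model of any real institution.)

def backlog_after(steps, arrival, capacity):
    """Return the unverified backlog after `steps` time steps."""
    backlog = 0
    for _ in range(steps):
        backlog += arrival                 # new claims enter the system
        backlog -= min(backlog, capacity)  # the institution verifies what it can
    return backlog

# Below capacity: the system keeps up and the backlog stays at zero.
print(backlog_after(100, arrival=8, capacity=10))   # → 0
# Above capacity: the backlog grows without limit; truth "cannot keep up".
print(backlog_after(100, arrival=12, capacity=10))  # → 200
```

The point of the toy model is the threshold: below capacity, delay is bounded and temporary; above it, no amount of good faith prevents the queue of unresolved claims from growing.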

This was always a risk – but not always a crisis

False information is as old as politics itself. Rumour, myth, propaganda and manipulation are not new. During the 20th century, propaganda was central to two of the most evil systems invented: Nazism and Stalinism.

What is new about misinformation in this century is scale multiplied by velocity.

For most of history, information shocks were localised. A false rumour might destabilise a court, a city, a kingdom. But notwithstanding the examples above, it rarely overwhelmed a whole society.

Even the invention of the printing press in the 15th century, often cited as a revolutionary rupture, did not break the system. Why? Because distribution was still slow, literacy was uneven, and institutional buffers adapted.

The first tipping point: the internet (1995-2005)

The internet removed the physical constraints on distribution. Suddenly, information could travel globally, instantly, and at near-zero marginal cost. But for a time, this did not feel destabilising. Why?

Because early internet content was still curated by friction. And this “friction” was manifest in various forms: limited bandwidth; technical barriers; specialist communities; and delayed feedback loops.

The information rate increased – but not catastrophically.

The second tipping point: social media (2009-2016)

Social media did something fundamentally different. It collapsed production, amplification and consumption into a single feedback loop. This loop was optimised not for truth, but for engagement.

At that moment, the information rate ceased to be an emergent property of technology and became an explicitly engineered variable. The resultant system created a perverse set of rewards: outrage over nuance; speed over verification; and volume over coherence.

It was at this point that weaknesses in human cognition were brutally exposed. We didn’t fail morally, but rather thermodynamically: our processing capacity to discern meaning at this rate was insufficient.

The third tipping point: flooding the zone

This brings us to the present.

Steve Bannon, the far-right US political strategist who helped Donald Trump win the presidency in 2016 and remains an influential figure among his supporters, aptly described the strategy by which information can be used to overwhelm societies in the service of political ends: “flooding the zone with s**t”.

This strategy is not primarily about persuasion. It is about overload.

In overloading the system, you create a number of debilitating dynamics: when everything is contested, nothing can be resolved; when information arrives faster than it can be verified, truth becomes irrelevant; and when every signal is drowned in noise, the system loses coherence.

What makes the current moment uniquely dangerous is not merely that this strategy exists, but that it is being deployed by actors at the very centre of the global information ecosystem.

The United States once functioned – however imperfectly or hypocritically – as a high-credibility anchor in the global information system. Its institutions acted as stabilisers. This stabilising role has weakened.

The US mediascape has become ground zero for conflicts over “facts”. But these fights between political figures, talk show hosts, podcasters and others are not just media spectacles. They are stress fractures in the epistemic infrastructure.

Each operates at a high information rate, but with radically different truth-filtering philosophies. The result is not pluralism – it is interference.

Why institutions fail under excess information rate

Institutions are not designed to process infinite throughput. Consider, for example, institutions with which we are all familiar, such as courts, parliaments, regulators, universities or newsrooms. Each relies on a common set of processes: sequencing, deliberation, evidence accumulation and procedural delay.

These are features, not bugs.

But when the external information rate exceeds institutional processing capacity, institutions appear “slow”, “corrupt” or “out of touch” – even when they are functioning exactly as designed.

At that point, public trust collapses not because institutions are lying, but because they cannot keep up.

Entropy, energy, and meaning

There is a useful parallel with entropy, a fundamental concept in science measuring disorder, randomness, or energy dispersal in a system.

Information and energy are linked.

High-entropy systems are noisy, disordered and incoherent. Order requires work. Meaning requires energy.

When information floods a system faster than energy can be applied to organise it, entropy rises. The system fragments, clusters form, extremes dominate.

Importantly, entropy does not care about intent. A system can collapse even if every participant believes they are acting in good faith.
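The entropy parallel can be made concrete with Shannon's measure. In this illustrative sketch (the distributions are invented for the example, not drawn from any data), a channel dominated by one well-sourced account has low entropy, while a “flooded zone” of many equally loud claims approaches maximum entropy – maximal disorder, minimal coherence.

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A coherent channel: one dominant, trusted signal plus minor dissent.
print(entropy([0.9, 0.05, 0.05]))  # ≈ 0.57 bits

# A flooded channel: ten equally loud, mutually contesting claims.
# A uniform distribution maximises entropy: log2(10) ≈ 3.32 bits.
print(entropy([0.1] * 10))
```

Flooding the zone is, in this framing, a move toward the uniform distribution: every claim made as loud as every other, so that the work of restoring order – verification, editing, context – costs more energy than the system can supply.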

Why this matters now

It should be obvious that, unless the information rate itself is addressed, things will only get worse.

But what is to be done? More censorship? A return to gatekeeping elites? Neither is likely to produce the desired result.

Suffice it to say, the first step in coming up with solutions that work and are sustainable is to recognise that there is a problem. An increased awareness of information rate is where I would start. But make no mistake: the threshold has now been crossed.

If democracy is to survive in a high-bandwidth world, it must rediscover mechanisms that strengthen our ability to determine truth. Among other functions, they should slow information where necessary; restore editorial function (human or machine); and privilege coherence over virality.

Ironically, only systems with non-human processing capacity – AI among them – may be able to act as effective information governors, reintroducing pacing, context and hierarchy into an otherwise chaotic flow.

What comes next depends on whether we learn to govern information not just by content, but by speed. DM

Dudley Baylis is a director of Bridge Capital, an independent M&A advisory, corporate finance, renewable energy and property advisory house.
