
Opinionista

The curious case of blockchain in the dubious queue to prop up ‘Fourth Industrial Revolution’ mumbo jumbo


Dr Ian Moll is a research fellow at REAL (Centre for Researching Education and Labour) at the University of the Witwatersrand, having retired from the Division of Educational Information and Engineering Technology at the end of 2021. His interests lie in learning and pedagogy, the network society, and educational technology. His PhD is in cognitive science from the University of Geneva. He is a former visiting professor at the Universities of Makerere and Witwatersrand (where he was lead researcher in the Panafrican Agenda on the Pedagogic Integration of ICTs). His latest publication is ‘The myth of the Fourth Industrial Revolution: Implications for teacher education’, in Maringe (ed.), ‘Ideological disruptions in higher education’ (2021).

Advocates of the ‘Fourth Industrial Revolution’ have their work cut out to keep us all convinced. They do so by continually trotting out lists of the allegedly magical, transformative ‘technologies’ that supposedly prove the existence of the 4IR. But they are all 3IR technologies deeply rooted in the previous century.

We live in a time that most people call the Fourth Industrial Revolution (4IR). It is the “age of the smart machine”. Every aspect of our lives is supposed to be transformed as we get plugged into the ubiquitous, electronic, networked, computerised, information-driven, global universe. Soon, our biological selves are supposed to mesh with our digital selves, so that we arrive at what Ray Kurzweil calls the technological singularity.

Remember we are talking here about a revolution, not just a speeding up of the evolution of the computer age, which started around the end of World War 2.

The first digital computer, which we might call the ABC of computers (the Atanasoff–Berry computer), was invented in Iowa, US, in 1942. The first cyber-system came into being during the war, when Norbert Wiener designed technology to aim and fire anti-aircraft guns automatically. He specified it (and named it) in 1948 in his famous book, Cybernetics.

Artificial intelligence (AI) got going with Alan Turing’s 1950 paper “Computing Machinery and Intelligence”, which drew on his earlier Turing machine — not a machine, but a mathematical model, or algorithm, dating from 1936. By then, the computer age was fully under way. The internet appeared in 1969, when the ARPANET linked computers at various universities — the first two being at the Stanford Research Institute and UCLA — via leased telephone lines, making the first move towards what would become the World Wide Web (or simply, the Web). Digitised, networked computer technology has continued to evolve, more and more quickly, ever since.

However, researchers of various kinds (scientists, social scientists, historians of technology) are realising more and more that there is not much of a revolution in the “4IR”.

Despite all the hype pumped out by organisations like the World Economic Forum, there is not really any fundamental social and political change associated with the evolution of information and communication technologies.

It is easy to imagine a prize-winning robot proclaiming that she wants to end poverty and achieve world peace — perhaps Siri saying this as she exits our iPhones into her much more mobile bionic body; or a self-driving Uber vehicle being able to “read” the intentions of a Joburg taxi driver like a smart (human) Mzansi driver can. But we are light years away from anything like this, if ever it comes about.

Turing’s own work suggested, as far back as 1950, that these kinds of computers are probably impossible in principle, and we know now that robots — for all their positional accuracy, their dexterity and their ability to work for long hours without pay — are not capable of common sense.

Advocates of the “4IR” have their work cut out to keep us all convinced. They do so by continually trotting out lists of the allegedly magical, transformative “technologies” that supposedly prove the existence of the 4IR. There’s a kind of formula that they apply, which goes something like this:

  1. List between seven and 15 digital technologies that sound smart, make us feel outdated and leave us in awe of the future;
  2. Even if they are not of the 21st century, declare them to be so;
  3. Declare that there is amazing, unprecedented convergence between them; and
  4. Suggest that they produce changes that will disrupt and transform every part of our lives.

The list that they come up with usually includes the following popular “technologies”: artificial intelligence (AI), robots, machine learning, the internet of things (IoT), cyber-physical systems, 3D printing and blockchain. Yes, there’s blockchain!

What on earth is blockchain doing on this list? Blockchain is really boring.

The others are not boring. AI is about the digital brains that run in intelligent machines. No matter what generation we come from, there is remembered delight at the idea of AI — Isaac Asimov’s three laws of robotics, “Star Wars”, your robopet Poo-Chi, “Black Panther”, all cool stuff.

Or there is Siri in your phone, knowing just about anything, and soothing tired male souls. And other robots? Well, a lot of us are scared that the robots are coming for our jobs. Still, one only has to think of popular robotic movie heroes to realise how important the idea of robots is for those trying to mobilise us to join the 4IR.

Machine intelligence is what makes robots cool. The thinking, talking, walking, running, jumping, feeling machine is what the hype is all about.

The IoT is about dreams of ultimate comfort, in which everything in our world is connected in our service. The proverbial case is the digital alarm clock that wakes you up, switches on the coffee machine, runs a hot bath, checks the weather for the day and suggests an outfit from an inventory of clean clothes. At the same time it checks traffic patterns and plans your best route to work, reminds you of your appointments for the morning, boots up your favourite soundtracks to soothe you awake, prompts you to use your cellphone to choose your breakfast menu for the day, and lets your hard-working, early-rising maid in the kitchen know what she should start cooking, all connected via the internet. Networks and devices and data that we are deeply interested in.

Cyber-physical systems are the next generation of the IoT, with even cleverer AI running all the connections between things. But the language itself, “cyber-physical”, is enough to keep us interested when the 4IR prophets come around.

Even 3D printing is cool. I remember my daughter, on first hearing about it: “So it’s like Princess Leia or Obi-Wan Kenobi being printed in 3D in front of us. I want one!” She was thinking of 3D holograms, but that’s the fantasy that draws us to the idea. Even the real thing is cool — have you ever watched them printing a 3D baby?

But blockchain is boring. It is boring like accountancy is boring. An accountant is a woman or a man in a grey suit who does calculations and stores things in files. Blockchain is a database in a grey suit that does calculations and stores things in files. It stores information digitally, but differs from past databases (vague interest aroused?) in that it structures information in discrete “blocks” rather than in tables. These blocks are closed when filled, and linked in a chain that constitutes a secure, shared, distributed ledger. The sequence of blocks is irreversible — each block is given an exact timestamp and a hash (a digital fingerprint, or unique identifier).
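To see just how grey-suited it all is, the whole idea fits in a few lines of code. What follows is a minimal, illustrative sketch in Python; it is a toy example of the structure described above, not any real system’s code, and the “Alice pays Bob” entry is invented for the illustration:

```python
import hashlib
import json
import time

def make_block(data, previous_hash):
    """Create a closed block: a timestamp, the data, and a link to its predecessor."""
    block = {
        "timestamp": time.time(),        # the exact timestamp
        "data": data,                    # the ledger entries stored in this block
        "previous_hash": previous_hash,  # chains this block to the one before it
    }
    # The hash is the block's digital fingerprint: a digest of all its contents.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# A tiny chain: a "genesis" block followed by one more block.
chain = [make_block("genesis", previous_hash="0" * 64)]
chain.append(make_block("Alice pays Bob 5", previous_hash=chain[-1]["hash"]))

# Tampering with an earlier block would change its hash, breaking the link
# stored in every later block -- which is what makes the ledger irreversible.
```

Everything that real blockchains add on top of this (consensus rules, proof-of-work, peer-to-peer networking) is about getting strangers to agree on the ledger, not about making it exciting.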

You see what I mean about blockchain being boring? It has none of the hype and excitement that the robots and stuff do.

Think of the career counsellor in Monty Python trying to encourage someone to become a blockchainer:

Counsellor: Well, blockchain is rather exciting, isn’t it?

Jobseeker: Exciting? No, it’s not. It’s dull. Dull. Dull. My God, it’s dull, it’s so desperately dull and tedious and stuffy and boring and des-per-ate-ly dull.

Counsellor: Well, er, yes. But you see, our experts describe you as an appallingly dull fellow, unimaginative, timid, lacking in initiative, spineless, easily dominated, no sense of humour, tedious company and irrepressibly drab and awful — in work with blockchain, a positive boon.

It is hard to imagine how blockchain could be made more exciting. Could we mobilise the hidden metaphors within it? It brings to mind a toilet bowl — a chain and a blockage, if you get my drift — but no, it can’t be that. Is it about throwing off one’s chains and being free, unblocking one’s potential, so to speak? Maybe.

Nowadays, the 4IR people spice it up with talk about “non-fungible tokens”: the storage, in a blockchain block, of a unique record of an “indivisible” thing like a Bugatti, an artwork or intellectual property. But they don’t get much mileage out of that — attention wanders too quickly to the car, the sculpture or the idea. It wanders away from the 4IR.

So blockchain really is a curious case. It is not clear at all why it is so high up in the queue of “technologies” that prop up 4IR mumbo jumbo.

Ah! My brother, who is an accountant, has just told me why. “The recent explosion of investment in and then collapse of the value of Bitcoin”, he says, “has brought under scrutiny the blockchain architecture that drives cryptocurrencies.”

This sounds good. It sounds like a boring reason to believe that there is a 4IR.

Why, by the way, is the 4IR mumbo jumbo? Why is it not a revolution? To go back to where this article started: the reason is that none of the cool technologies mentioned above is “4IR”. They are all 3IR technologies deeply rooted in the previous century:

  • AI is a field of research that seeks to conceive artificial humans. Its central questions are, “can a machine think?” and “can a machine act like a human being?”. It commenced with the emergence of modern high-speed digital computers in the 1950s, in the work of scientists like Alan Turing and Marvin Minsky;
  • Robotics is concerned with computerised machines that replicate human action. The first digitally programmed robot went to work in a New Jersey factory in 1961. Today, 2.7 million industrial robots operate globally — the process has been one of continual evolution. The first humanoid robot, WABOT-1, appeared in Japan in 1973. ASIMO, the most advanced humanoid, still uses sensor, actuator, bipedal and language-processing technologies with a lineage straight back to WABOT-1;
  • Machine learning refers to the ability of computers to learn and act automatically, as humans do, without explicit programming. The term was coined by Arthur Samuel, who wrote a computer programme to play draughts in the 1950s. In 1997, the IBM computer “Deep Blue” beat the world chess champion;
  • Obviously, the core technology of the IoT is the internet, which appeared in 1969 and “went live” with the Web from 1991. IoT systems also employ analogue/digital converters, invented in the 1960s. The first IoT device was built in the early 1980s, when university techies installed micro-switches in a vending machine so that they could check on cooldrinks from their desks;
  • Cyber-physical systems appear very 21st century — the term was coined in 2006 — but their technology goes back to computational models outlined by Wiener in Cybernetics. These were developed through the 20th century as embedded and hybrid systems in computer programming. In 1969, the Apollo guidance computer ran the first example of a modern, concurrent, embedded computing program. Hybrid systems were widely researched in the 1990s;
  • Big Data storage, and associated analytics, enable a massive coming together of information in extensive, global networks. Big Data for Dummies tells us, “It would be nice to think that each new innovation in data management is a fresh start and disconnected from the past. However, most new stages of data management build on their predecessors. … data management waves over the past five decades have culminated in where we are today: the initiation of the big data era”; and
  • 3D printing is a process where successive layers of plastic are fused together to build solid objects from digital models. The Japanese inventor Hideo Kodama pioneered it in 1980. By the 1990s, fused deposition modelling had been developed; it has been dubbed “desktop 3D printing” because it is the most commonly used form of the technology today.

Blockchain, by the way, also builds on core 3IR components — the Merkle–Damgård hash construction specified in the 1980s, and the Goldwasser–Micali cryptosystem first proposed in 1982.
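To see how old that machinery is, here is a toy sketch in Python of the Merkle–Damgård idea: pad the message, cut it into fixed-size blocks, and fold each block into a running digest. It is a simplified illustration only; real constructions append the message length in the padding and use a dedicated compression function, for which SHA-256 merely stands in here:

```python
import hashlib

def md_style_hash(message: bytes, block_size: int = 64) -> bytes:
    """Toy Merkle-Damgard-style construction (illustration, not a real design)."""
    # Simplified padding: a 1-bit marker, then zeros to a block boundary.
    padded = message + b"\x80" + b"\x00" * (-(len(message) + 1) % block_size)
    digest = b"\x00" * 32  # fixed initial value (IV)
    for i in range(0, len(padded), block_size):
        block = padded[i:i + block_size]
        # Compression step: mix the running digest with the current block.
        digest = hashlib.sha256(digest + block).digest()
    return digest

print(md_style_hash(b"hello blockchain").hex())
```

The point of the chaining is that the final digest depends on every block in order, which is exactly the property the blockchain ledger inherits.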

The conclusion from these accounts of proclaimed “4IR” technologies is clear. None of them is a radical, ground-breaking invention of contemporary times. There simply is no Fourth Industrial Revolution. DM


Comments

  • Michiel Erik Moll says:

    And so a cherished proclamation of many an article, paper and even conference is exposed for what it really is – unwarranted hype. More sophisticated automation and computerisation is merely that – better 3IR. Catchphrases like robotics and AI hide an inability to see that they are not a new revolution but new means of achieving an old one – and not all that new. A little talking robot answering users’ queries? Must be 4IR, a revolution, something to prove how cutting-edge we are and talk about at conferences in a show of 4IR one-upmanship, when this, like so many of the good examples given, is a full-blown 3IR object.

    However, we should not turn away from these ideas and applications just because we are brought to the realisation that they are not hyper-modern 4IR constructs. Let us be pragmatic and celebrate what they can do – just leaving the unwarranted and unnecessary 4IR tag behind.

  • Chris Crozier says:

    The real problem with 4IR is that too many people – marketers, politicians, journalists – latch onto it as the latest buzzword and misuse it to glorify or dramatise often mundane things. Consequently the term gets degraded and rightly earns this sort of scorn. But I think it’s the degraded usage that deserves scorn. There are real 4IR developments budding or on the horizon, but very few of those tagged with it really qualify. The WEF (credited with coining the term) have some concise words on this, which I quote:
    [https://www.weforum.org/agenda/2016/01/what-is-the-fourth-industrial-revolution/]
    “The Fourth Industrial Revolution can be described as the advent of ‘cyber-physical systems’ involving entirely new capabilities for people and machines. While these capabilities are reliant on the technologies and infrastructure of the Third Industrial Revolution, the Fourth Industrial Revolution represents entirely new ways in which technology becomes embedded within societies and even our human bodies. Examples include genome editing, new forms of machine intelligence, breakthrough materials and approaches to governance that rely on cryptographic methods such as the blockchain.
    “As the novelist William Gibson famously said: ‘The future is already here – it’s just not very evenly distributed.’”

  • Hennie Visser says:

    The “Fourth Industrial Revolution” is simply based on the ability of machines to communicate with each other bidirectionally, with the added capability of AI algorithms, which can only run on servers in the cloud since they need massive amounts of data to compute the most likely probability that something might fail or has failed, or that an event will occur which will most likely need human intervention…

    In other words, it is the previously hyped-up acronym M2M (“Machine-to-Machine”), which was purely bidirectional telemetry – now on steroids.

    Nothing more, nothing less.

    However, we need to be careful that we do not unleash this technology and place it in the hands of your plumber, gardener, car mechanic or even accountant or lawyer. Because, with all due respect, if they do not grasp the fact that somewhere in this chain the “data” came from a real-life sensor that measures some real-life physical signal – one which continuously relies on calibration for accuracy, as well as on physical wires that could come loose or deteriorate – it could lead to catastrophes that cause injuries or even death to real-life humans and animals.
