John McCarthy coined the term “artificial intelligence” in 1955. Most of the algorithms behind modern artificial intelligence, including generative AI, such as the perceptron, back-propagation and deep learning, were already in place by the 1980s.
So why did AI only emerge recently, with ChatGPT, if the field is nearly 70 years old and all the algorithms were in place?
It was because we did not have adequate computational power to handle large amounts of data and run large-scale algorithms. The inadequacy of semiconductor chips, in other words, delayed the birth of AI as it existed in the popular imagination.
But we always knew, through Moore’s Law, which states that the number of transistors on a chip, and hence its processing power, doubles roughly every two years, that this moment would come. Yet the crisis of semiconductor chips marches on!
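The doubling in Moore’s Law compounds very quickly, which is why this moment was foreseeable. As a rough illustration, the function and baseline figures below are hypothetical, chosen only to show the arithmetic of repeated doubling:

```python
# Illustrative arithmetic only: project a transistor count under a strict
# reading of Moore's Law, i.e. a clean doubling every two years.

def moores_law_projection(base_transistors: int, base_year: int, target_year: int) -> int:
    """Project the transistor count at target_year, doubling every two years."""
    doublings = (target_year - base_year) // 2
    return base_transistors * 2 ** doublings

# A hypothetical chip with 1 million transistors in 1990 would be projected
# to pass 1 billion by 2010 (ten doublings).
print(moores_law_projection(1_000_000, 1990, 2010))
```

Ten doublings multiply the count more than a thousandfold, which is the exponential growth that eventually made large-scale AI computation feasible.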
Recently there has been intensifying competition over computer chips. The New York Times reported last week about the desperate hunt for computer chips to power AI. The magazine Quartz reported last week that the Nvidia computer chip is now globally the most sought-after computer hardware.
Because of the importance of computer chips in an era in which artificial intelligence is redefining everything around us, there is a global race to acquire them.
For example, according to the Financial Times, Saudi Arabia and the United Arab Emirates are reportedly racing to acquire Nvidia chips. The race for Nvidia’s graphics processing unit (GPU) chips is so intense that it is widely questioned whether the 550,000 GPUs it plans to sell in the rest of this year will be enough, given the accelerating investment in generative AI.
GPUs and parallel processing
Why are GPUs important? To understand the power of GPUs, it is essential first to understand central processing units (CPUs). The CPU is the brain of a computer, responsible for carrying out the instructions given to it by a program. CPUs are designed to handle a wide variety of tasks, such as running Microsoft Excel or Word. The CPU is usually housed on a single chip known as the microprocessor, created through a process known as semiconductor (computer chip) manufacturing.
Initially built to render graphics in computer games, GPUs have been optimised for parallel processing. They are ideal for tasks that can be broken down and processed concurrently, such as rendering images or executing the complex computations in scientific simulations.
GPUs are used in deep learning, large-scale simulations and other applications that benefit from parallel processing, which is essential for large language models (LLMs) such as ChatGPT.
While CPUs handle general-purpose computation and system activities, GPUs excel at specialised jobs that require massive volumes of data to be processed in parallel. Both, however, are critical to the functioning of modern electronic devices.
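The contrast can be sketched in a few lines of code. The snippet below is a toy illustration, not how GPUs are actually programmed: it splits one large job into independent chunks that workers handle concurrently, which is the essence of the data-parallel workloads GPUs excel at. A real GPU applies the same idea with thousands of hardware lanes at once.

```python
# A toy sketch of data parallelism: break a big task into independent
# chunks and hand each chunk to a separate worker. Here the workers are
# CPU threads; a GPU runs the same pattern across thousands of cores.
from concurrent.futures import ThreadPoolExecutor

def square_chunk(chunk):
    # Each worker processes its slice independently -- no coordination needed.
    return [x * x for x in chunk]

def parallel_square(data, workers=4):
    # Split the input into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(square_chunk, chunks)
    # Stitch the per-chunk results back together in order.
    return [x for chunk in results for x in chunk]

print(parallel_square(list(range(8))))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The key property is that the chunks share no data, so adding more workers speeds the job up; tasks with that shape are exactly what GPUs, and the training of LLMs, exploit.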
Computer chips, both CPUs and GPUs, are currently in short supply, which has raised the price of goods that depend on them. Chip scarcity has hit the automotive industry particularly hard, causing production delays and car shortages. It is also affecting the electronics industry, resulting in higher prices for devices such as smartphones and laptop computers.
The medical device industry is affected too: the shortfall has delayed the production of devices such as pacemakers and insulin pumps. The scarcity has also hit the industrial equipment sector, delaying the production of robots and machine tools.
Global reach of chip scarcity
The scarcity dates back to 2018 and stems from various causes, including the surge in cryptocurrency mining, which relies heavily on GPUs. It worsened during Covid-19.
A similar squeeze is now occurring with AI and LLMs. Chip manufacturing is a complex and capital-intensive process, and it is difficult to ramp up production quickly enough to satisfy escalating demand. Geopolitical tensions have also impaired chip supply, as countries prohibit exports to certain other countries.
The worldwide economy has suffered as a result. The scarcity has led to higher prices for electronics and other chip-based goods, and to production disruptions across industries. It has also had a strategic consequence, ushering in the possible fragmentation of the global value chain.
The semiconductor market reflects the intensifying rivalry between the United States and China. The two countries are vying to become the dominant player in the global chip market, and this battle is expected to heat up in the coming years.
The US is concerned about China’s increasing dominance in the chip market, and to address this risk, the US government has taken several initiatives. For example, the Chips and Science Act, passed in the US, provides roughly $280-billion in new funding for science and technology, including support for semiconductor research and development and for new chip manufacturing in the United States.
Furthermore, an executive order on semiconductor supply chain resilience directs the federal government to take steps to lessen the United States’ reliance on foreign chip suppliers while ensuring the stability of the global chip supply chain.
The Chinese government is also investing in chip production and encouraging Chinese enterprises to do the same. China is also making efforts to lessen its reliance on foreign chip suppliers. The semiconductor market rivalry between the United States and China is anticipated to continue in the coming years.
Furthermore, Taiwan Semiconductor Manufacturing Company Limited (TSMC) and South Korea’s Samsung Electronics are key players in global semiconductor manufacturing. This rivalry is a complicated topic with far-reaching consequences.
Balancing the semiconductor market
One issue we would need to address is what this means for the global South, especially Africa. Many of the minerals used in semiconductor chips are indeed from Africa, and therefore Africa is embedded in this fragmentation of the global value chain.
Does the African Union need to take a stand on this fragmentation? A fragmented global value chain will be economically devastating for the African continent, which is far behind in semiconductor technology, despite glimmers of hope in countries such as Kenya and South Africa.
There is currently no international agreement on the governance of semiconductors, though several organisations are attempting to create one. They include the World Trade Organization (WTO), the International Trade Commission (ITC) and the United Nations Conference on Trade and Development (Unctad).
What should such a treaty do? First, it should ensure the global semiconductor market is competitive and equitable. Second, it should prevent anti-competitive practices in that market. Third, it should foster the growth of the semiconductor industry in developing nations. Fourth, it should set standards that preserve interoperability and security. And fifth, it should ensure that semiconductor manufacturing is ethical, secure and competitive, so that these devices are distributed worldwide.
Achieving such a treaty requires balancing commercial, developmental and political considerations. So far, negotiations have been challenging and their outcome remains unclear.
The production of semiconductors can be resource-intensive and potentially damaging to the environment. Each country needs to implement and enforce laws to reduce the environmental damage of chip production.
Chip shortages have highlighted the importance of resilient supply chains. Governments are increasingly considering measures to diversify sourcing and repatriate chip manufacturing to protect national security and economic stability. Yet such measures can damage global trade, fragment the global economy and impoverish many people.
Computer semiconductor governance requires a careful balance of economic policies, security considerations, environmental restrictions, and international collaboration to ensure the industry’s long-term growth, integrity, and resilience.
Therefore, Africa cannot afford to be left behind in this new semiconductor world; it must invest in skills and introduce semiconductor education at its universities.
Furthermore, African universities must form semiconductor-focussed partnerships with overseas universities, entailing the movement of staff and students interested in semiconductors between African institutions and those abroad. DM