It looks increasingly like we're in the midst of a 21st-century industrial revolution. And it's not just about AI.
If you didn't already know what "exponential" meant before the pandemic, you almost certainly do now. It's what happens when things start to multiply. When they do, and especially if they keep doing it, the numbers grow very quickly indeed. The word "exponential" comes from "exponent": the little superscript number that tells you whether a value is squared, cubed or raised to some other power.
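A tiny worked example makes the point. Here's a minimal Python sketch (the numbers are arbitrary, chosen only to show the shape of the curves) comparing steady, additive growth with exponential growth:

```python
# Linear growth (adding 2 each step) versus exponential growth
# (multiplying by 2 each step, i.e. 2**n).
for n in range(0, 31, 10):
    print(n, 2 * n, 2 ** n)

# At n = 30, addition has reached only 60, while 2**30 is already
# over a billion (1,073,741,824).
```

For the first few steps the two columns look unremarkable, which is exactly why exponential growth is so easy to miss early on.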
The curious thing about exponentiality is that it's almost invisible in the early stages, yet it's at work almost everywhere you look. Humour me while I tell you about my friend's great-great-grandfather, a chair-leg maker. Her great-grandfather was also a chair-leg maker. They lived in a part of the English countryside renowned for its woodworking craft. My friend's grandfather was a furniture maker. Her father was a woodwork teacher. She is a psychotherapist and studied philosophy at university with me. And her children might go to Mars.
You can see what's happening there. It starts slowly, over generations, and then, smoothly and often imperceptibly, it speeds up. Then, suddenly, you're unable to predict what will happen next.
Anyone interested in electronics and computing will have seen the "Moore's Law" effect of shrinking component sizes. The phenomenon was remarkably consistent for fifty years, and while the trend is less marked now, that hardly matters: other drivers of progress have taken over from it.
What things? Software almost has a life of its own. Even if hardware development stopped entirely, software could still keep improving. A single breakthrough in software can open new avenues that spawn new types of code. What's clear is that progress compounds when several strands of research advance simultaneously.
Today's technology headliner is AI. It, too, appears to be showing exponential growth. But there's much more to it than that. Reports asserting simple exponential growth are easy to find, but what AI is doing right now is more than merely exponential: it's compound exponential, meaning that multiple exponential phenomena are spawning new AI models and techniques at a rate beyond simple exponentiality.
This is both an opportunity and a threat. Let's look at the threat first.
Straightforward, classic exponentiality is also a potential threat, because when the rate of progress approaches vertical, it becomes impossible to predict what's coming next - even for experts. A tiny increment on the time axis means an incomprehensibly colossal leap on the progress axis. If you can't predict what's happening, a competitor (or, in war, an enemy) need only be slightly ahead of you in time for their technology to be entirely beyond your comprehension, with little chance of ever catching up. Compound exponentiality is like this but is itself subject to a multiplying effect. A very likely outcome is chaos and confusion.
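The difference between the two kinds of growth can be sketched in a few lines of Python. This is only an illustration, not a model of AI progress: the rates are arbitrary, and "compound exponential" is represented here as growth whose multiplier itself keeps growing:

```python
def simple_exponential(steps, rate=2.0):
    """Plain exponential growth: the value multiplies by a fixed
    rate at every step."""
    value = 1.0
    for _ in range(steps):
        value *= rate
    return value


def compound_exponential(steps, rate=2.0, acceleration=1.1):
    """Compound growth: the multiplier itself grows each step,
    so this eventually outruns any fixed-rate exponential."""
    value = 1.0
    for _ in range(steps):
        value *= rate
        rate *= acceleration  # the rate of progress is itself progressing
    return value


for n in (5, 10, 20):
    print(n, simple_exponential(n), compound_exponential(n))
```

Run it and the gap between the two columns widens at every step - which is the point: a small head start on the time axis translates into an incomprehensible lead on the progress axis.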
The opportunity comes from understanding the trends and looking in the direction of progress. That's likely challenging when it comes to AI.
Why is AI suddenly behaving like this?
Partly because, now that the results of AI are becoming so impressive, many more people are working in the field. It's always encouraging when an experimental technique starts yielding useful results.
But it's also because of the very special nature of AI, which is that it has the potential to learn from itself.
People talk about "AI algorithms" as if they were identical to algorithms written in conventional code. They're not. They have the capacity to learn from training data, but also from their own results and those of other AI software. A single instance of ChatGPT has remarkable abilities. But what happens when one AI model starts talking to, or interrogating, another? The signs are that what emerges is often unexpected. Only this week, there have been reports of LLMs (Large Language Models) developing a "theory of mind". The term is a bit of a curveball. If you can reason about what another person is thinking, then you must possess the concept of someone other than you having a mind of their own. Having that concept is called having a theory of mind, and it's a crucial one as AI develops.
If indeed some LLMs have started to show this characteristic, it could be extraordinarily important because a theory of mind is necessary to "understand" other people and the world they live in. This ability seems to have emerged spontaneously.
But while this is remarkable in itself, what's even more astonishing is that when you put AI models together so that they can exchange data, it seems likely that they will learn from each other. Arguably, if they all "spoke" English (or some other common language) together, they could exchange even the most abstract concepts. What would emerge from that is anyone's guess.
So, what is the new industrial revolution? Is it AI? Few would disagree with that. But it might be more accurate to say that the new revolution is simply the rate of change itself. We've never lived through a period in which radical, significant breakthroughs emerge almost daily. To say that it will change everything is probably an understatement. Every morning we wake up, the world isn't merely different. It's more different than on every day before it. If that's not revolutionary, then I don't know what is.