Phil Rhodes on why it might be a while yet before we're all stacking shelves at our local supermarket.
There are probably three main reasons not to worry – yet – about the AI that’s coming for your job. First among them is that there’s absolutely nothing anyone can do about it. The second and third, though, are some realities which are unfashionable, inevitable, and mean it might be a bit longer than most people think before a whirring, bleeping box of silicon starts to force you out of your seat.
The first of those is something that most people are aware of: AI hasn’t yet replaced every musician, artist and (gulp) writer in the world because right now, it isn’t good enough to do so. Recent examples have infamously included shots of electoral hopefuls surrounded by potential voters, which would have worked quite well if some of those voters didn’t have somewhere between three and six neatly-braided fingers and hats labelled in Martian.
It’s always slightly distressing to learn that these pictures, which are invariably afflicted with a sort of uncanny pod-people glossiness, are convincing to anyone. As ever, the simulation of reality doesn’t have to be perfect. The simulation of reality achieved by movies isn’t perfect even now, at least not in any reality in which two people outdoors at night can both have the moon behind them while facing one another.
AI can increasingly draw a convincing picture, so long as people don’t look too closely at any of the text or start counting nostrils, but asking it to draw a sequence of five, fifteen or fifty frames illustrating a story, with consistent characters and environments, is still a lot of work. In much the same way, it’s easy to find examples in which AIs are willing to say some extremely dubious things, such as the recent controversy involving Google’s Gemini AI and a pointed question about end-of-the-world nuclear holocaust.
Progress in this area has been rapid, and it’s the jet-propelled (or rather, Nvidia-propelled) improvement of the last few years which has raised concerns for anyone who makes a living in the information economy. As recently as a year ago, it wasn’t hard to find people very willing to predict mass unemployment and a Terminator-style AI apocalypse in – well – less than a year. It didn’t happen, mainly because AI didn’t get good enough fast enough.
Will it happen? Presumably, although you can answer “presumably” to questions about the existence of alien life. But that leads us on to the second problem: the better AI gets, and the bigger a job it becomes capable of tackling, the more resources it requires.
The resource-hungry monster
AI uses a significant amount of power and resources. Pic: Shutterstock
Similar concerns have provoked frowns in more or less every branch of technology since regular rises in per-core CPU performance tailed off in the mid-2000s. AI does well now because it demands precisely the kind of computing power that early twenty-first century hardware is very good at – the vector processing, parallel computing, single-instruction-multiple-data stuff adored by neural networks. Modern hardware happens to do those things incredibly well, if only because it has been forced to prioritise them by fundamental limits on the things we used to value.
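To make that concrete, here is a minimal sketch (in Python with NumPy, not anything from a real AI system) of why neural networks and vector hardware get along so well: a single network layer is, at its core, one big matrix multiplication, which parallel hardware eats for breakfast, while the equivalent one-number-at-a-time loop is exactly the sort of work it no longer rewards.

```python
# Illustrative sketch only: a toy neural-network layer expressed two ways.
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.standard_normal((1, 512))       # one input vector
weights = rng.standard_normal((512, 512))    # layer weights
bias = rng.standard_normal(512)

# Vectorised form: one matrix multiply, the SIMD/parallel-friendly shape of work.
activations = np.maximum(inputs @ weights + bias, 0.0)   # ReLU layer

# Scalar form: the same arithmetic done one multiply at a time, the kind of
# serial, per-core work whose performance stopped improving quickly years ago.
slow = np.zeros(512)
for j in range(512):
    total = bias[j]
    for i in range(512):
        total += inputs[0, i] * weights[i, j]
    slow[j] = max(total, 0.0)
```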
Big neural networks are a big problem in that context, and seem more than happy to use up all of the world’s GPU manufacturing capacity, and a good deal of its power generation too.
As such, a second stagnation of hardware performance is an alarming prospect. Yes, Nvidia is still making a lot of money on its 4000-series devices, but they’re often more expensive than the rest of a modern workstation put together. It’s also long been the case that a modern workstation is essentially a life-support system for its GPU, which tends to consume more power than anything else in the box and therefore needs more cooling. Both money and power are limits we may not quite have reached yet, but we’re not far off.
These concerns also attend several technologies which have recently been proposed as the future of hardware, particularly programmable logic. Intel’s interest in this was such that it actually acquired FPGA specialist Altera in 2015. FPGAs in PCs might allow people to configure very fast, very capable hardware more or less by writing code. The approach works. It’s also expensive and power-hungry, and Intel spun Altera back off again at the end of last year. The company will continue, but the idea of programmable logic becoming a mainstream adjunct to Intel’s existing CPUs seems to have been put aside for reasons of sheer practicality.
That doesn’t necessarily have much to do with AI directly, although FPGAs can certainly be used to implement vector processors. What it suggests is that, no matter how we build them, data centres – already hyper-aware of power and cooling requirements – might struggle to provide sufficient resources for AI to put every professional creative out of business in the next few years. Even the sort of AI artwork we’ve already seen has been characterised in terms of kilowatt-hours of energy per picture. Practicality steps in.
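A quick back-of-envelope sketch shows why energy-per-picture matters once you start thinking in moving images rather than stills. Every number below is an illustrative assumption, not a figure from the article; the point is only that per-frame costs multiply quickly.

```python
# Hypothetical arithmetic: what a per-frame energy cost implies for a feature film.
ASSUMED_KWH_PER_FRAME = 1.0   # assumed, for illustration only
FPS = 24                      # standard cinema frame rate
RUNTIME_MINUTES = 90          # a typical feature length

frames = FPS * 60 * RUNTIME_MINUTES
total_kwh = frames * ASSUMED_KWH_PER_FRAME
print(f"{frames:,} frames -> roughly {total_kwh:,.0f} kWh per feature")
# 129,600 frames -> roughly 129,600 kWh, before retakes, revisions or upscaling.
```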
Those problems are potentially solvable, either via smarter AI code or – more probably – via the enormous R&D effort available to computing in general, which has given us so much performance for so little power and money in the last decade or three. Even so, there is another reason to think AI may not take over the world in quite the way that’s been proposed.
The Nolan Effect...
We might call it the Nolan Effect. People are willing to fund Christopher Nolan to create his zero-CGI movies on the basis… well, that people want zero-CGI movies to exist. We already do a large number of things in film and television that don’t make financial sense. Whether sheer audience appetite for reality is enough to save any of us from the AI onslaught remains to be seen; the answer, as so often, is likely to be a messy mixture of economics and that appetite. Still, it’s been a while now since the doomsayers started predicting the end of artistry in general, and it hasn’t happened yet.