The best way to save the planet is to persuade governments to act responsibly. But we can all do our bit. And our choice of technology can play an important part in the future.
The first thing I want to say is that I'm not an expert on climate change. I suspect that many of the protesters on the streets of London this Easter weekend know a lot more about it than I do. When climate change activists are gluing themselves to trains, you have to admire their determination to get the attention of the media.
They certainly did that. And it left us sympathising, on the one hand, with the holidaymakers whose plans were disrupted, and wondering, on the other, how the planet can possibly survive in a habitable form while governments do essentially nothing about it - or even actively promote planet-poisoning activities.
Only recently the BBC changed its policy of "balance" between suppliers of climate change facts and, well, purveyors of unscientific drivel. The broadcaster has decided that climate change is now proven. You can't have an opinion about a fact. So there's no need for balance, because it's meaningless in the face of established, justified and proven knowledge.
So by now, you've probably worked out that I think man-made climate change is a thing. It will only cease to be a fact either when decisive action averts it, or when radical new scientific evidence shows either that the climate isn't changing or that some factor other than human activity is causing it. (That's how science works, and it's what distinguishes scientific statements from dogma.)
Not all ecology is about climate change. Giant turtles being strangled by plastic beer-can rings would still happen without rising global temperatures. But the energy used to make that plastic (and to fabricate the beer cans and transport them across the world) still probably came from burning fossil fuels, which releases carbon dioxide and therefore contributes to global warming.
I think most people are aware of climate change. Very few are indifferent to it. But as an individual, it's very hard to know what to do about it.
One thing we can do is buy less and use less. But when technology is changing so fast, that seems like an impossibility. And it would be disastrous for manufacturers. Until we live in a society that doesn't need money, and where we work only to improve ourselves and the lives of others, we're going to need manufacturers. It won't help them if we stop buying their products.
So, it might be that they have to help themselves. Here's how they can do it. Let's take camera makers as an example that we can all relate to, but this doesn't apply only to them.
My theory is that manufacturers should make more expensive cameras. This is not to deter us from buying them: quite the opposite. It's to make them more upgradable.
There's not much new about the idea of upgradability. I remember going to the launch of a computer - decades ago - that was designed to be completely upgradable. Rather rashly, the company claimed you'd never need another computer. For the early 1990s, that was certainly bold. With a certain inevitability, the company folded shortly afterwards. Some hardware is intrinsically upgradable, but ultimately almost the sole holder of that crown is the nineteen-inch rack.
(I'm being a bit harsh here. There are many other examples, but the point is that technologies tend to outgrow the frameworks designed to encapsulate them very quickly.)
But software is immensely upgradable, especially if it has the space to grow.
So, here's what I'm thinking.
If we had cameras with upgradable software at their core, they could be improved even after you've bought them.
And indeed many of today's cameras do have some kind of upgradability, especially the ones with FPGAs at their centre. These chips - Field Programmable Gate Arrays - are immensely powerful and fast arrays of logic gates that can be configured at boot-up into bespoke processing units. Imagine a software algorithm running on hardware that's designed ONLY to run that specific code. FPGAs are like that, but they're reconfigurable too.
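To make that more concrete, here's a minimal sketch - written in Python as a toy model, not taken from any real camera's firmware, and with every name in it (Bitstream, SimulatedFPGA, boot) invented purely for illustration. What it shows is the general idea: the processing "hardware" is defined by a configuration loaded at boot, so a firmware update can hand the same chip an entirely new job.

```python
# Purely illustrative: a toy model of FPGA-style upgradability, not real firmware.
from dataclasses import dataclass

@dataclass
class Bitstream:
    """Stands in for the FPGA configuration shipped with a firmware update."""
    name: str
    version: str
    features: tuple  # e.g. ("debayer", "noise_reduction", "4k_raw")

class SimulatedFPGA:
    """The chip does nothing until configured; then it 'becomes' the pipeline."""
    def __init__(self):
        self.pipeline = None

    def configure(self, bitstream: Bitstream):
        # On real hardware this happens at boot-up; here we just record the result.
        self.pipeline = bitstream
        print(f"FPGA configured as '{bitstream.name}' v{bitstream.version}: "
              f"{', '.join(bitstream.features)}")

def boot(fpga: SimulatedFPGA, firmware: dict):
    # The camera's boot code loads whichever bitstream the installed firmware ships.
    fpga.configure(firmware["bitstream"])

# Year one: the camera ships with a basic pipeline.
camera_fpga = SimulatedFPGA()
boot(camera_fpga, {"bitstream": Bitstream("basic_pipeline", "1.0",
                                          ("debayer", "hd_recording"))})

# Year three: a firmware update swaps in a new configuration - same silicon,
# new capabilities, no new camera.
boot(camera_fpga, {"bitstream": Bitstream("enhanced_pipeline", "3.2",
                                          ("debayer", "noise_reduction", "4k_raw"))})
```

Run it and the same simulated chip reports a different set of capabilities after the "update" - which, in spirit, is the trick that keeps FPGA-based products current.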
It seems too good to be true. But these adaptable chips are at the heart of most professional equipment. They allow extra features and more refined processing to be added long after the product has been sold, and they extend the lifespan of a product by years. At least one device, the Odyssey recorder/monitor from Convergent Design, has stayed current for six years - with almost continuous upgrades. It's now an almost completely different device. And this was only possible because it had a forward-looking specification (including an OLED screen) and an over-specified FPGA. It also had the internal bandwidth to deal with multiple video streams.
This kind of over-capacity is a very good thing in the face of climate change. It keeps the current hardware useful for a long time, and it keeps the customer happy, because the product just keeps getting better. It also means that manufacturers can spend more time developing radically new technology, instead of shouldering the burden of releasing a new physical product every year.
This, I think, could be the paradigm for cameras and imaging technology in the future, especially as computational optics continue to get better.
You'd have to pay more, but if the cost could be spread, there would be massive advantages. Foremost among them is that you wouldn't have to throw away your kit every time there's an advance in processing. What's more, you wouldn't have to learn completely new menus.
And don't forget that manufacturers can charge for big upgrades. Why shouldn't they? And that will finance their next products, which themselves will last years longer.
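To see why spreading the cost needn't be painful, here's a back-of-the-envelope comparison in Python. The prices and upgrade fees are entirely made up - placeholders, not anyone's actual pricing - but the shape of the sum is what matters.

```python
# Hypothetical numbers only: comparing "replace every two years" with
# "buy a dearer upgradable body once and pay for occasional upgrades".

YEARS = 6

# Scenario A: a £2,000 camera replaced every two years.
replacement_price = 2000
bodies_built = YEARS // 2                     # three bodies over six years
cost_replace = replacement_price * bodies_built

# Scenario B: a £3,000 upgradable camera kept for all six years,
# with two paid feature upgrades at £300 each along the way.
upgradable_price = 3000
paid_upgrades = 2 * 300
cost_upgrade = upgradable_price + paid_upgrades

print(f"Replace every 2 years: £{cost_replace}, {bodies_built} bodies manufactured")
print(f"Upgradable for 6 years: £{cost_upgrade}, 1 body manufactured")
```

On these made-up figures the upgradable camera costs less over six years, and only one body ever gets manufactured - the manufacturer still gets paid, just for software rather than for another round of tooling and shipping.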
Ultimately, I hope that this way of thinking about products will spread to consumer goods like smartphones. We're surely past the point at which - except for professional purposes - smartphones became good enough. They have high-resolution screens, processors as fast as those in most laptops, enough bandwidth to stream movies, and they work pretty well as telephones too. If we paid more for software upgradability, maybe we could keep them for five or six years instead of one or two. And maybe smartphone manufacturers should devote some of their R&D budgets to making a battery that lasts seven years.
There's much more to this than I can cover in this article. But just think about it. Does this sound like a good approach to you?