It's all happened in plain sight. Social media has changed minds and manipulated millions. If it meddles with elections, it's bad. But when it raises money for an oppressed or threatened country, it's good. That's the crux of the ethical problem: the same technology can be used for good or for bad.
Maybe it's always been the same. Bank robbers can use family saloons as getaway cars. Does that mean we should ban family cars? No, of course not. Is nuclear technology always bad? Not if you save someone's life with it.
You can blackmail people with video, but it can also be a lifeline. Right now, we're seeing it used as evidence of possible war crimes. Everyone's got a video recorder in their pocket. As much as that's a threat to our privacy, it can protect us from overbearing authority.
Social media is undoubtedly a medium, just like radio or TV. But social media, as we have found, is not exactly like the other media, in that it is goal-seeking. Humans might set the goals, but the social content that reaches viewers will often find its own way to them.
Social media may not (yet) be a pure example of artificial intelligence, although it's certainly assisted by it. What it shares with AI is that the unintended consequences of goal-seeking pose the biggest threat to us. Except that there's an even more significant threat: when malign actors notice "unintended" consequences that might be useful to their own cause and start to exploit them, refining and amplifying them in the process.
Recent documentaries have made explicit what many have suspected for a long time: that the primary purpose of social media, from a business point of view, is to change people's minds: to alter outcomes, to influence decisions. That's almost the general case for any form of marketing. But social media goes beyond that. It can track your activity, and when it spots a trend, even (or especially) a nascent one, it will try to reinforce it. And it doesn't matter whether you've changed your behaviour because you love something or because you hate it: it's the strength of the response that matters. Either you want to see more of the content that reinforces your opinions, or you want to get angry at content that opposes them. Either way, there's the sort of active engagement, complete with detailed metrics, that conventional marketers dream about.
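To make that concrete, here's a minimal, purely illustrative sketch of how a feed that optimises for engagement strength might rank posts. The names (`Interaction`, `engagement_score`) and the scoring rule are my own assumptions, not any platform's actual system, which is proprietary and far more complex; the point it illustrates is simply that outrage and enthusiasm count the same.

```python
# Hypothetical sketch: an engagement-driven ranking where only the
# *strength* of the reaction matters, not whether it's love or hate.

from dataclasses import dataclass

@dataclass
class Interaction:
    post_id: str
    sentiment: float   # +1.0 "love it", -1.0 "hate it", 0.0 ignored
    dwell_time: float  # seconds spent on the post

def engagement_score(interactions: list[Interaction]) -> dict[str, float]:
    """Score posts by raw engagement: |sentiment| * dwell time, summed."""
    scores: dict[str, float] = {}
    for i in interactions:
        # abs() is the crux: a furious reaction weighs as much as a delighted one.
        scores[i.post_id] = scores.get(i.post_id, 0.0) + abs(i.sentiment) * i.dwell_time
    return scores

history = [
    Interaction("kitten_video", +1.0, 12.0),   # loved it
    Interaction("outrage_bait", -1.0, 45.0),   # hated it, but lingered
    Interaction("news_item", 0.0, 3.0),        # scrolled past
]

# The feed pushes more of whatever scored highest, here the post the
# user hated most but engaged with longest.
ranking = sorted(engagement_score(history).items(), key=lambda kv: kv[1], reverse=True)
print(ranking)  # [('outrage_bait', 45.0), ('kitten_video', 12.0), ('news_item', 0.0)]
```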
But surely no one would use this to influence an election? We now know that isn't true. In the wrong hands, social media is a weapons-grade influencer, for good or bad.
Recent world events have shown another face of video technology in general and social media in particular: it can help.
The war in Ukraine is the first conflict - and certainly the biggest - to be streamed live. The president of Ukraine may be new to politics, but he's media-savvy. Every time he pops up in a live stream, he speaks directly to the Ukrainian population or to other governments. It's powerful. There's never been a time when a leader under siege has had more influence than they ever could in peacetime.
So is there a balance between the good and the evil potential of technology? Maybe it all cancels out?
I don't think so. It looks like the opposite is the case.
Social media is a "meta broadcaster". It soaks up virtually every type of video content, tags it, sorts it, and pushes it to any user with a statistically significant pattern of behaviour. Nobody outside the social media companies knows precisely how it works (even if they used to work there: it's probably changed). But you can see it everywhere. In the early days of social media, I lost my job when the company I worked for ran out of money. I wrote to my colleague, saying, "it looks like we're for the axe". For weeks after that, I was inundated with advertisements about tools for chopping firewood. Today, it's infinitely more subtle. Sometimes I learn stuff about myself from Facebook's content choices.
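The firewood anecdote hints at how crude early keyword targeting could be. Below is a deliberately naive sketch of literal keyword matching that makes exactly that mistake; the trigger words and categories are invented for illustration and don't represent any platform's real ad pipeline, which, as noted above, is now infinitely more subtle.

```python
# Illustrative only: a crude keyword-to-advert matcher of the kind the
# "for the axe" anecdote suggests. All trigger words and categories are made up.

AD_CATEGORIES = {
    "axe": "tools for chopping firewood",
    "holiday": "package holidays",
    "mortgage": "remortgage deals",
}

def match_adverts(message: str) -> list[str]:
    """Return advert categories whose trigger word appears in the message."""
    words = {w.strip('.,!?"').lower() for w in message.split()}
    return [category for keyword, category in AD_CATEGORIES.items() if keyword in words]

# The idiom "we're for the axe" (about to lose our jobs) trips the literal
# keyword, so the user gets weeks of firewood-tool adverts.
print(match_adverts("It looks like we're for the axe"))
# ['tools for chopping firewood']
```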
If you're clever, you can exploit social media's openness to integration and automation. For example, you can flood it with misleading content. Or you can just plain lie.
But there's a bigger problem than that. It's that more and more of us know less and less about how today's technology works. We don't know what goes on behind the scenes. Most of the time, the back-end is a mystery to us. So while I remain optimistic that technology will ultimately give us better lives - and in countless ways it already has - I'm also frankly concerned that it will manipulate us. How? Through a spectrum of methods, from modifying our opinions to ruining our ability to function in society. Google "social credit", and you'll see how far this has already gone.
Meanwhile, don't get paranoid. Just make sure you know how stuff works.