Slave to the algorithm.
As human beings, we like to think we’re in control. That somehow we’re in charge of our destiny, where we go, what we see, what we do and who we meet. This is, of course, complete bollocks as our lives are now run by something called The Algorithm. It was designed to serve up appropriate advertising and reduce marketing wastage, but as Moore’s Law doubles the capacity of computing power roughly every two years and artificial intelligence edges towards the point where machines outstrip human intelligence, known somewhat scarily as ‘The Singularity,’ its capacity to process raw data and predict our behaviour could now mean it will anoint the next President of the United States. Take my hand as we go down the rabbit hole, because this shit’s getting real.
Let’s go back five years. The King’s Speech has won Best Film and Borders has filed for Chapter 11 in the US. Like most great bookshops, Borders was a microcosm of the world – a celebration of the written word in its myriad forms. Its vast range meant your purchases were often beautifully random. But beautifully random doesn’t help the vast marketing machine, so it had to be replaced by something far more quantifiable. The mighty Amazon soon taught us that browsing was a waste of our precious, precious time and that choices should be based on what we liked yesterday and the day before, not on some beautiful connection of synapses, memory and emotion. The Algorithm was watching.
Facebook’s version supposedly has well over 147 markers that make you, you. This digital DNA is constantly growing. It knows the bands you like, the politics you hate. It reads your messages. It knows when you’re having a baby. And if you’ve ever wondered how your suggested friends seem eerily close to the strangers you met at that function, you’ll witness the first evidence that our Algorithms are now talking to each other.
When you take away the dystopian horror of what’s going on, what it does is ‘kinda neat.’ We go to work with The Algorithm’s take on what we want to hear on Spotify. Our news feeds are populated by stories that just seem, well, interesting, and at night The Algorithm serves up a quality line of entertainment on Netflix. We don’t have to search. We won’t be bothered by movies we might not want to see or by opinions we may not agree with. Even friends are categorised and prioritised until some just fade away and become – to paraphrase ‘Stand By Me’ – just more faces in the halls.
We live in a comfortable world of our own making, fed a constant diet of entertainment and news mathematically designed to mirror our own worldview – no matter how extreme that happens to be. Psychologists would describe this as a potentially dangerous mix of confirmation bias and positive reinforcement, where one’s opinions are constantly validated and treated as ‘evidence’ supporting an already held belief system.
In our hunger to reduce marketing wastage and serve up quality targeted advertising, we’ve created personalised digital bubbles in which alternative viewpoints are unwelcome. The Algorithm is now our teacher. Our friend. Our bodyguard.
The implications of this are largely (and terrifyingly) unknown, but one example might be the seemingly impossible rise of Donald Trump. Is he a true reflection of the dark underbelly of American society, or is it just that his supporters can’t see an opposing worldview? After all, if you’re fed the constant and unrelenting message that ‘America is broken’ on your social feeds, maybe that’s why you feel the need to make it ‘great’ again.
Do technology companies have a responsibility to show all sides of political debate, or will it take a xenophobic madman becoming commander in chief of the world’s most powerful military before it becomes a necessity?
Trump hasn’t ruled out using tactical nuclear weapons, but even if he does smash the nuclear button with his famously tiny fist, The Algorithm will be just fine, buried deep underground in servers capable of surviving a direct hit.
Everything will be just fine.
This article originally appeared in Adnews.