The first week of parliament for the new year was a noisy one. Cory Bernardi’s decision to leave the Liberal Party has created a flurry of speculation about whether this is the “splitting” of the political right or an indication of the end of business-as-usual politics.
There is much media and public interest in the dazzling party machinations at hand. But what may be of greater significance, yet is receiving less attention, is the deployment of big data technologies to manipulate political margins to world-changing effect. This may have been the case in Brexit and Trump’s rise, and needs to be considered in terms of the current political movements in Australia.
To an extent, none of the factors at play are new. The playing of political margins through demographics and targeted political messaging is a strategy as old as the hills. Election after election sees voters in marginal seats massaged with promises.
Those promises are tailor-made according to assumptions about voter concerns based on demographics, focus groups and think tanks. There is also careful consideration of how far particular local concerns can be pandered to without losing the support of the larger voting public.
Likewise, there is nothing new in using data to work out which issues to play where, and how hard. What seems to be new, and creating the monumental events that we have witnessed, is the type of data being used, and how it is being used. Recent developments suggest that the combination of “big data” and voter disillusionment can be stirred into a pretty potent brew.
It goes something like this:
It has become almost hackneyed to say that we live in a digital world. Most individuals who are frequently online leave enormous trails of data about themselves.
Individuals’ awareness of the size and scope of their so-called “digital footprint” varies immensely. But think about all those likes on Facebook, YouTube, Instagram and so on; all the retweets and shares. Then add in all the personal and demographic data from the user profiles on those platforms. Then for good measure, there are all the credit cards, loyalty cards and apps, the places and stores where you’ve checked in, the amassed purchase histories and Google searches that are logged to your oh-so-convenient multi-platform user profiles.
The scale of this starts to get a bit unsettling, and these are just some of the databases that fall under the umbrella term of big data. It is an inconceivably vast mass of information, which at first glance would seem a giant mess; just white noise.
Unless you know how to decipher it.
According to a story first published in Zurich-based Das Magazin in December and more recently taken up by Motherboard, events such as Brexit and Trump’s ascendancy may have been made possible through just such deciphering. The argument is that technology combining psychological profiling and data analysis may have played a pivotal part in exploiting unconscious bias at the individual voter level. The theory is this was used in the recent US election to increase or suppress votes to benefit particular candidates in crucial locations. It is claimed that the company behind this may be active in numerous countries.
The technology at play is based on the integration of a model of psychological profiling known as OCEAN. This uses the details contained within individuals’ digital footprints to create user-specific profiles. These map to the level of the individual, identifiable voter, who can then be manipulated by exploiting beliefs, preferences and biases that they might not even be aware of, but which their data has revealed about them in glorious detail.
As well as enabling the creation of tailored media content, this can also be used to create scripts of relevant talking points for campaign doorknockers to focus on, according to the address and identity of the householder to whom they are speaking.
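The mechanics described above can be sketched in a few lines of code. The snippet below is a toy illustration only: the trait weights, liked pages and message variants are all invented for this example, and real profiling systems are trained on vast datasets rather than hand-written tables. It simply shows the shape of the idea, scoring a person's "likes" against OCEAN traits, then serving the message variant aimed at their dominant trait.

```python
# Hypothetical mapping from liked pages to OCEAN trait weights
# (O=openness, C=conscientiousness, E=extraversion, A=agreeableness, N=neuroticism).
TRAIT_WEIGHTS = {
    "modern_art":      {"O": 0.8, "E": 0.1},
    "budget_planning": {"C": 0.9},
    "music_festivals": {"E": 0.7, "O": 0.3},
    "volunteering":    {"A": 0.8},
    "news_alerts":     {"N": 0.5, "C": 0.2},
}

# Hypothetical campaign message variants keyed by dominant trait.
MESSAGES = {
    "O": "Imagine the possibilities of a fresh approach...",
    "C": "A fully costed, responsible plan...",
    "E": "Join thousands of your neighbours at our rally...",
    "A": "Policies that look after every family...",
    "N": "Protect what you've worked so hard for...",
}

def profile(likes):
    """Sum trait weights over a user's likes into an OCEAN score dict."""
    scores = {trait: 0.0 for trait in "OCEAN"}
    for like in likes:
        for trait, weight in TRAIT_WEIGHTS.get(like, {}).items():
            scores[trait] += weight
    return scores

def tailored_message(likes):
    """Return the message variant targeting the user's dominant trait."""
    scores = profile(likes)
    dominant = max(scores, key=scores.get)
    return MESSAGES[dominant]
```

A voter whose data shows likes for "budget_planning" and "news_alerts" would score highest on conscientiousness and so receive the "fully costed, responsible plan" variant, while a festival-goer would see the rally invitation. The same logic extends naturally to the doorknocker scripts mentioned above: look up the householder's profile, then print the talking points for their dominant trait.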
This goes well beyond the scope and detail of previous campaign strategies. If the theory about the role of these techniques is correct, it signals a new landscape of political strategising. An active researcher in the field, writing about the company behind this technology (whose services Trump paid for during his election campaign), described the potential scale of such technologies:
Marketers have long tailored their placement of advertisements based on their target group, for example by placing ads aimed at conservative consumers in magazines read by conservative audiences. What is new about the psychological targeting methods implemented by Cambridge Analytica, however, is their precision and scale. According to CEO Alexander Nix, the company holds detailed psycho-demographic profiles of more than 220 million US citizens and used over 175,000 different ad messages to meet the unique motivations of their recipients.
So what does this mean for Australian politics?
If the claim is true, it means that voter disillusionment with mainstream political parties, and fears about a range of societal and economic issues, have been carefully identified, mapped and nurtured to bring people’s hidden, unconscious biases into political play. This includes those subconscious leanings that we generally know better than to act upon, but which our data tracks minutely and which, in the right hands, can be deployed to shocking effect.
Australia may not be beyond the reach of this. The company behind the technology claims to have fielded enquiries from Australia, so we may already be being played.
Bernardi is being portrayed as a renegade and the far right as marginal. But Brexit and Trump show us how easily and finely the numbers can be played and how we can be fooled into thinking our desires will be met. While we might think we are above all that and that business-as-usual politics will inevitably moderate the extremes, our data silently amasses the tools for us to be played by whoever can buy and decipher its code.
Ultimately, whether this is actually steering elections will be impossible to determine until research is undertaken. However, the circumstantial evidence suggests that a combination of big data, behavioural science and targeted advertising helped enable Brexit and Trump’s rise.
Regardless, as Sandra Matz argues, the bigger question is whether such fine-grained analysis and manipulation can be steered in more ethical and transparent directions, in ways that engage and inform the public. There is also then the question of whether regulators have the will and capacity to make it so.
The technology is clearly among us, to an unknown extent and to unknown ends. But that extent and those ends do not need to remain unknown.