Regular reports of data collection by poll campaigners, lobbies, political parties and interest groups should worry us. Terribly
By Pramod K Nayar
The Digital Personal Data Protection Bill has raised the hackles of privacy activists and legal experts. Not only does it give the state unlimited powers to collect and retain data even after it has been used; it also proposes that consent may be ‘deemed’ in certain cases involving ‘public interest’, while broadening the very ambit of ‘public interest’. Under the new draft of the Bill, even employers can process sensitive personal information — such as religious identity or sexual orientation — without the express consent of the individual.
Object lessons in how data was employed by the now-infamous Cambridge Analytica should not be forgotten as the state assumes the features not only of totalitarianism, but a totalitarianism reliant on data.
The Analytica Manoeuvre
Cambridge Analytica was involved in the US presidential race in 2016, and what Analytica did then should ring warning bells.
It produced a nationwide voter-profile dashboard that segmented voters state by state and calculated how many thousand voters needed to be pushed a little to the right, or simply away from the left, in order to change voting patterns in favour of Ted Cruz, then the only serious rival to Trump, based on what it called ‘partisanship value’.
The data compiled was based on the OCEAN model: O = openness, or how open you are to new experiences; C = conscientiousness, or whether you prefer order and planning; E = extroversion, or your degree of sociality; A = agreeableness, or whether you put other people’s needs ahead of your own; and N = neuroticism, or whether you are a worrier.
In a now well-known presentation, ‘The Power of Big Data and Psychographics in the Electoral Process’ (2016, now available on YouTube), Alexander Nix, Analytica’s CEO, made the role of data abundantly clear. He termed OCEAN ‘the cutting edge in experimental psychology’. He showed images of the ‘universal female personality’ and, side by side, the ‘universal African American personality’ — generalisations that he dismissed as ineffective. What was needed, he said, was psychological profiling, so that individuals could be classified as ‘agreeable’, ‘conscientious’ or ‘neurotic’.
Personality, Nix said, drives behaviour and behaviour determines how you vote. By getting ‘hundreds and hundreds of Americans to take part in the survey’, said Nix, Analytica was able to build a model to ‘predict the personality of every single adult in the United States’ (later in the presentation he declares that the company has 4,000-5,000 ‘datapoints on every individual in the United States’). He elaborated the model with some examples.
To the neurotic individual, he said, one makes a rational appeal, where ‘the threat of a burglary and the insurance policy of a gun is very persuasive’. Nix’s slide in the video reads: ‘the second amendment isn’t just a right. It’s an insurance policy’. For those in the closed and agreeable group, who ‘value tradition and habits’, he showed a picture of an older man and a boy together, both armed. Nix said: ‘This could be the grandfather who taught his son to shoot and the father who will in turn teach his son…. Talking about these values is going to be much more effective in communicating your message’.
Neither of these two groups of voters would, therefore, be in favour of gun control – a key debate in American electoral politics – and each could be addressed by the presidential candidate in specific ways: defending the right to bear arms as a rationally arrived-at right for one group, and as established tradition for the other. With data, he said, ‘we know exactly which messages would appeal to which audiences’ because we have already profiled them. He called these ‘persuasion messages’.
Nix concluded by saying that of the two contenders left in the presidential race, one was using Analytica’s data. Even after the Analytica case has been discussed endlessly, Nix’s presentation remains fascinating for what it brazenly implies.
End of Mass Communication
Nix states that the era of mass communication — what he calls ‘blanket advertising’ — is over. Communication is targeted, individualised. In the same house, Nix said, the husband and wife will receive different kinds of mail/advertising ‘possibly on the same issue’ because they have been profiled for their attitudes and behaviour. A record of the TV shows an individual watches, for example, can be used to decide what advertising that individual is shown.
Geographic and demographic factors like age, gender, ethnicity, income, religion, education are collated along with consumer and lifestyle habits (yes, your purchases on Amazon and your searches on Google!) to map psychographic data, or attitudinal data. Then there is behavioural data collected in accordance with the OCEAN model. Finally, the question to be answered by analysing this data is: what actually persuades an individual or a group — fear, authority, reciprocity, scarcity? We are looking at personalised communication that caters to our fears, anxieties and desires, with the hope that this kind of communication, when directed by politicians and candidates up for election, will persuade us to vote for the person we think is addressing ‘our’ concern directly.
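To see in principle how such trait-based targeting reduces a person to a profile and a profile to a ‘persuasion message’, consider this minimal, entirely hypothetical sketch. The trait names follow the OCEAN model described above, but the scores, thresholds and messages are invented for illustration; Analytica’s actual models are not public.

```python
# Hypothetical sketch of OCEAN-style psychographic targeting.
# All scores, thresholds and messages below are invented for
# illustration; they do not reproduce any real campaign's model.

def pick_message(profile: dict) -> str:
    """Choose a 'persuasion message' from a voter's OCEAN scores (0 to 1)."""
    if profile.get("neuroticism", 0) > 0.7:
        # Fear-based, 'rational' appeal aimed at the worrier
        return "A gun is an insurance policy against burglary."
    if profile.get("agreeableness", 0) > 0.7 and profile.get("openness", 1) < 0.3:
        # Tradition-and-family framing for the closed, agreeable group
        return "Shooting is a tradition passed from grandfather to grandson."
    # Everyone else falls back to undifferentiated 'blanket' messaging
    return "Generic policy message."

# Each voter is reduced to a vector of trait scores
voters = [
    {"name": "A", "neuroticism": 0.9, "agreeableness": 0.4, "openness": 0.5},
    {"name": "B", "neuroticism": 0.2, "agreeableness": 0.8, "openness": 0.1},
]

for voter in voters:
    print(voter["name"], "->", pick_message(voter))
```

The unsettling part is not the code, which is trivial, but the data behind it: with thousands of datapoints per person, the profile on the left-hand side of this function can be built without the voter ever being asked.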
The key question in all this Big Data play is whether the individual is aware that personal searches, purchases and emails are being monitored. This is where the privacy question and the ethics of data gathering begin. As data subjects, we are also electoral subjects, targets of the kind of large-scale manipulation carried out through targeted advertising that plays to our fears. All political rationality is now data-driven.
Data Discrimination
The instrumentalisation of data enables the profiling of individuals and collectives, group by group. What such an instrumentalisation could be used for by the state, corporate bodies or any agency/individual is anyone’s guess.
Wendy Hui Kyong Chun, Research Chair in New Media at Simon Fraser University, following the Analytica example and others, writes in her book Discriminating Data:
“in an attempt to destroy any and all senses of commonality, “communities” are being planned and constructed based on divisions and animosities… these social networks perpetuate angry microidentities through “default” variables and axioms … pattern discrimination 2.0 — makes older, deterministic, or classically analytic methods of control through direct discrimination seem innocuous.
Welcome to the swarming of the segregated neighborhood, spread through eugenic methods to cultivate futures based on mythical pasts.”
Hate-strewn messages, the appeal to ‘tradition’ — which could mean anything from a particular view of our pasts to practices of cuisine to constructed ‘historical’ antagonisms — and the persuasive messaging do not merely address underlying anxieties; they generate such anxieties — over, for example, the (mythic) population growth of certain ethnicities, or the ‘fundamental’ untrustworthiness — read ‘anti-nationalism’ — of others. Regular reports of data collection by poll campaigners, lobbies, political parties and interest groups (and not just advertisers of detergents) should worry us. Terribly.
The psychographic turn aims to divide. To divide is often to derecognise. And discriminate.