Data as the new Doxa

We trust data that pops up on screen because it validates us and gradually it becomes our belief system

By Pramod K Nayar   |   Published: 24th Aug 2019

Doxa simply means a set of beliefs, values and ideas we have internalised and according to which we behave, take decisions, plan for the future and organise our everyday lives. Doxa is a taken-for-granted belief system. Its constituents come from the social settings and practices – jobs, social relations, education, cultural practices – in which we are routinely engaged. The important thing is: once we have internalised these beliefs and values, we unconsciously perform the practices that fit them. In other words, doxa becomes a part of our identity, our very unconscious self.

In the 21st century, we willingly submit ourselves to being tracked, coded, traced, quantified and recorded throughout the day by the devices we wear and pass through. But what we never stop to ask is: who ordered the collection of this data? Who accesses it? And who benefits from it? This is the data doxa of our lives, where we have come to trust (i) the necessity of providing and collecting voluminous quantities of data (because we get some material benefits from it, like discount coupons or offers) and (ii) the authenticity of the data itself (this last is the crisis we now know as ‘fake news’).


Being Defined
Data doxa is the determining condition of the new millennium: the trust in data when everything else is in flux. The government collects data (remember Snowden?), as do social media platforms, our personal computers, health providers and whatnot. This data, when available in large quantities (called, unimaginatively, Big Data), helps corporations, service providers, weather watchers and economists see patterns across millions of users.

It also helps them predict behaviour (‘predictive analytics’) and offer predictive suggestions in your search box. The suggestions are derived from the search engine company’s databanks, which show how many millions searched for X and also purchased or looked at Y. Thus the prediction offered to you is based on the data accumulated by the company’s machines from countless searches and searchers like you and me: the effect of tracking data in huge quantities.
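To make that mechanism concrete, here is a deliberately simplified sketch, in Python, of how such co-occurrence-based suggestions could be computed. The item names and data are invented for illustration, and real search engines rely on far richer signals and models than this.

```python
from collections import Counter

# A hypothetical, highly simplified illustration of co-occurrence-based
# suggestions. Each record stands for the set of items one user searched
# for or viewed; the data and item names are invented.
user_histories = [
    {"camera", "tripod"},
    {"camera", "memory card"},
    {"camera", "tripod", "memory card"},
    {"novel", "bookmark"},
]

def suggest(query_item, histories, top_n=2):
    """Return the items that most often co-occur with the queried item."""
    co_counts = Counter()
    for history in histories:
        if query_item in history:
            # Count every other item this user also searched for or bought.
            co_counts.update(history - {query_item})
    return [item for item, _ in co_counts.most_common(top_n)]

print(suggest("camera", user_histories))  # e.g. ['tripod', 'memory card']
```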

But the thing to recognise here is: my reality, in terms of the books I acquire, the movies I watch, the clothes I buy, the people I follow, the music I listen to, the politics I espouse, begins to be defined for me through these ‘suggestions’. My newsfeeds are customised and, therefore, what I get to know of the world has been tailored to my tastes, political inclinations and acceptance. Over time, I begin to trust the data that pops up in my on-screen searches because it validates who I am, and what I do and think. The data has become my doxa, my belief system.

Article of Faith
The risk, as Deborah Lupton, Sarah Pink, Gavin Smith and others tell us, is that we stop thinking critically about data itself. Data enters our very consciousness in such a way that it becomes naturalised, and hence trusted. It becomes an article of faith. Gavin Smith summarises this as follows: “…a doxic sensibility whereby individuals develop a dependence on the affordances of digitech and a narrow understanding of the political economies in which data circulate as core assets.”

Smith is pointing not just to the monetisation of the data gathered from our devices, our searches and our purchases, but also to the absence of any interrogation from us users as to what we are contributing to when we upload data and feeds. Consumer behaviour, understood and influenced through Big Data analysis of its patterns, transforms quantity into quality: our quality of life and our material lives are determined by the quantity of products and services the corporates and manufacturers recommend to us.

Data Anxieties
But the rise of data doxa is also linked to forms of governance that do not appear to be governance. When our realities, as noted above, and our choices are being manipulated by the data that pops up on our devices, we are, in fact, being governed. Yet researchers (Sounman Hong et al, 2019) note that whether Big Data has actually improved the state’s public services is still debatable. People are sorted, organised and targeted for promotional campaigns, election engineering and services based on data: this is governance.

This amplification of trust in data leads, as ethnographic studies by Sarah Pink and others tell us in an appropriately titled essay, ‘Data anxieties: Finding trust in everyday digital mess’ (2018), to ‘data anxieties’. Trust is what enables us to act in the world, based on a set of calculations of the outcomes of that action. The calculations are themselves founded on familiarity with processes.

Today, first and foremost, we experience a surveillant anxiety: who is watching our data? But more than this, we are aware that personal data may have to be retrieved by us for future use, and hence should remain uncontaminated, safe and accessible.

To phrase it differently, we experience anxieties around ‘data futures’ where we worry about our future life: what is the future if my data is lost or corrupted? Since we live increasingly dependent upon prosthetic memories (can we recall phone numbers without recourse to a device now?) and our decisions and plans are founded on data we have secreted away in various memory banks, our future realities depend on being assured of safety of the data. This, as Pink et al point out, is a shift in how we experience time/the future itself: time determined by the ‘continued accessibility and safety’ of data.

Fitting the Pattern
There can be other reasons to critically examine how we subject ourselves to data-sharing. Take loyalty reward points. Stores and service providers now reward consumers for ‘scanning’ or ‘liking’ items via their smartphones. This means we as consumers become data providers, and are also defined by what we ‘like’ when we click on a brand’s logo.

Eventually, as Mark Andrejevic et al put it, consumers will be defined in advance based on what they have liked in the past, and what they have not: the ‘new tracking technologies work to naturalise and reward the practices of surveillance, they develop new forms of social discrimination predicated entirely on consumer behaviour’. To avert this social discrimination, consumers will modify their behaviour, clicking on logos and brands so as to ‘fit’ the pattern the data seeks.

Data anxieties stem directly from data doxa. We need to understand that whatever we share is monetised, feeds directly into changes in our own behaviour and into governance, and determines our reality.

(The author is Professor, Department of English, University of Hyderabad)

