Two stories illustrate the revolution that is coming about from the use of big data in government.
First, in the early days of the Iraq war hundreds of data analysts scoured the airwaves looking for snippets of electronic data that might lead to the bomb-makers who were wreaking havoc on allied forces. When a new Director of the National Security Agency (NSA) was appointed, a new approach was introduced. Every text message, phone call or e-mail in the region was vacuumed up by the NSA’s powerful computers. Rather than searching for a single needle in the haystack, the new approach was designed to collect the whole haystack. Advances in computer technology allowed all this data to be managed in new ways. The security services’ algorithms and pattern-recognition techniques worked like a magnet on the needles hidden in the haystack of internet and phone traffic, and led to significant steps in breaking up Iraqi insurgent networks. As the Snowden revelations demonstrate, this approach became standard practice across intelligence services, but it was an approach made possible only by new ways of handling data on a previously unimaginable scale.
Second, the newly announced Data-Driven Justice (DDJ) initiative from the White House commits 67 city, county, and state governments across the USA to using data-driven strategies to divert low-level offenders with mental illness out of the criminal justice system.[i] Every year in the USA more than 11 million people move through America’s 3,100 local jails at a cost of $22 billion. On any given day about 63% of the local jail population are non-convicted people awaiting trial. To counter the cost and suffering involved, the new programme draws together huge data sets from across the criminal justice and health systems to identify the individuals with the highest number of contacts with police, ambulance and emergency departments, and to link them to health, behavioural health, and social services in the community. Data is being used within a predictive programme to connect vulnerable individuals with welfare services, with the goal of reducing encounters with the criminal justice system. Trials in North Carolina suggest that the jail population can drop by up to 40%, with no increase in reported crime.
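To make the mechanics concrete, here is a minimal sketch of the kind of cross-system record linkage and ranking such a programme relies on. It is purely illustrative and not drawn from the DDJ programme itself: the person identifiers, record lists and threshold are all invented for the example.

```python
# Illustrative only: a toy version of cross-system record linkage and ranking
# of the kind a data-driven diversion programme might use. All identifiers,
# records and the threshold are invented for the example.
from collections import Counter

# Hypothetical contact records from three separate systems, already matched
# to a shared (pseudonymised) person identifier.
police_contacts = ["p01", "p02", "p01", "p03", "p01"]
ambulance_callouts = ["p01", "p04", "p01"]
emergency_visits = ["p02", "p01", "p01", "p05"]

# Pool the records and count contacts per person across all three systems.
contacts = Counter(police_contacts + ambulance_callouts + emergency_visits)

# Flag the small group with the highest number of contacts; these are the
# individuals the programme would link to health and social services.
FLAG_THRESHOLD = 4  # invented cut-off for the example
flagged = [person for person, n in contacts.most_common() if n >= FLAG_THRESHOLD]

print(flagged)  # ['p01'] - the person appearing most often across the systems
```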
These are just two examples of a new trend in the development of algorithmic government, where big data is enlisted in a new project of government based on the prediction of patterns. While there may be many gains in this approach, there are also risks in bringing big data into government. (The Snowden revelations suggest that the security services are using this technology to know more about us than perhaps we might like. Meanwhile we might wonder whether the DDJ technology could be applied to anticipate who will commit a crime and to suggest a programme of ‘pre-arresting’ likely offenders.)
But beyond these examples, the use of big data poses numerous other challenges to how we understand the relations between states and individuals. In a world of big data, algorithms are deployed to analyse the unimaginably huge data sets provided by the Internet of Things, where everyday objects collect information about almost every facet of the world around us. This can range from shopping habits to traffic movements, and it can be gathered from sensors on everything from inanimate objects to animals. Biometric data from fitness devices and location data from smartphone apps, as well as information provided by smart houses and smart cities, build up a system which connects everything to everything else. This radically changes how we see the world, and how government works. In particular, and this is the focus here, it would seem to suggest that if we know in great detail how people act every day we need not take the trouble to ask them, either through consultation or even by way of representative systems based on elections. This is explored further under three headings.
1. The computational turn: towards a brave new tomorrow
There is a clear trend in the literature on civic engagement, deliberative democracy and digital activism to pick up on big data, open data, data sharing and algorithmic government, and to see in them enhanced communication, digital democracy and open, accountable government. The so-called Facebook or Twitter revolutions provide the most eye-catching example of this wider trend. Big, open data, along with various tools for exploring it, is seen as being at the forefront of a revolution in transparency and openness which will lead inevitably to enhanced accountability. In contrast to old-fashioned voting systems, this is a fantasy of abundance that leads to a fallacy of enhanced participation. There are a couple of simple equations:
More data (on budgets, procurement etc.) + widely available techniques to interrogate it = more accountability.
Enhanced consumer data + feedback loops = better public services.
There are obvious attractions to this ‘Technological Solutionism’. It seems to promise transparent, accountable and improved democracy. It suggests that there is a technical answer to many of the problems of unruly politics, with its difficulties of public disengagement and the rise of simplistic solutions promising to break through the torpor induced by electoral systems. The exercise of finding the ‘right’ algorithmic tool for a problem is so much easier than actually understanding that problem.
There are, of course, some voices of dissent. Some, like Vincent Mosco, suggest that Big Data is not a technology but a ‘mythology’.[ii] It is important to remember the following points.
· The apparatus of big data, the Cloud, has a technological infrastructure that exists within the context of an industrial/business/government/information complex. Big players such as Google and the NSA have different (and not always altruistic) roles within that infrastructure.
· Even within the direct technical context there are limitations, biases and mistakes, including programmer error, in gathering and interpreting information, as well as in providing access to it. Data is not just ‘there’: someone has gathered it and put it there.
· Sometimes the limitations are not accidental: there is a dark side to digital politics, and governments can engage in ‘techno-authoritarianism’.[iii]
This isn’t just a battle of technologies. As Rob Kitchin and others have argued, Big Data and the turn to algorithmic government require that, across all applications, we interrogate properly and fully all the alterations in the spheres of epistemology, ontology, politics and ethics that are involved.[iv] The sphere that seems to me particularly interesting relates to the idea that algorithmic government can be seen as some sort of ultimate democracy: something that is beyond existing formats and is technically better than any traditional consultation, however digitally enhanced and inter-connected that consultation may be.
2. Algorithms and democracy – ‘the computer says…’
All the data garnered from the IoT, and mined by machine learning, is (potentially) much more than the simple ‘democracy’ obtainable from ratings such as PageRank, EdgeRank, and What’sTrending. These ratings merely show what is popular. The new algorithms purport to do much more, and with much more data. This opens up the whole debate about whether sampling, and the very notion of the sample, is over. In a world where ‘everything’ is captured, does n = all? Or is such an idea an illusion because, as Hildebrandt has argued, ‘the flux of life’ can be translated into machine-readable data in many different ways, and the technical choices made have a major impact on the outcome of any data mining operation?[v]
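Hildebrandt’s point can be illustrated with a small, hedged sketch (the figures are invented): the same raw stream of events, made machine-readable in two different ways, appears to support two different conclusions about the person being observed.

```python
# Illustrative only: the same raw events, translated into machine-readable
# form in two different ways, tell two different stories. The timestamps
# are invented for the example.
from collections import Counter

# Raw 'flux of life': hours of the day (0-23) at which some activity was observed.
events = [1, 2, 3, 9, 10, 14, 15, 16, 17, 18, 19, 22, 23, 23]

# Translation A: a coarse day/night split suggests a mostly daytime pattern.
night = sum(1 for h in events if h < 6 or h >= 22)   # 6 events
day = len(events) - night                            # 8 events
print("day vs night:", day, night)

# Translation B: narrow two-hour bins make the late-night slot (22:00-23:59)
# the single busiest period, inviting a quite different inference.
bins = Counter(h // 2 for h in events)
print("busiest two-hour bin:", bins.most_common(1))  # [(11, 3)] -> 22:00-23:59
```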
I am less concerned with the technical feasibility of ‘bulk data ingestion’ than with what it does to our understandings of governance and the component elements of that process. In essence, algorithmic government draws upon and seeks to involve governable subjects who function not as real individuals but rather as temporary aggregates of infra-personal data gathered at, and exploitable on, an industrial scale. Information is collected ubiquitously, even before the use to which it will be put is fully determined. A ‘human-algorithm relationship’ is created in which trust is given to the relevant algorithms to seek correlations routinely. The knowledge that algorithmic government draws upon is not created by individuals or given meaning by political or other frameworks of reference. Instead it appears ineluctably from the data: it is simply found, present (if hidden), in the data. It creates a new and constantly updated reality, and with it a new normality that is reinforced by being, seemingly, the expression of everyone.
In this sense, the world of algorithmic government is not comprehensible naturally: there is no self or individual or relationship with the natural world as presently understood by us. We no longer seem to need politicians to interpret our wishes: our every action is contained in the data, and the algorithms can determine what it is that we really want. At the same time, this whole process seems to offer a false emancipation by appearing to be, by its very nature, all-inclusive, and therefore the expression of everyone.
The promise of algorithmic government means that the project of government is changing: it is now about predicting, and responding to predictions, within a digital world that exists uncoupled from natural experience. This produces what might be termed, following Foucault, a new ‘truth regime’.[vi] It is one centred on what is visible from the data. Data-mining now reorganises how we see the world, with the compelling certainty of ‘science’ and statistics. The task here is to construct meaning out of meaningless information, and this involves the disappearance of the individual subject, whose only point of interest is how he or she exists in a relational context with other individuals, as they themselves appear massed into huge data sets, and how their conduct affects others. Along with the departure of individual agency, it also perhaps involves the disappearance of politics. If we already know what everyone does, by virtue of their capture and representation within the data set, why does it matter what any individual says or how they vote?
3. Government without politics
What is on offer here is perhaps an entirely new form of government. Algorithmic government is about extracting facts, entities, concepts and objects from vast repositories and, as calculative devices create subjects and objects of interest in this way, making those subjects and objects perceptible and amenable to decision and action. This becomes the governable reality – the ‘everyone’. In this way, within this particular governing conjuncture, the way in which individuals see themselves, and are constructed as units of governance, changes radically along with the means of governance. The target of this governmentality is the future: what people might do, what they might buy, how they might act or react. There is no self in this, only a predicted group action, and with this comes the end of privacy, through the irrelevance of such individualistic ideas within the compass of algorithmic governance.
What is of particular interest is how the practice of algorithmic government has the potential to undermine, and then transcend, many of the fundamental attributes of citizenship which presently appear as part of the bargain within the government-governed relationship. While many of these are anchored in ideas of the individual, privacy, and indeed selfhood, they spill over into wider conceptions of civicness, community, and citizenship – and indeed the whole idea of the liberal state. Algorithmic government is what some theorists see as a new ‘technology of government’, but it is perhaps more. It is a technology of government that wins the arms race started in the mid-18th century by the statisticians who rendered modern forms of government power feasible by naming, numbering and controlling the world through the deployment of statistical power. Now there is a new governmentality, achieved through the accumulation and interpretation of vast reserves of data. It is one in which people, as the subjects and objects of government, are simultaneously present, through their every measurable action, and absent, in the sense of having no individual agency beyond the data.
Big Brother is here – and it is us. A curious, brave new world indeed.
John Morison is Professor of Jurisprudence at the School of Law, Queen’s University Belfast.
[i] White House, Fact Sheet: Launching the Data-Driven Justice Initiative: Disrupting the Cycle of Incarceration.
[ii] V. Mosco, The Digital Sublime: Myth, Power and Cyberspace (MIT Press 2005).
[iii] See, for example, E. Morozov, The Net Delusion: How Not to Liberate the World (Penguin 2011); S. Zuboff, ‘Big other: surveillance capitalism and the prospects of an information civilization’, Journal of Information Technology 30 (2015), 75–89; and E. Treré, ‘The Dark Side of Digital Politics: Understanding the Algorithmic Manufacturing of Consent and the Hindering of Online Dissidence’, Institute of Development Studies Bulletin: Transforming Development Knowledge 47(1) (January 2016).
[iv] R. Kitchin, The Data Revolution (Sage 2014); L. Amoore and V. Piotukh (eds), Algorithmic Life: Calculative Devices in the Age of Big Data (Routledge 2016).
[v] M. Hildebrandt, ‘Slaves to Big Data. Or Are We?’, IDP Issue 17 (October 2013).
[vi] A. Rouvroy, ‘The end(s) of critique: data-behaviourism vs. due-process’, in M. Hildebrandt and E. De Vries (eds), Privacy, Due Process and the Computational Turn (Routledge 2012).