Credit: Veronika Dee (via Unsplash)

Science and intimacy: The troubling relationship between data surveillance and care

At the end of November, The Daily Mail published details of an investigation into how UK councils are working with a private company to ‘harvest’ citizen data. The findings highlight how local authorities are able to build up a detailed picture of a huge number of people thanks to the help of data surveillance company Xantura. This data, the piece suggests, can be used to predict a whole range of different things, from individuals’ sex lives to how likely they are to break lockdown rules.

To a casual observer this could be shocking. But to anyone with even a passing awareness of our data-driven society this isn’t particularly novel. Indeed, public services have been working with Xantura for a number of years. The relationship has supposedly been instrumental in helping organizations to more effectively identify people and communities that need support and resources, from medical care to social services.

However, while this fact is an important counterpoint to the way the findings have been presented in the Mail, it doesn’t mean we should simply brush aside criticism of this system of intense data surveillance. We shouldn’t let the political and cultural right own mainstream discourse on this issue.

Sure, we can argue that government can and should play a part in supporting those in need. But we also need to be cautious of letting governmentality become the singular logic through which civic society is permitted to operate.

Doing so not only risks ceding ground to the right on privacy issues, but also has serious implications for how we frame and tackle systemic and deep-rooted problems. In trying to help vulnerable and marginalized people, it can easily further disempower them, estranging them from the decision-making structures that determine and shape their lives.

If the populist turn has taught us anything, it’s that technocratic solutions to problems can have long-lasting consequences that are often impossible to predict. The actual effectiveness of an obsession with ‘delivery’ and ‘outcomes’ notwithstanding, the use of technological solutions to complex problems also raises fundamental questions about care, power, and autonomy in British society.

If we don’t address these questions, it could become impossible to unpick the results. That would be dangerous for society, and in particular for the very people companies like Xantura are supposed to help.

Automating care

Xantura’s success at winning public contracts isn’t down to sheer luck or the malevolence of local government. It’s a straightforward response to economic challenges. It helps organizations do more with less.

Xantura’s CEO highlights that this is one of the key benefits of the company’s services. “On-going financial pressures and changing demographics mean that local authorities need to do a lot more with a lot less,” he says in a brief Q&A on the London Councils website.

From this perspective, you might even argue that it’s incorrect to see data surveillance as a purely oppressive or intrusive exercise of technology. Perhaps it is a technology like any other; it simply unlocks efficiency. True, this could be risky in the hands of a more nefarious actor, but used by local government, it makes the identification of need much easier and faster.

A screenshot from a Xantura presentation (via The Daily Mail)

Of course, the question that needs to be asked is who ultimately benefits from efficiency. Yes, it’s important that people get the support and help they need as quickly as possible. But the idea that the resources to do this are somehow ‘stretched’ is ultimately political. Pulling resources out of local government while promoting private sector enterprise encourages the use of technologies that provide abstractions of given demographics. Once you have these abstractions, it becomes much easier to automate the process of identifying, categorizing, and pathologizing people.
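To make that abstraction concrete, here’s a deliberately crude sketch of the kind of scoring logic such systems tend to rely on. Everything in it – the field names, the weights, the threshold – is invented for illustration; this is not Xantura’s model, only the general shape of one.

```python
# A toy 'risk scoring' sketch. Fields, weights, and threshold are invented
# for illustration -- this is not Xantura's actual model.
RISK_WEIGHTS = {
    "missed_school_days": 0.05,
    "months_in_rent_arrears": 0.2,
    "prior_service_referrals": 0.3,
}

FLAG_THRESHOLD = 1.0

def risk_score(record: dict) -> float:
    """Reduce a person to a weighted sum of whatever happened to be counted."""
    return sum(weight * record.get(field, 0)
               for field, weight in RISK_WEIGHTS.items())

def flag_for_intervention(records: list[dict]) -> list[dict]:
    """Anything that was never recorded as a field simply doesn't exist here."""
    return [r for r in records if risk_score(r) >= FLAG_THRESHOLD]

# Two numbers in, one number out: the person becomes a score above a line.
print(risk_score({"missed_school_days": 12, "months_in_rent_arrears": 3}))  # 1.2
```

The point isn’t the arithmetic. It’s that once people exist as rows like these, identifying, categorizing, and pathologizing them is a one-line filter.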

It’s ironic that one of the politicians quoted in the Daily Mail report, Steve Baker, also happens to be one of the most economically right-wing voices in Parliament. “Is this really how we want to live, under the gaze of machines deciding what public services we shall be entitled to?” he said. In a world where the role of civic society is forced to conform to the logic of the market, it seems almost inevitable. Perhaps Baker only has himself and his party to blame.

The existence of companies like Xantura is a symptom of a drive to automate care. As it has become more difficult to deliver services effectively and efficiently to the people who need them most, Xantura has been able to position itself as a useful solution in a landscape replete with challenges.

But it’s wrong to see everything through the lens of financial challenges. Of course more funding for public services would help, and it’s critical that those arguments continue to be made. However, the extent to which data analytics technologies have become embedded in public services is also indicative of a worrying technocratic turn that is decidedly anti-human.

What I mean by this is that despite adopting the language of ‘people-centered’ solutions, the apparatus of large-scale data mining and analytics works in a way that undermines the autonomy of both those who are subject to it and those who leverage it. The impact of this might not be particularly visible at the moment, but it will slowly chip away at the structure of civic society. It will diminish trust, entrench marginalization, and reaffirm a sense of helplessness across communities.


Marginalization and alienation

Ironically, ‘people-centered’ technologies often serve only to further marginalize people.

The reason is that, because we’ve allowed certain technologies – like AI – to dominate our problem-solving imagination, we have begun to treat people as standardized. This is, of course, antithetical to the reality of personhood. People are bound to context at every level, from their personal history to their present environment.

Os Keyes touches on this in an essay about the challenges that data science poses for queer people. Keyes argues that because data science necessarily involves standardization in order to count and record the things it takes as its objects of analysis, it inevitably imposes certain limitations on queer people, a group that, while heterogeneous in terms of specific identities, is nevertheless linked by a refusal of categorization. By demanding clear definitions, data science inflicts a subtle violence on those for whom such definitions are impossible.
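This standardization shows up at the most mundane level: the schema. The sketch below is hypothetical – my own illustration, not drawn from Keyes or from any real council system – but it shows where the demand for clear definitions actually lives.

```python
# Hypothetical schema, not from any real system. Data pipelines demand that
# every person fit an exhaustive, pre-declared set of boxes before they count.
from dataclasses import dataclass
from enum import Enum

class Gender(Enum):
    MALE = "male"
    FEMALE = "female"
    OTHER = "other"  # even the catch-all is itself a fixed box

@dataclass
class CitizenRecord:
    age: int
    gender: Gender       # no valid value here, no valid record
    household_type: str

# Anyone whose identity doesn't map onto the enum is either forced into a
# category that misdescribes them or never enters the dataset at all.
```

The subtle violence Keyes describes happens at exactly this point: before any analysis runs, the schema has already decided who can be represented.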

Data science is, according to Keyes, “the inhumane reduction of humanity down to what can be counted.” In the context of the Xantura story, Keyes’ thinking underscores a tension at play in the delivery of care and support to potentially vulnerable people. The humanity that is implicit in the act of care is bound up with a technology to which things like ‘context’ are inimical. It’s hard – maybe even impossible – to understand context through computational logic.


Technology and the democratic deficit

The impact of this on those receiving support is about more than just notions of dignity or humanity (although of course those are important). It actively disempowers them.

In a sense, the use of data surveillance technologies like Xantura’s undermines the place of democracy in everyday life. Indeed, it could be argued that this isn’t that remarkable – it has been happening for years. The decline of community and collective organizations in favor of effective managerialism has ensured that democracy is viewed as something bound up with elite institutions, rather than with anything as prosaic as day-to-day decision making.

But it’s important to note that although artificial intelligence is presented as a solution to many contemporary challenges, it doesn’t reverse the current democratic deficit – it consolidates it.

Some might argue that this is just a trade-off we need to make for progress. If we want a more effective society, in which people’s needs are met quickly and painlessly, we need to embrace cutting-edge innovations.

But the counter argument to this is that replacing citizenship with passivity – turning people into users – can have dangerous consequences. It leads to distrust and division, further undermining our ability to build a society that works for everyone.

Optimization and alienation

The consequences of technologies like Xantura’s for vulnerable people are clear. Indeed, although the populist denunciation from the Daily Mail might be motivated by a reactionary attitude, it’s one that anyone with a belief in individual privacy will share to some degree.

What is a little more subtle is how technology can impact labour and the practice of social support. Just as it disempowers those receiving care, it does the same to those in a position to deliver it.

This isn’t to say that technology makes people lazy. Instead, it’s that the logic on which it is founded – counting and sorting – can be internalized. It places a limit on how workers are able to engage with their labour. More importantly, it also inhibits the ways in which they can engage with the people they’re trying to help.

There might be a counter-argument here that data science and artificial intelligence tools are merely intended to augment the work of skilled professionals. This is a theoretical possibility, but it’s an unlikely state of affairs in our present situation. A lack of resources in public services, combined with decision makers’ optimism about emerging technology, means that computational thinking is privileged over other approaches to problem solving.

There’s a chapter in Xiaowei Wang’s Blockchain Chicken Farm that speaks to the reductive logic of AI in the context of labour. In it, they explore the use of artificial intelligence in pig farming (a critical part of China’s food supply). Wang writes that the increasing use of AI in pig farming “naively assumes that the work of the farmer is to simply produce food for people in cities” and that it rests on an assumption that “feeding humans is no different from feeding swaths of pigs.”

This might seem like a long way from the issues facing social work in the UK. However, I noticed some interesting parallels when Wang observes that traditional farming work “is borne out of commitment and responsibility: to their communities, to their local ecology, to the land.”

Is it not the case that for those in the care sector – and indeed, many parts of the public sphere – their work is built on the same sense of commitment and responsibility? This will be eroded if we allow artificial intelligence to take over with no critical eye on its deployment.

Data surveillance limits the horizons of possibility

It’s easy to dismiss fear of technology. But to fend off the techno-optimists, critics need to be clear that the potential issues that artificial intelligence could pose aren’t imminent and obvious ones. We’re not about to be plunged into a technologically induced crisis. Instead, the real concern is how products like Xantura can impose what I’d call an ‘interpretive lock’ on how we’re able to solve problems.

As I’ve explained, this is true for everyone, not just one group. It turns citizens into passive users, simply waiting for resources and support to be administered, and it leaves those in the care and support sector with less agency over their work.

In a 2018 essay, Anna Lauren Hoffman writes that “if we continually ignore the social dimensions of a problem in favor of technical solutions, we risk trapping ourselves in a continual game of technical whack-a-mole.” In other words, not only does it become harder to solve the problems we thought we were solving, but we end up spending time and energy trying to solve other problems that weren’t even there in the first place.

We don’t have time or energy to waste. If we’re to build a society that is founded on a spirit of both commitment and responsibility, we need to be completely focused on what really matters. Sacrificing meaningful care – care which is attuned to individuals’ specific needs – for efficiency is ultimately sacrificing our humanity as well as our privacy.