Activism, art, research – and housing: Tom Keene at Data Justice 2018

By Paul Bradshaw on June 12th, 2018

Data Justice is one of those conferences with so many parallel tracks that every room you choose to enter means understanding that you have also chosen to ignore four others. My choice appeared particularly unpopular – I was one of only three people to walk into room 0.031a for the workshop on ‘How to save a home’ – but I was left in no doubt by the end that it was the right one.

Tom Keene is an artist, activist – and academic: in 2015 he successfully applied to turn his work into a PhD at Goldsmiths, University of London, and the results so far – exploring databases and their role in urban regeneration – are hugely illuminating.

Why databases? It was a Lambeth housing officer’s remark — “The database told me” — which prompted Tom to start to look at the role that databases play in society – particularly in those corridors of power where decisions are made with the potential to affect the most vulnerable in society.

In the two years since, Tom’s research has not only explored an under-researched area through its literature, but also contributed over a dozen artefacts, ranging from pieces of art to online tools, that help support an ongoing campaign to protect a housing estate under threat from its own council.

The relational machine and other concepts

The conceptual framework underpinning those artefacts draws together a collection of concepts which carry particular meaning for the very ‘datafied society’ this conference is testament to.

Graham Harwood‘s concept of the relational machine, for example (see video below), aims to articulate how humans perform as “sentient cogs” within that machine, forced to operate within the systems prescribed by its operation.

So, for example, residents are forced to engage with a call centre database in order to begin the process of getting something fixed, while the person doing the repair might say they ‘have to get a code’ from the database in order to do so, or that they are ‘only doing a job’.

Foucault‘s concepts of power and knowledge become important in focusing on how technical objects instantiate particular flows of power and knowledge: the PDF format, for example, instantiates a flow of power whereby information becomes difficult to extract, while the spreadsheet format instantiates different flows depending on the literacies of those wishing to use the information contained within.
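That asymmetry between formats can be made concrete. The sketch below, using invented figures (the estate names and numbers are placeholders, not data from Keene’s project), shows how little code a structured spreadsheet release demands of its reader compared with the specialist extraction tooling a PDF would require before the same question could even be asked:

```python
import csv
import io

# A structured release (CSV/spreadsheet) can be reused in a few lines of
# standard-library code: the 'flow of power' favours anyone with basic
# data literacy. The same table locked inside a PDF would need specialist
# extraction tools before this step could begin.
# Estate names and figures below are invented for illustration.
RELEASE = """estate,repairs_2017,cost_2017
Estate A,120,450000
Estate B,80,210000
"""

reader = csv.DictReader(io.StringIO(RELEASE))
total_cost = sum(int(row["cost_2017"]) for row in reader)
print(total_cost)  # total repair spend across both estates
```

The point is not the arithmetic but the threshold: with a spreadsheet, anyone with a basic scripting literacy can interrogate the figures; with a PDF, most citizens are excluded at the first step.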

This power is not given or held, but exercised: a point which becomes problematic when technologies such as database systems are abused or flawed. This was the case when G4S claimed payments for tagging offenders who were already in prison or who’d left the country, or when NHS 111 lines relied on virtual checklists rather than clinical experience to diagnose callers, and manipulated response times to obtain bonuses for hitting targets.

In Keene’s own case, Lambeth had invested power in a database whose knowledge was flawed: when residents tested the council’s claims that the estate was too expensive to run (by undertaking a forensic audit of the records in the database) they found that 50% of repairs had either not been completed, had been of poor quality, or should have been claimed on insurance.
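The logic of such a forensic audit can be sketched in a few lines. This is a minimal illustration, not Keene’s actual method or data: the field names, sample records, and the three checks are assumptions standing in for whatever the residents actually tested against the council’s repair records:

```python
import csv
import io

# Hypothetical sample standing in for records extracted from a council
# repairs database. Field names and values are invented for illustration.
SAMPLE = """job_id,status,quality,insurable
1,completed,ok,no
2,not completed,,no
3,completed,poor,no
4,completed,ok,yes
"""

def audit(rows):
    """Count records failing any of three resident-applied checks:
    work not completed, work of poor quality, or work that should
    have been claimed on insurance rather than charged to the estate."""
    flagged = 0
    total = 0
    for row in rows:
        total += 1
        if (row["status"] != "completed"
                or row["quality"] == "poor"
                or row["insurable"] == "yes"):
            flagged += 1
    return flagged, total

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
flagged, total = audit(rows)
print(f"{flagged}/{total} records flagged ({flagged/total:.0%})")
```

Each flagged record is a small counter-claim against the database’s authority; aggregated, they became the 50% figure that challenged the council’s case.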

The technical individual

The third leg of Keene’s conceptual framework is Gilbert Simondon‘s technical individual, the idea (PDF) that the abstract concept of a technical object already has an impact before that object is realised, and whose essence shapes its relationship to other objects.

So, for example, the concept of what a brick is shapes the collection and transport of materials that are needed for its creation (processes which in turn shape other things, such as the muscle power of the workers charged with collection activity).

Or, when it comes to the study of databases, the concept of the survey in turn necessitates the creation of the surveyor, the regulations that must be followed, the technology to print and distribute the survey, and so on.

Connections between these technical individuals can have long histories, from the creation of the first forms to systems of data collection, and so on.

A way of articulating database and algorithmic power

What emerges from this framework is a way of articulating the power wielded by technologies such as the database – and, for my purposes, machine learning and algorithms – and Keene’s work provides a portfolio of just some methods for critically interrogating them.

So, for example, the forensic audit is one way of attempting to hold a database’s power to account; Keene’s fictional ‘Housing Asset Repairs Management System (HARMS)’ database is a way to expose the workings of a database to scrutiny and assert transparency (a method particularly salient with regard to the field of algorithmic accountability).

Online calculators provide a way to engage citizens in the exercise of power, or indeed to empower them, while the use of video to illustrate database records provides a human/real world context to the typically decontextualised records that inform decision-making.
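A minimal sketch of the kind of online calculator that lets residents test a council’s figures for themselves might look like the following. The comparison, the formula, and every number here are invented placeholders, not Keene’s actual tool or Lambeth’s actual costs:

```python
# Hypothetical calculator: compare the cost of refurbishing an estate's
# homes with the cost of demolishing and rebuilding. All figures invented.
def refurbish_vs_demolish(refurb_cost_per_home, homes, demolition_cost):
    """Return both totals and which option is cheaper."""
    refurb_total = refurb_cost_per_home * homes
    return {
        "refurbishment": refurb_total,
        "demolition": demolition_cost,
        "cheaper": ("refurbishment" if refurb_total < demolition_cost
                    else "demolition"),
    }

# Example: 300 homes at £25,000 each versus a £14m demolition scheme
result = refurbish_vs_demolish(25_000, 300, 14_000_000)
print(result["cheaper"])
```

Wrapped in a web page, even a calculation this simple shifts some power: residents no longer have to take the council’s totals on trust, but can rerun the sums with their own assumptions.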

More broadly, the use of Freedom of Information and Data Protection laws, alongside the rights granted to a Special Purpose Vehicle, provide valuable avenues for extracting the information needed for such a project.

And while Keene works on the databases that have informed (as his timeline helpfully logs) the decisions of power for decades, a new wave of big data-informed algorithms promises to create a new frontier for the same work. Is it possible to create algorithms that perform a similar holding-to-account, transparency-increasing, or empowering role?