The Technology and Democracy Conference

Jeni Tennison

The Technology & Democracy Conference was organised by the Minderoo Centre for Technology and Democracy, an independent team of academic researchers at the University of Cambridge who are radically rethinking the power relationships between digital technologies, society and our planet.

I spoke on the Firestarter Panel (on the first day) entitled “What are the Stakes for Technology & Democracy?”

Several of the other panellists focused on the impact of technology on the electoral process, particularly the creation and dissemination of misinformation. I think it’s also important to look more widely at our ability as a society and democracy to assert power over technology. I talked about three aspects of this.

Democratic control

Data and AI magnify and embed power inequalities. Those with power get to shape technologies in ways that increase that power. They can afford data storage, compute, and the high salaries of expert workers. Greater access to data and the tools that process it enables them to embed that power still further, by building ubiquitous tools that gather yet more data and make more money. We’ve seen this happen with social media; the recent CMA report into foundation models shows it is happening with AI systems too.

The pattern of self-reinforcing power is manifested in multiple arenas, from surveillance capitalism, through generative AI labelling, to governments targeting benefit claimants with AI fraud detection, and workplaces putting in place algorithmic performance management systems. In each of these cases, the companies and governments with power are the ones deciding how and where data and AI systems are designed, developed and deployed.

What’s at stake here is democratic control over data and AI.

Individual agency

Our disempowerment is reinforced by the narratives currently prevalent amongst technologists and policymakers. We are told that AI is complex, the preserve of experts, and that AI might destroy humanity, or at least take our jobs. This turns AI into a threat that we can’t hope to benefit from.

Our fear, anger and resistance to AI have side effects on our ability to take advantage of technology. We see people opting out of NHS data sharing, when (properly used) medical research might benefit them. We see people avoiding digital platforms that could enable us to learn and connect. Our disempowerment means that AI is left to those who already feel competent.

What’s at stake here is our individual agency around data and AI.

Collective action

Analysing the use of data and AI from the perspective of feminist care ethics highlights that increasing use of digital technologies is changing the way we relate to each other.

This manifests in multiple ways. Postal workers can’t stop on the doorstep to have a chat with lonely elderly people, because they have to meet algorithmically generated standards for delivery time. Chatbots are taking over from real people in customer service. Our relationships with teachers, doctors, counsellors and care workers are being mediated by digital tools. Even the tools that are being developed to democratise technology leave us separate from each other.

All this removes opportunities for care and connection that we need both to flourish as humans and to organise and take collective action in solidarity with each other.

What’s at stake, then, is our ability to take collective action around data and AI.
