Annual Report 2023–24

Welcome to our second annual report! Here, you can learn more about what we’ve been up to, and the impact we’ve had, in a year when AI has become a headline topic.

We took advantage of global attention on AI governance to amplify our messages about the importance of public voice and collective action. Around the UK’s AI Safety Summit, we partnered with the TUC and ORG to drive civil society and worker voice in AI to the top of the agenda, through an open letter that gathered over 100 signatories. We also organised a People’s Panel on AI to highlight the lack of public input, and made multiple high-profile media appearances.

We brought together over 40 civil society organisations working on data and digital rights in the UK in a network that campaigned successfully against the Data Protection and Digital Information Bill, ensuring there was sufficient opposition to see it dropped during the wash-up process before the 2024 General Election.

We have targeted sectors that matter to people. We partnered with the TUC and the Wales TUC to emphasise union and worker voice in the adoption of data and AI in the workplace. We collaborated with Just Treatment to bring patient voices to the fore in the debates about health data. We joined forces with Defend Digital Me to explore governance gaps in the edtech ecosystem.

We convened, catalysed and contributed to initiatives around the world. During RightsCon in Costa Rica, we brought together groups working to influence G20 conversations on collective data governance. During the Open Government Partnership Summit in Estonia, we facilitated the codesign of commitments for meaningful transparency, participation and accountability in data and AI governance.

Finally, through our Connected Conversations, Design Labs, Fellowships, presentations, and social media, we have brought the idea of democratic and participatory decision making closer to reality. These have ranged from digging into the details of collective data rights, through developing a game that draws attention to the range of methods for bringing in public voice, to presenting to the Women’s Institute in Buckinghamshire.

I am immensely grateful to our team, alumni, associates, fellows, partners, collaborators and funders for all their work to advance our vision of communities having a powerful say in decisions about data and AI.

If you want to learn more about our plans for next year, do take a look at our Strategic Roadmap 2024-25.

Jeni Tennison, Founder and Executive Director

We want communities to have a powerful say in decisions about data and AI so that it is used to create a just, equitable and sustainable world.

We campaign to put community at the centre of data narratives, practices and policies by advocating for collective and open data governance.

We have three main strategic goals arising from our theory of change.

  • Change data narratives: we develop compelling narratives of collective data governance, placing stories in the media and other key venues that influence communities and decision makers.
  • Change governance practices: we surface and share examples of collective and participatory governance, convene a community of practitioners who lead and support collective data governance activities, and work alongside diverse communities to help define how data governance should work for them.
  • Change public policies: we provide evidence-based public policy recommendations and language to create an environment for collective and participatory data governance.

These goals are enabled by a strong community and an effective organisation.

Responding to the AI Safety Summit

In November 2023, the UK Government hosted the first AI Safety Summit. This was an opportunity to deliver on our mission at a national and international level.

In the run-up we held a private Connected Conversation on AI Safety narratives to support common understanding and purpose amongst civil society organisations. We also led two activities to influence this and future Summits: an open letter, and the People’s Panel on AI.

An Open Letter

With the TUC and Open Rights Group we led on an Open Letter to Prime Minister Rishi Sunak, calling out the Summit’s marginalisation of civil society in favour of a select few companies and governments. The letter gained over 100 high profile signatories, including leading experts, international human rights organisations and unions representing millions of workers. It gained international media coverage, challenging the narrative of the Summit, lending weight to civil society actors who were present, and forcing politicians to address the role of workers and civil society in shaping the future of AI.

The People’s Panel on AI

We organised the People’s Panel on AI, which brought together 11 representative members of the public randomly selected by the Sortition Foundation to attend, observe and discuss key events at the AI Fringe, which was held alongside the Summit.

Through a deliberative process facilitated by Hopkins Van Mil, the panel worked towards a public report giving their verdict on AI and their recommendations to government, industry, civil society and academia for further action.

By the end, we had a report by Hopkins Van Mil of the outcomes of the deliberation process, a written evaluation by the Independent Evaluator, a video promoting the process and Panel recommendations, a flyer displaying the recommendations, and a final report that included reflections and connections to other related participatory practice work. In addition, at the end of the week we hosted an open ‘verdict session’.

At techUK we’re very keen to look directly at these recommendations and really think hard about what they mean in terms of policy formation around AI.

— Antony Walker, techUK

As well as producing the recommendations themselves, the Panel began to demonstrate to big tech and to the public and private sectors that public participatory practices can be organised quickly, (relatively) cheaply, and with participants who know nothing of the (complex) subject in advance. It showed that the outcomes of such processes can be meaningful, insightful and supportive of better data and AI governance in the future.

The impact of the Panel continued through the year. For example, techUK asked Panel representatives to speak about the experience and their recommendations at the top of its Digital Ethics 2023 national conference agenda, and a Panel member spoke at an ‘AI in Education’ conference in Cornwall. We also held a Connected Conversation on involving the public in AI policymaking, as part of the Turing Institute’s AI UK conference, which explored lessons from the experience.

Members of the People’s Panel on AI deliberating on the AI Summit Fringe

We didn’t stop talking about AI. It was exhausting. It overtook us. A number of us refer to the experience as life changing as the impact hasn’t gone away. During the week we were waking up at 3am and making notes about what we wanted to ask the next day. … In one session someone suggested to us that we didn’t know the tech and ordinary citizens can’t understand [ AI ]. I think the panel will come up against this all the time but the panel can ask the questions. What’s the value? Does society want it?

— Janet, People's Panel on AI member

The People’s Panel on AI was organised with support from the Mozilla Foundation, the Accelerate Programme for Scientific Discovery, the Kavli Centre for Ethics, Science, and the Public, and the Ada Lovelace Institute.

We develop compelling narratives of collective data governance, placing stories in the media and other key venues that influence communities and decision makers.

Understanding narratives and power

Many organisations recognise the need to shift narratives about data and AI. Doing so requires an understanding of how those narratives currently work, and what they should be changed to.

In May 2023, we hosted a ‘Future Narratives’ Design Lab, bringing together invited experts to start co-creating a strategy for shifting the inaccurate, damaging way data is currently framed and understood in media, policy and industry narratives.

In the end it’s people coming together and demanding a powerful say that will ensure AI and data driven systems work for all of us. This is about ensuring systems that add positively to our lives and help build a caring society, not one constrained by unintelligent machines predicting, dividing and overseeing us for the benefits of those with the power.

Jonathan reflecting on what he learned

Jeni and Tim explored the way current narratives can disempower the public in conversations about AI in an essay for the Joseph Rowntree Foundation. We also used these insights in our work with Brickwall to research and produce scripts for AI explainer videos for the Local Government Association, and to influence local authority adoption of AI.

We need to shed light on the power of mainstream AI narratives, how they impact us, and how we can reshape them. Connected by Data doesn’t only understand these narratives but offers vital practical alternatives that centre those affected by AI.

— Yasmin Ibison, Joseph Rowntree Foundation

Amplifying people’s stories

Narratives about data and AI can feel abstract and removed from day-to-day concerns. We aim to bring out individual stories that surface people’s lived experience and provide a way into policy discussions.

In April 2023, our first Our Data Stories report amplified workers’ experiences of data and AI, placing the human firmly in the story. We worked with the Trades Union Congress (TUC) to host a parliamentary event for workers to be heard directly by politicians and policy makers, and with workers and unions to boost stories in a range of media, including the FT, the Guardian and Huck. This work was also quoted by the TUC in their report ‘Work Intensification: The impact on workers and trade union strategies to tackle work intensification’.

Our Parliamentary event, co-hosted with the TUC, discussed the worker experience of the AI revolution (June 2023). Photo credit: Stephanie Peacock MP.

Our second report, Our Health Data Stories, published in November 2023 in partnership with Just Treatment, focused on how people experience data when interacting with health services, especially in light of the controversy over the award of the NHS’s Federated Data Platform contract to Palantir. Jeni joined Hope from Just Treatment in a Byline Times podcast on the topic, based on the report.

Telling our own story

We aim to provide the media with a different perspective on issues of data governance and AI: one that centres the voice of affected communities, confronts power and challenges received wisdom.

Determined to get beyond clichés of androids and lines of code, we created an explainer video that centres images of day-to-day life and key messages on how data is pervasive, yet shaped by political choices.

In the last year, we have significantly increased our profile and been quoted in a range of media, with repeat requests for comment from TalkTV, Channel 4 News and Politico. Here is a sample:

Our campaigning work around the UK’s AI Safety Summit established Connected by Data as a go-to authoritative voice. We had over 50 media hits within a week, from regional BBC outlets to international broadcasters and newspapers.

Along with mainstream media, we’ve used social media accounts to share our responses to news items, including an NHS contract being awarded to Palantir and the Post Office Horizon scandal. We also published a work of speculative fiction, called ‘Grab A Byte’, in which the reader takes a ‘choose your own adventure’-style journey through how data and AI impact us.

As we reflected on the short story that emerged, we considered how it could be used in classrooms or discussion groups to draw out different issues, so we added discussion prompts to foster collective conversations around data governance in lectures, workshops and group settings.

Maria on reclaiming the narrative through speculative fiction

We surface and share examples of collective and participatory governance, convene a community of practitioners who lead and support collective data governance activities, and work alongside diverse communities to help define how data governance should work for them.

Even if Sam Altman’s intervention was not an entirely serious call for global deliberation on AI, such deliberation and debate is not only desirable, but is eminently possible. So what might it take to be effective?

Tim on effective global deliberation on AI

Supporting organisations with data and AI governance

We aim to help organisations that want to work with their communities as they develop data and AI tools and services.

We started the year with a commission from the Joseph Rowntree Foundation (JRF) to improve their understanding of the current landscape and of the stakeholders they want to support, work with, influence and inspire as they create a new insight and analysis infrastructure, intended to improve how the ecosystem understands inequalities and how to solve them.

The case for public engagement around data and AI decisions is only growing stronger. We’ve been working hard to help organisations move from aspiration to action, and to take approaches that support informed engagement and real community power.

— Tim Davies (he/him), Research and Practice Director

We have also developed an interactive card game that provides a creative way for individuals and organisations to think about planning, running and evaluating collective and participatory data governance activities.

The Good Governance game being tested at CPDP LatAm2023 with Idec, InternetLab, ITS Rio, Instituto Fogo Cruzado, Ericà Bakonyi, Larissa Chen, Lab Jaca, Aqualtune Lab, and CyberBRICS Project.

Developing resources for public deliberations

Informed, empowered and deliberative dialogues about the impacts of data and AI require capable practitioners. We aim to support people delivering these approaches with the tools and resources they need.

In November 2023, we partnered with Iswe to convene a group of thinkers and practitioners who work on public engagement, data and AI to explore interventions that could support greater deliberative governance of technology. We explored three main challenges:

  • Decision makers are often not convinced that they should be listening to public voices in the shaping of data and AI
  • Running an effective deliberation on AI requires good background materials, focusing on the right topics and recognising the different frames applied to discussion of AI
  • Running a powerful deliberation on AI requires tailored facilitation

You can access the full workshop notes here or review the summary on our website.

Photos of the Design Lab exploring resources for public deliberation.

We provide evidence-based public policy recommendations and language to create an environment for collective and participatory data governance.

Our strategy to shape the UK’s data and AI policy towards equitable outcomes consists of three workstreams: catalysing a powerful civil society community, policy development and deep dives into strategic verticals. We take what we learn in the UK into the wider international community to develop joint approaches and action.

Catalysing civil society

An effective civil society community is vital to counter big tech and government power. We aim to bring together diverse civil society organisations and campaigners to build relationships and collaborate on policy, campaigns and advocacy around data and AI.

To that end, we convene the UK’s Data and AI Civil Society Network. Alongside discussing current affairs, the network digs into key issues with dedicated sessions, such as a Connected Conversation on Impact Assessments, and showcases the work of its participants. We maintain the infrastructure of this network, with shared resources such as our publicly available list of Data Protection and Digital Information Bill resources, an active Signal group, a popular data policy newsletter, and regular meetings and strategy sessions to support joint action and skill sharing within a growing community.

This year, we expanded and diversified engagement with the Network. A workshop in February 2024, targeted at charities and campaigners who are new to these issues, surpassed our expectations with around 100 participants. Many of these have since joined the Network.

As a human rights policy organisation, it’s been incredibly valuable to join the Network. Data protection is a relatively new policy area for us, so the ability to be able to connect with like-minded organisations and learn more has been helpful to ensure our work is aligned to the rest of the policy landscape.

— Adam Freedman, Policy, Research & Influencing Manager, National AIDS Trust

Centering community voice in data and AI policy

The legislative and policy environment determines how and whether community voice is brought to bear in decisions about data and AI. Our aim is to advocate for evidence-based public policy recommendations for collective and participatory data governance.

A key focus throughout the year has been the Data Protection and Digital Information Bill (DPDIB). We developed policy interventions as the Bill moved through the legislative process over the last twelve months, including written evidence, briefings and suggested amendments. In May 2023, we were invited to give oral evidence to the House of Commons Bill Committee, which was subsequently referenced by Lord Clement-Jones, a leading voice in the debate, in the Lords Grand Committee. Thanks to this opposition, alongside that of many other members of the Network, the Bill fell during the wash-up process prior to the 2024 General Election.

The Post Office scandal has some stark lessons for how data and AI systems should be integrated into our lives and society. The Government’s Data Bill fails to learn them, and in multiple places makes it even harder for victims of computer errors to get justice.

Jeni on the Post Office 'Horizon' scandal

Beyond the Bill, we responded to the Government’s AI White Paper consultation and multiple parliamentary inquiries including Transforming the UK’s Evidence Base and Large Language Models, as well as regularly engaging with senior politicians across the political spectrum.

Questions of data and AI policy are cross-cutting and far from settled. We have an opportunity to steer it towards equitable and democratic outcomes.

— Adam Cantwell-Corn (he/him), Head of Campaigns and Policy

Along with reacting to policy developments, we’re shaping perspectives for a progressive approach to data and AI policy. Working with a range of stakeholders including Labour Together, we sought to distil high-level thinking on the direction and purpose of data and AI policy into a progressive vision. As well as feeding into Labour policymaking, this helped inform the first Green Party of England and Wales policy on AI, which now includes a commitment to centre community voice. We launched the document at the Labour Party Autumn Conference (with Matt Rodda MP) and the Green Party Autumn Conference (with Baroness Natalie Bennett).

Diving into verticals

We need both cross-cutting and sector-specific data and AI policy. Our aim is to dive into strategically relevant verticals to develop context-specific policy and boost the capabilities of various stakeholders.

Having identified ‘work and workers’ as a source of key questions about data, power and participation, we developed relationships within the trade union movement. A research collaboration with Wales TUC on how AI is impacting workers is feeding into the Welsh Government’s pioneering social partnership agenda. Adam has joined the Wales TUC reference group on AI to support trade unionists within these processes regarding the public sector workforce.

We took part in the Special Advisory Group on the TUC’s AI and Employment Bill project, a powerful intervention to emphasise collective data and AI rights at work, and collaborated on multiple joint events and media outputs.

We also led on a project with the TUC to draft a strategy to take forward their work on AI. The strategy sets out how the TUC should innovate its processes to develop targeted policies, training and campaigning on AI.

TUC Cymru worked with Connected by Data to produce a joint report on workers’ understanding of AI. The report led to extensive media coverage and the establishment of a government committee in Wales set to introduce guidance on the use of AI for key workers in the public sector. This would not have been possible without Connected by Data’s support.

— Ceri Williams, Wales TUC

We laid the groundwork in two other verticals: health and education. We worked with Defend Digital Me to deliver a two-part EdTech Design Lab exploring the growing use of data-driven educational technologies (EdTech) in UK schools; the challenges around who does, and does not, have a say in EdTech adoption; and possible approaches to give affected communities a more powerful voice in EdTech decision making. Participants included teachers and school counsellors, teaching union representatives, and education researchers. We followed this up with a later Connected Conversation on the topic “How can affected communities have a powerful voice in shaping the adoption of data-driven technology in schools?”

Amid the furore about Palantir’s NHS contract, the Our Health Data Stories report produced in partnership with Just Treatment didn’t just expose how patients think about data on a day-to-day basis; it also recommended further work around data opt-outs, data service procurement, community engagement, and digital health apps. We explored these in a Connected Conversation that asked “What would it take to secure full public support for NHS data re-use?”

As we campaign for health justice, we need to navigate the complex impact of data and AI. Connected by Data has been a key ally in that work.

— Diarmaid McDonald, Director, Just Treatment

Politicians, press and public are becoming more aware that decisions about data are political, and fundamentally about power and priorities. But there’s still work to be done to make this democratic and support people, institutions and ideas to make data and AI work for us.

— Gavin Freeguard, Policy associate

Establishing international partnerships

We want our work to be useful outside the UK, and we build international partnerships to help that happen.

In June 2023, in partnership with Aapti Institute, Research ICT Africa, and The Datasphere Initiative, we convened a workshop in Costa Rica to explore global policy agendas on collective data governance. The session explored building shared language, mapping policy landscapes, developing policies for collective data governance, and prioritising local and global advocacy opportunities.

Thanks to this workshop, we collaborated with MyData Global, Aapti Institute and The Datasphere Initiative to publish In This Together: Combining Individual and Collective Strategies to Confront Data Power, a think piece that looks at how advocacy for better data governance can draw upon the tools of both individual data rights and collective data governance. This was the topic of a Connected Conversation on Combining Strategies to Confront Data Power in March 2024.

The Connected by Data workshop as a Fringe of the OGP Summit (September 2023)

We have undertaken a significant amount of work relating to the Open Government Partnership (OGP). In September 2023, we brought together a diverse group of civil society, government and academic stakeholders on the fringes of the 2023 Open Government Partnership Summit in Tallinn, Estonia, to co-design model policy commitments that could deliver meaningful transparency, participation and accountability in data and AI governance. We followed this with a Connected Conversation to share the resulting commitment areas in more detail, inviting responses from a range of experts to explore the opportunities and challenges of putting these ideas into practice.

Connected by Data plays an important and unique role connecting organizations working on different aspects of data governance and from different parts of the world, facilitating innovation and collaboration that wouldn’t happen otherwise. This is essential for helping the advocacy community adapt to a rapidly changing relationship between individuals and data.

— Christopher Wilson, MyData Global

Later in the year, Tim attended the Governing New & Emerging Digital Technologies global peer exchange workshop in Nairobi, organised by the OGP. It brought together government officials and civil society representatives from twelve countries leading digital governance reforms, along with international experts, to address the challenges and opportunities presented by digital innovations, particularly generative AI, and their impact on democracy, human rights and the economy. Building on this workshop, we produced a response to the interim report of the UN Advisory Body on AI, and we worked with the Data Values Project and the Global Data Barometer to explore approaches to benchmarking national frameworks for participatory data governance.

We have participated in a number of other international initiatives. Jeni was part of the Council for a Fair Data Future, organised by the Aspen Institute, which met in New York in July 2023, and continued to chair the GPAI Data Governance Working Group until December. Maria attended a number of C20 events (linked to Brazil’s G20 presidency) and Adam attended the Summer School on the Law, Ethics and Policy of Artificial Intelligence hosted by KU Leuven in Belgium. With Liz Steele, we also started to explore how we might engage more actively in Europe, following the passing of the EU AI Act.

Although the draft bill in Brazil has some way still to go if it is to become law, the journey outlined above demonstrates the vital importance, and impact of a process that looks beyond industry interests when regulating data and AI.

Maria on building trust through civil engagement in data and AI

Jeni Tennison meeting with Minister Ed Husic from the Australian Government Department of Industry, Science and Resources alongside Abeba Birhane and Rachel Coldicutt (November 2023)

Over the last two years, we have amassed over 1,600 followers on X (formerly Twitter), 234 Mastodon followers, 933 followers on LinkedIn and have over 160 members on our Discord server. We organised, spoke at or participated in almost 50 events and gained a deeper understanding of the kinds of spaces where conversations are happening, and what’s lacking. During 2023-24 we introduced a Data Policy Digest and a Connected by Data Newsletter, which have 324 subscribers.

Jeni Tennison speaking at a Citizen-Centric AI Systems event (November 2023). Photo credit: Professor Sebastian Stein.

Convening important conversations

Thanks to funding from Omidyar Network, we have established our Connected Conversations programme: informal, virtual, and usually international discussion sessions designed to support exploration and learning around the fundamentals of collective, democratic, participatory and deliberative data governance.

People who attend discussions are expected to have strong opinions, weakly held, and to critique and question their own assumptions as well as those of other people. We publish write-ups of all Connected Conversations on our website as a further demonstration of our commitment to share learning as widely as possible. Aside from those already mentioned above, these included:

Codesigning interventions

Again thanks to funding from Omidyar Network, during 2023-24, we were able to convene six design labs, all of which we’ve described in more detail above.

We deliberately organised these design labs in different ways and to different scales, to experiment and inform further design labs.

Cultivating changemakers

Our small, but global, fellowship programme grew, with four additional Fellows bringing new and diverse voices and perspectives to the group. Fellows are supported and mentored in a bespoke approach with a fortnightly collective online gathering.

Judith Townend, Fellow
Kristophina Shilongo, Fellow
Libby Young, Research associate and Fellow
Luke Richards, Fellow
Aditya Singh, Fellow
Maria Luciano, Fellow / Former Research associate
Natalie Byrom, Fellow
Nicola Hamilton, Fellow

The fellowship has been an enriching experience and a fantastic opportunity for collaboration and exchange with a community passionate about collective and participatory data governance. I’ve enjoyed the opportunity to have greater insight into CbD’s work and have learned immensely from the amazing cohort of fellows and their diverse projects.

— Aditya Singh, Fellow

In the last twelve months we’ve been joined by Emily Macaulay (Head of Delivery and Operations) and Helena Hollis (Field Building Lead). Three members of the team have moved on to new adventures, and we wish Obioma, Jonathan and Maria all the best in theirs. We’ve also had the good fortune to continue to work with Gavin Freeguard, focused on UK policy, and Liz Steele, who has been exploring opportunities within the EU.

I’m proud to have joined a small but mighty organisation focused on working with care and generosity to have meaningful social impact.

— Emily Macaulay (she/her), Head of Delivery and Operations

Openness

We aim to work in the open. Our strategic roadmap for 2023-24 reflected on what we had learned in our first year and set out our intentions for this year. We continue to share our own resources and amplify those of others; write up the events we attend and organise; and regularly blog on topics we’re exploring. We also produce weeknotes which give a behind-the-scenes perspective on our work.

Diversity, equity and inclusion

Our work is founded on an awareness of power, privilege and pluralism inspired by feminism, intersectionality, and anti-colonialism. We continue, as individuals and collectively, to challenge ourselves to improve our understanding and implementation of inclusive practice.

Over the last year we’ve focused on taking the time to understand individuals’ needs and prioritising time, effort and resources to meet those needs, with the aim of enabling everyone to be themselves and engage equitably. Examples have included meeting panel participants arriving on public transport, providing specific meal options, funding additional accommodation for those disproportionately negatively impacted by travel, providing information (such as meeting agendas) in advance, and providing 1:1 support to address low digital literacy.

From the inclusivity of the recruitment process, to the warm welcome, and a great year working with different communities here at Connected by Data, I have loved working at an organisation that lives by the values it espouses. I have learned about the wealth of issues which communities are engaging with around data and AI, and the innovative and inclusive ways in which they are challenging power.

— Helena Hollis (she/her), Former Field Building Lead

Learning

We are committed to continuously learning about what works and what doesn’t, and how to improve our impact. During 2023-24 we learned:

  • AI is where the conversation is at. We know that non-AI uses of data have just as much influence on people’s lives, and indeed that “AI” is used as a byword for a broad set of technologies. Nevertheless, the public and political conversation is focused on AI.
  • There is no AI governance without data governance. Notwithstanding that conversations often open with AI, governance of data remains fundamental, and there is growing recognition that AI governance can only be delivered through robust data governance.
  • We need to build collective power. There is a fundamental difference between consultation to understand public attitudes and opinion, and interventions that democratise power over data and AI. Our orientation needs to be the latter; this means paying attention to civil society empowerment as well as public participation.
  • How we work matters. We are committed to demonstrating care, generosity, agility, and openness. We’ve also seen the value of showing leadership to unite our community and amplify our joint interests, such as through the Data and AI Civil Society Network and the open letter we initiated in partnership with the TUC and ORG around the AI Safety Summit.
  • Practical demonstrations are useful. We co-convened events such as the Gloucestershire Data Day and the People’s Panel on AI that have proved to be useful examples to illustrate the importance and practicality of what we are advocating for.
  • Diverse backgrounds make for the best conversations. As we have convened people during 2023-24 through our series of Connected Conversations and Design Labs, we’ve seen the most valuable conversations arising from bringing people together with experience outside data and AI, such as learning from climate campaigners.

We’ve taken these lessons into our plans for next year, which you can learn more about in our Strategic Roadmap 2024-25.

Governance

We are a non-profit company limited by guarantee, with charitable objectives and an asset lock. Our Governing Board is chaired by Karin Christiansen, with Non-Executive Directors Louise Crow, Karien Bezuidenhout and Hera Hussain. Their Terms of Reference and meeting minutes are published on our website.

In February 2024, we updated our articles of association to be explicit about our not-for-profit operating model, thanks to pro-bono support from Hogan Lovells’ social enterprise lawyer training programme (HL BaSE).

Funders

Thanks to our funders for their generous support during this year.

Finances

Our full financial statements have been filed with Companies House and are summarised below.

                                      2023/24   2022/23
  Income                                  611       325
    Unrestricted grants                   376       264
    Restricted grants                     186        15
    Sales                                  38        46
  Costs                                   604       315
    Staff costs                           508       255
    Other costs                            94        58
    Corporation tax                         2         2
  Capital and reserves (cumulative)        17        10
  Accrued income                          224       500
