Connected Conversation: Sharing progress on resources for deliberation on data & AI governance
On 22nd-24th November 2023, CONNECTED BY DATA and Iswe convened an event to create a toolkit for distributed deliberation on Artificial Intelligence. Since then, work has continued on a series of resources and projects to support practitioners in making the case for, and scaling delivery of, public deliberation on how AI impacts our lives.
This Connected Conversation was an opportunity for participants of that event, and others, to share progress on that work and related activities.
After a short welcome, Tim Davies invited Johnny Stormonth-Darling from Iswe to share progress on “The AI Deliberation Incentive Map”. This group from the November event addressed the challenge faced by institutions and organisations that encounter constraints and barriers to adopting participation in AI. The group had been mapping the incentives for deliberation by identifying the questions relevant actors may have under different themes. This work has become a large matrix, and following some further iterations the hope is to test it with stakeholders. Future funding would enable primary research into the incentives for conducting public engagement processes around AI.
Tim Davies then spoke about the work he, Emily Rempel and Tania Duarte have been progressing to create a pathfinder resource for designing deliberation on AI. This workstream evolved from a recognition that there will be groups of people who are experts in data and AI but not in deliberation, and similarly groups of deliberation practitioners who do not yet understand the data and AI context. The group quickly identified that they did not wish to create “yet another” toolkit, but rather a signpost to a range of quality resources already created and available, with the addition of some data and AI context.
Helena Hollis, Octavia Reeve, Diane Beddoes and Anna Colom have been working on a funding bid for a research project about empowering publics to co-design a positive future living with AI. Helena spoke about how this work sought to address the concern that, while there has been an increase in participatory research approaches on AI, these have been limited in a number of ways, e.g. narrow in scope, UX-focused, or missing key populations (such as young people). Their work also recognises that participatory research is perceived to be hard to fit into policy decision making because of the scale and time required. The resultant proposal is for federated peer research that provides local insights into people’s lived experiences of AI, which can also be combined into aggregated findings to inform policymakers. The local groups would also be linked into a network so participants can continue to engage beyond a one-off project engagement. This group would be particularly interested to chat with anyone interested in piloting (and/or funding) this proposal.
The final presentation was made by Giles Lane and Anna Beckett about ‘Visions for AI’. Giles talked about ‘The People’s AI Stewardship Summit’, which had brought together a mix of diverse voices: the public, industry, policymakers, and academics. The aim was to listen to what the people of Belfast want from AI, ensuring their preferences are heard as we shape the future of these technologies. The participants ended with seven recommendations:
- AI should bring people joy
- AI could make us feel safer
- AI could deliver health benefits
- AI must be accurate and reliable
- AI should be regulated globally
- AI must be accessible to everyone
- AI could make humans better
Giles and Anna shared insight into how having a mix of voices in the same room had helped build learning. The event was hosted by the Royal Academy of Engineering, and direct engagement with engineers and business hub representatives led to some great conclusions, which are being used internally and externally to inform policy and thinking about how engineers develop and deploy AI. The next steps for this work include running a series of sessions around the UK and supporting efforts to get local public voice into the incubator and accelerator space.
Open floor
Tim then opened the floor for any questions prompted by the presentations, for anyone wishing to share their related work, and for any related reflections.
Tim started by noting that the peer research proposal was a really interesting prompt to think about where peer-research might fit into the work CONNECTED BY DATA are doing on scoping options for global deliberation on AI, specifically how can peer-research be part of setting agendas?
It was reflected that the discussions suggested a scepticism about the deliberative wave, and that everyone, in different ways, was thinking about the question “how do we encourage / persuade / enable policymakers and industry to systematically involve public voices in decision-making around AI?”. Relatedly, it was mused whether a set of standards, rather than prescriptive methods, would be more effective in influencing non-users to start embracing participatory methods. It was noted that the Data Protection and Digital Information Bill drafted by the Conservative Government would not make it through “wash up”, and as a result a new Government (post General Election) would find itself needing to address data (protection) reform. This could present an opportunity for those in this Connected Conversation interested in seeing public voice (of those being impacted by decisions about AI) become a requirement in new legislation.
There was discussion reflecting on Giles and Anna’s input about the approach the Royal Academy of Engineering is taking. It was noted that building public voice into accelerators is a really interesting approach, and it was wondered whether this could link to the “regulatory sandboxes” that a lot of people advocate for.
Other examples of good practice were shared including:
- Sciencewise, which co-funds public sector organisations working on science and technology questions to encourage them to engage the public (often using deliberative methods to build a good understanding).
- Patterns in Practice, which runs sessions with people from different communities, asking them to speak about opportunities and risks around AI. One outcome highlighted the challenges of AI not being able to handle British Sign Language and the impact that may have on the D/deaf community.
- CONNECTED BY DATA are working with Iswe to develop proposals around a Global Citizens Assembly and are holding a workshop in Brussels soon. More information to follow on this work.
- Iswe are also working on an assembly platform providing information and framing around a range of issues (beginning with climate) that could extend (following testing and bedding in) to AI. It will allow assemblies around the world to connect on topics and share conversations and deliberative outputs.
- Ada Lovelace Institute are working with the Digital Good Network and Reema Patel, revisiting the 2021 framework of participatory data stewardship with a focus on participation and inclusion.
- WeAndAI have recently finished a project (report to follow) through which they spoke to representatives of vulnerable communities that were already heavily using generative AI. There will be a JRF-supported webinar sharing more on this in mid June.
It was noted that Public Voices in AI, funded by RAI UK and a partner of the Digital Good Network, will be opening a fund in late May and are holding a launch event for the project in early June.
The Design Lab that had prompted much of the work discussed in this Connected Conversation was held at Hawkwood College, who are very supportive of developing understanding around this work and AI. Alicia, their CEO, spoke of their newly launched Fellowship Programme and welcomed people to contact her if they were interested in more information.
Tim closed the discussion with a round of ‘check outs’ from those that wished to share what they were taking on and an invitation to join the CONNECTED BY DATA Discord Channel on ‘deliberation on data and AI’. Anyone interested can email emily@connectedbydata.org to receive the link.