Connected Conversation: Who is at the table? How are public voices currently heard in the governance of data and AI?
Last week, as part of our Connected Conversations series, we brought together a fantastic group of panellists and discussants to look at how communities are gaining voice in the governance of data and AI in different settings around the world. The session focused on examples of efforts to engage affected communities in AI governance at different levels, internationally. This felt particularly important and interesting in the run-up to the UK AI Summit.
Our panellists for the session, sharing their experience to start the conversation, were:
- Richard Gevers: Open Cities Lab, Durban, South Africa
- Zoe Kahn: University of California, Berkeley, USA
- Safiya Husain: Karya, Bangalore, India
- Renee Sieber: McGill University, Montreal, Canada
The wide-ranging discussion, recounted in more depth below, touched on a number of themes, including:
- Problem-centred participation. Often engagement is framed around technologies rather than the services they are used within. Instead of inviting communities to the table to discuss technical questions, switching engagement around to start from tangible policy problems can open up new discussions and solutions.
- Literacy that empowers, not excludes. Building capacity for communities to shape decisions matters, but this should be about understanding how to influence change, rather than requiring people to reach some detailed level of data literacy before they can engage in discussions about data.
- Accessibility & inclusion. Paper-based resources can be used to hold meaningful, in-depth discussions on data with communities: engagement does not need to be high-tech. It’s important to tailor engagement to the communities affected.
- Ownership. Whether it is developing common ownership of data to give communities a stake in decision making, or actively working to build a sense of ownership over a process in other ways, moving up the ladder of participation requires conscious action to share power.
- Change. Bringing community voices to the table should be seen as part of a change process, whether that is a co-operative process of organisational change management to develop better digital services informed by community voice, or an outside campaign for change. Keeping the whole change journey in view is critical to understanding how community voice can, and should, operate.
To recount the session in more detail:
We started off with Richard, who highlighted the importance of starting with a problem statement, specifically a human-related problem that particular communities are facing, and allowing that to drive the decision-making and potential solutions, rather than launching into solutioning based on supply-side information. He gave a really interesting example from South Africa during the pandemic, where there was a need to combat the spread of Covid through handwashing. Whilst handwashing facilities existed, and local officials were being rewarded for ensuring these were located across the town and available to the community, many of them didn’t actually work. This meant that people living in informal settlements were struggling to access reliably functioning wash facilities. Richard’s team brought together data from the city’s service request tool, taught local community data collectors to gather data on where the facilities were and weren’t working, and combined these data sources to get the information back to citizens using WhatsApp, allowing them to find washing facilities that worked. They also made the data available to local officials and changed the success metric from the number of wash facilities available to the number of wash facilities that were functioning, highlighting the need to measure the right things.
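Purely as an illustration of the metric shift Richard described, here is a minimal sketch (hypothetical data and field names, not the project’s actual systems or code) that joins a list of facilities from a city service-request tool with community-collected working/not-working reports, and counts functioning facilities rather than installed ones.

```python
# Hypothetical sketch: combine city service-request data with community-collected
# status reports, and measure functioning facilities rather than installed ones.
# All names and values are illustrative, not taken from the actual project.

city_facilities = [
    {"id": "F1", "ward": "A"},
    {"id": "F2", "ward": "A"},
    {"id": "F3", "ward": "B"},
]

# Reports gathered by local community data collectors.
community_reports = [
    {"facility_id": "F1", "working": True},
    {"facility_id": "F2", "working": False},
    {"facility_id": "F3", "working": True},
]

# Index the community reports by facility so the two sources can be joined.
status = {r["facility_id"]: r["working"] for r in community_reports}

installed = len(city_facilities)
functioning = sum(1 for f in city_facilities if status.get(f["id"], False))

# The old success metric counted installed facilities; the revised one counts
# facilities that actually work.
print(f"Installed: {installed}, functioning: {functioning}")

# The per-facility view is what could then be shared back with residents
# (e.g. via WhatsApp) so they know which facilities to use.
working_facilities = [f["id"] for f in city_facilities if status.get(f["id"], False)]
print("Working facilities:", working_facilities)
```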
We then moved onto Zoe, who talked about her research in Togo on people’s understanding and experience of algorithms that use mobile phone metadata to determine the allocation of cash aid to people living in extreme poverty. She discussed how she had been using qualitative methods, particularly storytelling and an approach called “social impact scenarios”, to influence the design of systems, and how she used printed visuals as resources to support these conversations. Zoe showed the group some of the visuals she used to explain mobile phone metadata to people living in rural villages in Togo, including a map of Togo, cell phone towers, and basic feature phones. She pointed out that these kinds of visuals can be particularly useful because they are physical and tactile, don’t need connectivity, and enable participants to communicate their own ideas. She also talked about how she had been bringing together experts, including ‘experiential experts’, to improve the diversity of voices in the room when technologists and policymakers are making decisions about governance.
Safiya explained the interesting background to Karya, possibly the world’s first ethical data company, which creates economic opportunities for rural Indians by capturing, labelling and annotating data and providing workers with fairly compensated work. The data work that powers today’s algorithms is often done by workers, for example in India, who are underpaid and working in poor conditions: a market failure that needs addressing. Karya works through a crowdsourcing system and uses AI to do three things: make the work flexible (workers can log on whenever they want), pay the highest wages in the area (a minimum of five dollars an hour), and make the app inclusive for those who have low digital literacy or no access to the internet. Their approach redesigns data ownership and the role of workers: the data structure is non-exclusive, wherever possible the workers have ownership of the data, and the workers receive 100% of the ensuing royalties. The focus is on the humans behind the dataset.
Finally, Renee talked about her research project called “AI for the rest of us”, which has been about developing a new model of civic engagement in government decision-making processes that are being automated with artificial intelligence. She has been working at the urban/city level, thinking about issues of accountability, the punishment of offenders, and losses, in relation to the raw political power and accountabilities that officials’ use of AI brings.
These introductions highlighted themes such as: starting from the problem rather than the solution, the importance of bringing the right people together early, the importance of meaningful engagement, the role of workers, and sometimes needing to turn the table over entirely.
An attendee working in the humanitarian development sector highlighted that AI was already being used in the Global South by individuals to write essays and apply for humanitarian funding, and that the impact of, and protection against, this varied depending on each country’s data protection laws. He also noted that problems arise if organisations haven’t decided their own approach to AI internally, linking back to the importance of internal dialogue, capacity, and change management.
We had a brief discussion on whether data governance is fundamentally different to AI governance; whilst there wasn’t a consensus either way (some thought AI governance was a natural progression from data governance, others didn’t), it paved the way for a conversation about whether the more fundamental challenges of governance in general have yet been resolved. Richard highlighted that we haven’t really solved the question of whether people want to be involved at all, but we are discussing it in the context of AI, perhaps because of the risk, the sense of disempowerment, and the importance of understanding the value exchange.
Another theme that came out of the conversation was about building capacity for participation, and the degree to which people need to be data literate in order to meaningfully participate in AI governance and for their opinions to be heard and respected by others. This sparked quite an interesting debate. Initially, it was suggested that data literacy is essential for people to be able to proactively engage in conversations about data and AI use. This was countered by the suggestion that perhaps low data literacy is just an excuse for not sufficiently engaging with people, and that you can still talk about the concepts and processes without an underlying understanding of the more technical workings, for example using techniques like those Zoe had deployed in Togo. Going even further, could we leapfrog beyond literacy to power? However, this was again countered with the proposition that people aren’t able to challenge the questions they’re even being asked if they don’t have that underlying understanding, and therefore have to trust those asking the questions – whose questions get to shape our discussions? Perhaps what we’re talking about here is basic critical thinking skills.
This, unsurprisingly, morphed into a conversation about knowledge as power. Safiya highlighted that knowledge today sits within a systemically discriminatory system and is shared through power, raising the question of how we ensure that people have the basic knowledge and understanding of what they’re engaged in, and give them the confidence and space to have these discussions. She pointed to the approach to cooperatives and distributed ownership in India as a useful framework for considering these issues.
We also talked about the importance of improving policymakers’ literacy in data governance. Sometimes these activities are ‘outsourced’ by policymakers, and the methodologies are not well understood, so when the outputs come back and don’t align with expectations, they can be discredited. Getting more policymakers into the room, upskilled, and involved in participatory governance may help to successfully embed these approaches.
Finally, we concluded that:
- We need to ensure we aren’t leaving people behind, both in decision making and financially
- The principle of democratic governance is essential, regardless of whether data governance and AI governance are different
- The term ‘literacy’ can be alienating or misused, but the basic concept of empowering people to be able to be part of the conversation is important
- We can’t forget the links to power – addressing power might look different in different contexts, and there needs to be the opportunity for redress