Workshop on Practical GeoAI Ethics

Jeni Tennison

Jeni spoke at the workshop, describing some of the ways in which organisations are bringing the public into the processes of data governance.

Mine was one of four short talks; two of the others are described below.

The talk by PLACE was particularly interesting. They run a data club which they’re in the process of constituting and characterising as a data trust. The data club holds geolocated aerial and street level imagery (not lidar) from urban and peri-urban (not rural) areas in Global South countries. They make deals with relevant governments to collect the data; those governments own the data, but grant PLACE a licence to include it in their data club. Members of the club are able to analyse and use the imagery to create derived data products; commercial organisations pay larger fees and can keep the results private; non-commercial organisations pay lower membership fees but have to publish their derived data openly. All members have to sign up to the Locus Charter which commits them to certain ethical behaviours around the data.

This is a tough kind of initiative to do well, particularly because of:

  • the built-in power differential between a Global North institution with Global North members holding data about Global South countries
  • the understandable urge to be self-sustaining based on fees from members, which will introduce conflicts of interest between funding and ethical use of data
  • the equally understandable focus on urban data, and how this will lead to biases
  • the tensions around seeking greater coverage and working with governments who might not always act in the best interests of all their citizens and residents
  • how (aerial) imagery is about communities but is rarely captured with the knowledge or consent of those communities, or within their control

There’s not a huge amount of detail on the PLACE website, in particular about its governance structures and processes. Those details matter, I think. For example:

  • What are the exact terms and conditions around governmental use of this data? Are they able to share it with other organisations, or even open it up freely?
  • How are those governments involved in PLACE’s governance on an ongoing basis? Are they part of the boards that decide who can use data about their countries?
  • How are citizens and communities consulted or involved in data collection?

The last of these is the one I’m most exercised about. From the discussion, it sounds as if there is no consultation process, and that the government is taken to sufficiently represent citizen and community interests. I’m quite sceptical about whether that is a correct assumption.

Ruth’s talk on public participation in collecting data for research was also interesting. She discussed a particular case study in which she wanted to use data from TwinsUK, specifically data generated by tracking the locations of, and various environmental readings (eg air pollution) around, twins who are part of that cohort. Part of the ethics process includes discussions with a Volunteer Advisory Panel (VAP) about the research. Ruth said the consultation not only challenged her assumptions about the length of time participants would be willing to be tracked (two weeks rather than the one week she’d assumed) but also improved the research design in other ways. She described how volunteers wanted data collected about them to be used to enhance research, felt empowered by helping to shape that research, and trusted TwinsUK researchers more than other researchers because of this closer engagement.

My own talk focused on public participation in governance, including in non-research settings. Aside from the notes on the slides, the main question I was asked was whether “do no harm” (the third of the Locus Charter principles) was meaningful. I said (pretty firmly) that it wasn’t, because every use of data and AI has some winners and losers. You will be doing harm, either directly, or indirectly by providing benefits to some people over others. So it’s a matter of choosing who benefits most from your interventions, and aiming towards justice.