Amid the hype and worry about the data-driven transformation of our world, there is often something conspicuously missing: personal stories. They ground our understanding that this change is not a remote future, but is a reality in progress that is affecting every relationship and interaction we have, as workers, family members, consumers and as citizens.
In an effort to address this and build on these case studies, Connected by Data worked with Mary Towers of the TUC’s AI project and affiliate unions to support three workers with direct experience of AI to be in conversation with MPs. On the 20th June, in the grand setting of Committee Room 9 of the House of Commons, a packed audience of trade unionists, policy professionals, politicians and journalists heard first hand how AI is affecting workers in the here and now.
Packed committee room for @The_TUC @ConnectedByData event in Parliament on workers' experiences of AI — powerful testimonies echoing recent @AdaLovelaceInst research on the need for greater transparency & human oversight when AI is deployed pic.twitter.com/NaZ0iNTazb
— Matt Davies (@halcyene) June 20, 2023
The worker experience of AI
Co-chaired by Connected by Data’s Jeni Tennison and the TUC’s Mary Towers, the aim of the event was to centre the worker experience of AI in a political and constructive debate.
The event kicked off with testimony from Garfield Hylton. Garfield, an Amazon worker and GMB member, has been involved in the UK’s first industrial action at Amazon, at the warehouse in Coventry. Garfield detailed the mundane but profound ways in which Amazon’s intricate systems of datafied management affect workers, bringing home that the dehumanising ways in which AI can be used in workplaces are a ‘here and now’ issue. For example, disciplinary and management practices are based on opaque and shifting performance targets, applied as a blanket: the systems make no meaningful adjustments for glitches in the warehouse processes, or for an individual’s circumstances, including injury or illness. Garfield’s testimony chimed with the British Academy’s Hetan Shah’s earlier observation that “the worry is not that we’re going to be taken over by robots, but that we’re treated like robots.”
For Luke Elgar, a Royal Mail worker and CWU national executive member, agreements between unions and employers can ensure AI and data-driven tech can be used “for the benefit of everyone, not just a whip for those who wield it”. Luke discussed the intensification of the micro-management of postal workers’ time and autonomy, and how this impacts on a diverse workforce. This includes the scandal that found its way to Parliament where Royal Mail bosses were excoriated for the previously denied surveillance of workers.
It was a privilege to speak on behalf of postal workers in the Houses of Commons this evening around AI and how it's used to track posties and dictate walking speeds.
We must enhance workers rights and strengthen the ability to negotiate on all issues as this issue grows. pic.twitter.com/VzUuYSfDGe
— Luke Elgar (@luke_elgar) June 20, 2023
While such AI can be seen as an extension of metric-based management of the labour force, generative AI has introduced a new dimension. Laurence Bouvard, a prominent voice-over artist and actress, outlined how people in her line of work are already losing out to generative AI. Laurence reminded the audience that “without training data AI does not exist”, and that the training data for large AI models is built on people’s voices and other creative work. Describing generative AI as “a parasite that feeds off its host”, Laurence explained how this exploitation - often without the knowledge, remuneration or rights of the original creators - is enabled by UK copyright laws that are decades old, dating from when the primary method of reproducing content was the cassette tape.
Moving from sentiment to substance
Most people will express sympathy for protecting workers during a technological upheaval. But when and how to give that sentiment legal and regulatory force was the key theme that ran through the evening.
Mick Whitley, Labour MP for Birkenhead, has been leading the development of an Artificial Intelligence (Regulation and Workers’ Rights) Private Member’s Bill. Starting from the premise that “these technologies can undermine many of our long held assumptions about workplace relations”, Mick seeks to propel “a much needed conversation on both sides of the Commons on alternative strategies to preserve the rights and dignity of workers”. This includes putting provisions for a “meaningful dialogue and collaboration between workers and management” on a statutory footing.
Damian Collins, the Conservative MP for Folkestone and Hythe, has ministerial experience of tech and digital policy, and a long run of work on the Online Safety Bill. Damian emphasised the key principles of AI explainability and accountability: “At the heart of all of this is the idea that the people who built these machines are responsible for how they work”, and there must be a ‘right to know’ when and how AI is being used at work.
With that in mind, Stephanie Peacock, MP for Barnsley and Shadow Minister for Media, Data and Digital Infrastructure, expressed concern “that the government perhaps doesn’t share the focus of ensuring that tech is harnessed for the public benefit”. Stephanie cited how amendments to the Data Protection and Digital Information (No 2) Bill (DPDI Bill) that sought to defend or extend provisions for accountability and transparency were voted down by Damian Collins and fellow Conservatives at the Committee stage of the Bill. Jeni Tennison commented that the Bill was the Cinderella of AI regulation: doing the actual work of protecting (or eroding) people’s data rights in the face of algorithmic decision making, while everyone else gets invited to flashy global summits.
The different political traditions and perspectives were further revealed by questions put to the panel by the workers. Garfield cited the commendable principles within the AI White Paper, but asserted that without legal force companies like Amazon have no reason to adopt better practice. Damian acknowledged that new legislation might be necessary, but said the question is whether we are “confident the protections that exist already can be applied to companies working in the AI environment”, including employment law.
Stephanie rebutted the ‘wait and see’ approach. In the context of workers’ and trade unions’ rights that “have been under sustained attack”, she argued, the existing legislative framework clearly cannot deliver the protections needed. Mick drew comparisons with the successful introduction of health and safety legislation, which drove economy-wide improvements, arguing that weak or voluntary measures could not be relied upon in such a critical domain.
Pleased to contribute to the AI at work panel organised by @ConnectedByData & @The_TUC with @MickWhitleyMP @DamianCollins
Thanks to the 3 workers who spoke powerfully about their experiences. Important discussion on ensuring workers’ rights keep pace with technological change. pic.twitter.com/CL0EURuKw5
— Stephanie Peacock (@Steph_Peacock) June 20, 2023
New technologies highlight age-old debates about the role of the state versus the market, and the balance of power between workers and employers. Luke’s question on how MPs might give force to the sentiment of rebalancing power in the workplace was particularly pertinent given the Strikes Bill that was before the Commons the next day. Luke’s question positioned unions as a key component of ensuring the risk and reward of technology is justly distributed.
Inspiration may be taken from the German model of ‘works councils’, which empowers unions - legally and with technical expertise - to negotiate tech and AI issues. Unions and civil society have been at the forefront of developing approaches to justly manage the AI transition: from the TUC’s AI project, to the NASUWT’s AI principles, Equity’s AI Toolkit and the IFoW’s Digital Information at Work Principles. These efforts represent a leading edge - grounded in experience - that provides a responsive and agile mechanism for shaping technology adoption, moving much more quickly than government or regulators.
As Connected by Data and others have argued, the regulatory system should be seen holistically. It should extend beyond state institutions to meaningful mechanisms to incorporate the voice, interests and perspectives of those affected by data and AI. Without this, innovation will be lopsided and serve narrow objectives.
As ever, making high-level and shared principles a reality is where politics, and choices, happen. Representing the original ‘gig economy’ workers, Laurence’s pointed question on what MPs will do to protect performing artists touched on both the granularity of the UK’s outmoded intellectual property regime and the value - broadly speaking - of art and creativity. Damian said, “we shouldn’t allow training data in AI to just take other people’s work and use it for free”. There is debate as to whether this type of property-based approach is feasible or desirable, in that it could entrench the power of current copyright holders, who are not necessarily the artists themselves. Either way, in data rights and other domains there is also a yawning gap between rights enshrined on paper and rights that can be effectively accessed and exercised in practice - whether through regulatory enforcement, employment relations or other channels of redress.
On the question of AI regulation broadly, Mick asserted that “you can’t govern AI without legislation”. While deviating on the means and timing, Damian agreed that “the lesson of dealing with the tech companies is that unless they are required to do it they won’t do it the way we want them to”. For Stephanie, the DPDI Bill “was a really missed opportunity” to fill in some gaps in a “very fragmented enforcement regime”.
Mary Towers wrapped up by setting out the ‘jigsaw puzzle of AI regulation’. “Unions have a unique offering to make”, she said, “through giving voice to the interests of workers during an industrial revolution, and as a vital collective counterbalance to purely corporate interests dominating the development of AI in society.”
‘Revolution’, ‘transformation’, ‘transition’ – however this dynamic is described, the underlying question is what interests and groups have influence and power to shape it, from the individual workplace to the highest levels of regulation.
To underscore that this is not a remote or abstract policy conundrum, Garfield concluded by laying out the wider context of an erosion of workers’ rights, and how that has enabled an environment where he is oppressively controlled every second of every day by data-driven technology.
“Let’s get back to basics,” he said. “Let’s have an equal say. If you respect us, we will respect you back.”
Connected by Data and the TUC would like to thank all the panellists and audience. This piece does not necessarily reflect the views of the organisations.
To discuss Connected By Data’s work further please contact:
Adam Cantwell-Corn at adam@connectedbydata.org
Alternatively, if you would like to speak to the TUC about the event or the TUC AI project, please contact:
Mary Towers at mtowers@tuc.org.uk