AI and data are everywhere in our political discourse and our everyday lives. Shaped properly, these new technologies could deliver real public benefit, with better services and better policies allowing us all to live better lives. But power over AI and data is currently concentrated in a small number of large companies working in their own interests, apparently untouched by questions of democratic accountability, while politicians are easily seduced by hype or slowed by panic.
With governments increasingly using data and AI to make life-changing decisions and big tech setting the rules and parameters of our digital lives, it is critical to bring democracy to the debate: we need greater transparency and openness about how these decisions are made, rights and other protections to ensure they are not harmful, and participation and deliberation to make them work in the democratic public interest. Government needs to demonstrate trustworthiness in its use of such technologies; failing to do so risks disenfranchisement, alienation from the political system and a further loss of trust in it. Who gets to decide our AI future is a question of critical democratic importance.
We want to see:
- Progressive political parties heading into the next election with progressive policies on AI and data in their manifestos and post-election plans
- The Data Protection and Digital Information (No 2) Bill either heavily amended or blocked
- A strong and diverse network of civil society organisations and research and academic institutions supporting and scrutinising progressive policymakers on AI and data
Data Protection and Digital Information Bill: what happened at Lords Second Reading?
On Tuesday 19 December 2023 – the last day before their Christmas break – the House of Lords got to debate the Data Protection and Digital Information Bill for the first time.
This is known as Second Reading – First Reading is a formality, where the Bill is introduced to the House after moving through the Commons, and it will next go to Committee stage (for detailed scrutiny), then Report Stage and Third Reading (for amendments and further debate in the Chamber).
Below is a brief summary of the debate (starting with a table of contents). You can also read the debate in full, read my live tweeting, watch the debate or check out our page of resources on the Bill and its parliamentary passage.
- Data adequacy with the EU
- Business impact
- Length and complexity
- Lack of scrutiny
- Children’s data rights
- Coroner access to data
- Weakening of individual rights
- Collective data rights and use
- Importance of trust
- DWP access to bank account data
- Secretary of State powers
- Henry VIII clauses
- Artificial Intelligence
- Health data
- Expanded powers for police and security services
- The Information Commission
- Abolition of the Biometrics and Surveillance Camera Commissioner
- Cookie consent
- Direct marketing
- Digital identity
- Smart data
- Democratic engagement
- Death registration
- Work to do
Opening for the government, AI minister Viscount Camrose said the proposed reforms to the UK’s existing data protection regime (including the General Data Protection Regulation, or GDPR) ‘deliver on the Government’s promise to use the opportunity afforded to us by leaving the European Union to create a new and improved UK data rights regime’. He ran through the five parts of the Bill, arguing it would unlock the potential of data ‘not only for businesses but for people going about their everyday lives’, while ‘continu[ing] to maintain the highest standards of data protection that people rightly expect’; these are not ‘contradictory objectives’. He noted the Bill as it now stood – following some amendments tabled at Report Stage in the Commons (many of them very late and some of them controversial) – had doubled the estimated economic benefits, to £10.6bn over 10 years.
The Bill, according to Camrose, would ensure an organisation’s compliance obligations were directly related to the risk its data processing posed to individuals; modernise an already ‘world-leading independent regulator’, the ICO; allow ‘responsible automated decision-making’ with human intervention where requested; seize billions of pounds in ‘the booming global data-driven trade’; ‘allow people to control more of their data’; make digital identity easier and more secure; and improve data for the NHS and adult social care, law enforcement and national security. Camrose highlighted amendments on the National Underground Asset Register and coroner access to social media data from deceased children (acknowledging complaints from bereaved families that it did not go as far as promised, and pledging to listen to their arguments) – but touched only lightly on new DWP powers ‘to protect taxpayers’ money from falling into the hands of fraudsters’.
Opening for Labour, Lord (Jim) Knight – standing in for an unwell Baroness (Maggie) Jones – quoted his 2017 speech on the Data Protection Bill (later Act), about the ‘need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data’. That remained the vision for him – but the weakening of protections in this Bill would not achieve it. He felt the opportunity to strengthen the ICO to better regulate the use of data in AI had been missed, providing no new oversight of cutting-edge technologies (quoting the Ada Lovelace Institute); that the Bill weakened ‘the already limited safeguards’ around automated decision-making, especially in the workplace; quoted Stephen Cragg KC on the ability of the Secretary of State to add to the list of ‘legitimate interests’ without scrutiny; and further criticised the Bill for not empowering regulators for the government’s AI regulation plans, and for lost opportunities to build public trust in developing better technology and generate revenue for the taxpayer through data trusts.
Knight moved on to ‘broken promises and lessons not learned’, beginning with the coroner access to data promise, and expressing confidence that another issue carried over from the Online Safety Act discussions, around researchers being able to access data from social media platforms, would be actioned. Another failing was the Bill’s length and the late addition of amendments post-Commons scrutiny – ‘the more I read the Bill and various interpretations of its impact, the more I worry about it’. Businesses need to understand it – Knight asked the government to share its assessment of the Bill’s impact on EU adequacy. The Bill seemed designed to meet the government’s own interests, on changing direct marketing rules during elections and the DWP bank account powers (which, he noted, the ICO was not convinced by); he asked which other countries had such provisions, what consultation had been carried out, and whether HMRC already had such powers. Knight also wanted to see changes or clarity in other areas, including a clear definition of high-risk processing; not weakening subject access requests; more information on smart data schemes; the workability of nuisance call proposals; and appropriate consultation on changes to cookies. He said ‘a new data protection Bill is needed, but perhaps not this one… there is much to do.’
There was lots of praise for the many briefings peers had received. There was also a lot of agreement that current data regulation was ‘outdated and in urgent need of reform’ (Bishop of Southwell and Nottingham); Lord Kirkhope (Con), who had been involved in drafting GDPR, noted it was a significant achievement (‘done by Brits in Europe’) but ‘not without need for enhancement or improvement’.
But there was also widespread agreement that there were many problems with the Bill. Lord McNally (Lib Dem) said government rhetoric that the Bill balanced reducing burdens while maintaining high data protection standards had been ‘met with scepticism and outright criticism’. Baroness Bennett joked that the government had avoided its favourite word in presenting the reforms, ‘world-leading’. Concerns included:
Lord Allan (Lib Dem) said the ‘alarm bells ring’ on this and had three main concerns: that the core regime would be seen as too weak to protect EU data subjects, security-related surveillance measures would create an unacceptable risk and redress mechanisms would be ‘inaccessible or ineffective’. Lord Kirkhope urged ‘extreme caution’ around some of the nations the UK wanted data adequacy with, as they might not align with the EU. Baroness Young (Lab) thought the EU would be concerned about the excessive powers granted to the Secretary of State. The Lord Bishop of St. Albans noted briefings highlighting a divergence of data regimes did not ‘sound like less red tape, nor does it sound particularly pro-business’. Lord Kamall (Con) asked ‘Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?’ Lord Vaux (crossbench) was pleased the government’s impact assessment had picked up the ‘significant negative impacts on trade and on the costs of doing business’ losing adequacy would have, though found the projected costs ‘surprisingly low’ (contrasting it with a higher ‘conservative’ estimate from the New Economics Foundation and UCL European Institute). He wondered what changes had been made to the Bill to take account of EU concerns; ‘I have to assume a responsible Government must have had discussions with the EU’.
Lord de Clifford (crossbench) ‘feared’ that the Bill’s ‘complexity will mean that it will not be fully complied with by a number of small to medium-sized businesses that do not have the resources or time to research and instigate any changes that may be required’. The Bill could favour larger companies ‘whose ability to collect, process and monetise data is well known’, so new regulations should not require a lot of resource to implement. There should also be a publicity campaign.
Several other peers expressed concern that it was ‘a long and very complex Bill’ (Lord Vaux). Lord McNally (Lib Dem) felt ‘its scale and how it has been dealt with by the Government in its preparation, false starts and in the other place mean that we are going to legislate for myriad issues, each of which are of importance to the sector, the individual concerned or society and will require our full due care and attention’.
The Lords echoed concerns from the Commons on this. Lord Bassam (Lab) described the late amendments as ‘an affront to our parliamentary system’. Baroness Bennett (Green) noted the late amendments: ‘That really does not look like a codesigning process’ (the non-affiliated Baroness Uddin wanted an assurance that advocacy and civil rights organisations had been involved, as well as industry). Baroness Young also noted an ‘absence of scrutiny’ from the Joint Committee on Human Rights.
Baroness Kidron, a key figure in the Online Safety Act debates whose words were echoed by several peers, was concerned that ‘children have not been explicitly protected from changes that lessen user data protections in the Bill’. She hoped this was inadvertent and the age appropriate design code would not be weakened. She also called for an AI code for children and for the ICO to set standards on edtech, and expressed concern that the law might not be fully equipped to deal with AI models trained on generating child sexual abuse material. Lord Bassam (Labour) highlighted clause 2 allowing companies ‘to exploit children’s data for commercial purposes’, and the need for further safeguards. Another specific issue was…
Following discussions on the Online Safety Act, Baroness Kidron said the government had agreed to ‘create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child’, but were reneging on that to narrow the scope to suicide. Viscount Camrose said the government was firmly on the side of bereaved parents and would listen to this criticism (which it appears to have done).
Baroness Young was concerned the Bill ‘seems to tilt the balance of advantage to businesses and government authorities rather than to the individual’. This included not gripping the AI challenge: ‘it erodes further the already inadequate legal safeguards that should protect individuals from discrimination or disadvantage by AI systems making automated decisions’. Lord Kamall was one of those raising the weakening of subject access requests, citing his personal experience; the Bishop of Southwell and Nottingham ‘share[d] the concerns of many civil society groups that the Bill will reduce transparency by weakening the scope of subject access requests’.
Baroness Kidron hoped the Bill would take the ‘wonderful opportunity’ to put ‘data power in the hands of new entrants to the market, be they businesses or communities, by allowing the sharing of individual data rights and being able to assign data rights to third parties for agreed purposes’. She also agreed with Lord Knight that routes to accessing commercial datasets should be found for public interest research, rather than ‘embed[ding] the asymmetries of power’. Baroness Bennett underlined the importance of data being made available, to be used for the public good, inviting NGOs to think about community uses as well as just research. Baroness Young wanted the ICO to be able to consider class action complaints brought by civil society organisations or trade unions, while Baroness Uddin wanted greater protection for consumers. Lord Kamall wondered who ‘owned’ data – for example, that which we had willingly given to social media companies who were now monetising it for commercial purposes – and wanted more accountability.
The Bishop of Southwell and Nottingham said ‘it is worrying that, with some of the measures in the Bill, the Government seem to be reducing the levers and mechanisms that public trust depends upon’, and worried that protections were being weakened ‘to the extent that they will struggle to achieve their original purposes’ (referring to Public Law Project briefings). He quoted the chief executive of the Data Protection Officer Centre, who said ‘Whilst countries across the globe are implementing ever-more robust data protection legislation, the UK seems intent on going in the opposite direction and lowering standards’, and asked the government what reassurance it could give on maintaining public trust.
Lord Kamall (Con) shared Lord Knight’s concerns and asked for more clarity on the rules. Baroness Young thought it ‘a major intrusion into the privacy of pretty well all individuals in the UK’ – including most people in the Lords – ‘and, to some extent, an infringement on the confidential relationship that you ought to be able to expect between a bank and its customer’. Lord Vaux thought the powers ‘draconian’. The Bishop of Southwell and Nottingham wondered to what extent ‘the state demanding data without cause’ complied with the Nolan principles. Lord Sikka’s (Lab) speech was largely given over to this issue, criticising the government for making Orwell’s 1984 a reality, and questioning the expected saving given conflicting numbers. Baroness Bennett wondered where the powers to tackle economic crime were. Lord Davies (Lab) said UK Finance had criticised the powers – they weren’t part of the economic crime plan or fraud strategy, should be more narrowly focused and should not leave vulnerable customers disadvantaged. Lord Bassam (Lab) thought the powers covered 40% of the population, questioned the government’s ‘real need here’ and lack of consultation when the idea had been around for so long, and asked: ‘When were the Government planning to tell the citizens of this country that they were planning to take this new set of powers to look into their accounts?’
Lord Kamall thought ‘they lack the necessary scrutiny and safeguards’. Baroness Young thought the EU might take the view that ‘the new powers of the Secretary of State to meddle with the objective and impartial functioning of the new Information Commission’ could threaten data adequacy with the EU.
Baroness Uddin said ‘Many fear that such extensive powers cannot possibly be for the public good’. Lord Bassam: ‘The Government’s answer to a lack of clarity in so many areas of the Bill is to build in huge, sweeping, Henry VIII powers. When reviewing the legislation recently, we managed to count more than 40 proposed statutory instruments. That is an immense amount of power in the hands of the Secretary of State. We do not believe that this is the right way to legislate on a Bill that is so fundamental to people’s lives and the future of our economy. We want to bring these powers back into play so that they have the appropriate level of parliamentary scrutiny.’
Baroness Young agreed with Lord Knight and other speakers that the Bill was a missed opportunity for gripping some of the challenges presented by AI. Lord Holmes (Con) noted that ‘AI is nothing without data’, and wanted clear and transparent labelling of the uses of AI, especially in the public sector. He also asked how much line-by-line analysis there had been to ensure coherence with the AI White Paper. Viscount Camrose acknowledged comments about a lack of AI regulation, but said that data protection legislation applies to the extent that personal data is involved.
The Lord Bishop of St. Albans worried about the new definition of ‘personal data’, including the ‘imprecise and troubling’ definition of being able to identify an individual by ‘reasonable means’, and some genetic data no longer being classed as ‘personal data’. Baroness Bennett referred to a briefing from Understanding Patient Data which highlighted such decisions would now be a subjective test applied by data controllers – she noted the existing ‘very grave concern in our community about the use of medical data, the possible loss of anonymity, and the reuse of data for commercial research’ and urged the minister to take a look.
Lord Davies thought it clear the Bill had been written with and for industry, and worried (building on Understanding Patient Data and a briefing from the BMA) about less clear and straightforward rules, increased risks of disclosure, negative impact on public attitudes, the loose definition of scientific research, and a fear that the new rules were in ‘grave danger’ of moving beyond the protection provided by the Caldicott principles.
The Bishop of St. Albans worried clauses 19 and 28 to 30 did not provide proper accountability and could give the police immunity, even if they had committed a crime – ‘an extraordinary amount of unchecked power’. The National AIDS Trust had highlighted police officers sharing an individual’s HIV status without their consent.
Lord Kamall worried about the Commission’s bandwidth, given all the powers it was being given. Baroness Young didn’t think its existing record was sparkling, and wanted improvements to the Bill to ensure clear statutory objectives and clear independence alongside the ability to scrutinise it. She noted: ‘In my experience, all too often, Governments plural, not just the current Government, establish watchdogs, then act surprised when they bark, and go and buy a muzzle. If the public are to have trust in our digital economy, we need a robust independent watchdog with teeth that government responds to’. Lord Kamall also wanted to understand government thinking on whether new powers, requiring people to attend interviews as part of an ICO investigation, were too Big Brother-ish.
Lord Kamall was concerned about whether the ICO would have the bandwidth to absorb these powers. Lord Vaux thought it ‘interesting’ Camrose had not mentioned this in his speech, and referred to a report by the Centre for Research into Information Surveillance and Privacy (CRISP) that said none of the government’s arguments that the functions would be covered elsewhere ‘bear robust scrutiny’. He had real worries about the misuse of surveillance powers.
Lord Kamall understood cookie consent banners had become annoying, but – as someone who had been in the European Parliament for those debates – noted they had been supported because users should be told what was being done with their information.
Lord Kamall said the industry was concerned that a recent case between Experian and the ICO had led to confusion about the use of the open electoral register for direct marketing purposes – the ICO interpretation would require a notification to every individual for every issue, affecting their ability to target customers, though there were clear concerns about privacy.
Lord Holmes thought this could be very positive, but wondered what areas were being looked at. Lord Clement-Jones (Lib Dem) wondered why the UK had chosen not to allow private sector digital ID systems to be used for access to services. Lord Kamall echoed some concerns from the Commons around digital exclusion.
Lords were positive about this – Lord de Clifford noted open banking had brought ‘immense efficiencies and security for small businesses’, though urged caution in extending such schemes so that open APIs could not be infiltrated or hacked. Lord Holmes thought there were ‘clearly extraordinary opportunities but they are not inevitabilities’, and that we need to understand consent mechanisms and enable citizens to understand that it is their data.
Lord McNally thought these new powers ‘may invite abuse. I put that mildly’, referring to a Financial Times article that morning. Baroness Bennett said the Bill would allow the Government to tear up long-standing campaign rules with new exemptions and safeguards being removed. She noted what the ICO had said during public consultation: ‘This is an area in which there are significant potential risks to people if any future policy is not implemented very carefully’. Baroness Uddin said the Direct Marketing Association thought this could be open to abuse. Lord Bassam said ‘These powers were opposed in the public consultation on the Bill; this is not what the public want. We have to wonder at the motives of the Government in trying to change these rules at such a late stage and with the minimum of scrutiny’.
Lord Arbuthnot (Con): ‘I have learned in relation to the Post Office scandal that the complexity of computers is such that nobody really fully understands exactly what programs will do, so it is absurd that there is still in law a presumption that computers will operate as they are intended to. I hope that noble Lords will be able to turn their minds to changing that in the relatively near future.’
The Bishop of Southwell and Nottingham commended the move from a paper to an electronic register, but noted the UK Commission on Bereavement had recommended more could be done to reduce the administrative burden on bereaved people (including notifying private companies).
Peers were agreed that there was ‘an enormous amount of work to do’ on the Bill (Baroness Bennett); ‘The Bill will need a lot of work, and there are hours and hours of happy fun in front of us’ (Baroness Young). They also thought the stakes were high: Lord Davies said ‘the Bill is consistently behind the curve, always fighting the last war… I am not convinced that sufficient thought has been given to how developments in digital technology require developments in how it is tackled in legislation’, and Lord Vaux that: ‘We must get this Bill right. If we do not, we risk substantial damage to the economy, businesses, individuals’ privacy rights—especially children—and even, as far as the surveillance elements go, to our status as a free and open democratic society.’
For the Lib Dems, Lord Clement-Jones noted that ‘not a single speaker failed to express concerns about the contents of the Bill… so much of the Bill seems almost entirely motivated by the Government’s desire to be divergent from the EU to get some kind of Brexit dividend’. He complained that ‘the Bill dilutes where it should strengthen the rights of data subjects’ and wanted a strengthening of article 22 to include partly as well as fully automated decisions; he worried the move away from data protection officers and data protection impact assessments ‘simply sets up a potential dual compliance system with less assurance—with what benefit?’ Given the ‘catalogue’ of problems, it was no surprise so many peers raised data adequacy; he hoped the minister ‘has some pretty good answers, because there is certainly a considerable degree of concern’. He was concerned about biometrics (which need new legislation) and DWP powers (‘entirely disproportionate and we will be strongly opposing them’). There was a ‘whole series of missed opportunities’ to ‘create ethical, transparent and safe standards for AI systems’ and for greater imagination around new models of personal data control. The Bill is ‘a dangerous distraction. It waters down rights, it is a huge risk to data adequacy, it is wrong in many areas and it is a great missed opportunity in many others.’ The Bill ‘appears to have very few friends around the House’ and should expect ‘to have a pretty rough passage’.
For Labour, Lord Bassam concluded that ‘The range of concerns raised is a good indication of the complexity of this Bill and the issues which will keep us pretty busy in Committee’. While supporting ‘the principle of modernising data protection legislation and making it suitable for a rapidly changing technological landscape’, the Bill was ‘a missed opportunity to grasp the challenges in front of us. It tinkers rather than reforms, it fails to offer a new direction and it fails to capitalise on the positive opportunities the use of data affords, including making data work for the wider social good’. He agreed with Lord Holmes that it did not treat our data with respect. He worried that all too soon we would be back with a No. 3 Bill to ‘address the more profound issues at the frontier of data use’. Rights were being ‘watered down or compromised’, and the ‘widespread fear of machines making fundamental decisions about our lives with no recourse to a human being to moderate the decision’ was felt more widely than by individual data subjects alone. The coroner access to data amendment had ‘broken the trust of bereaved parents who were expecting this issue to be resolved in the Bill’. All in all: ‘we regard the Bill as a disappointment that fails to harness the huge opportunities that data affords and to build in the appropriate safeguards. My noble friend Lord Knight put his finger on it well, at the front of the debate, when he said that we need a data protection Bill, but not this Bill… we look forward to a long and fruitful exchange with the Government over the coming months. This will be a Bill that challenges the Government’.
For the government, Viscount Camrose thanked peers for their ‘powerful and learned contributions to a fascinating and productive debate’. He underlined the support the Bill enjoyed from ‘both the ICO and industry groups’. It was ‘certainly not our intent’ that the Bill should fall short in protecting personal data rights, and he stressed that the fundamentals of GDPR remain at its heart. On democratic engagement, he said the government would always need to consult the ICO and other interested parties and gain parliamentary approval. The ICO remains independent. ‘The Government believe that our reforms are compatible with maintaining our data adequacy decisions from the EU’. They would listen to bereaved parents (and appear to have done so). Abolishing the Surveillance Camera Commissioner would end duplication and improve consistency. And on DWP powers – the department was reliant on legislation over 20 years old, parliamentary time was tight, and DWP had been working closely with industry to test the proposals.
The Bill goes to Lords Committee stage next, where the Lords will go through the Bill line-by-line. Dates have yet to be confirmed, but rumours suggest it will be late February.
The committee will be a ‘Grand Committee’. This means any member of the Lords is welcome to go along, but there are no votes – any changes to the Bill have to be agreed unanimously. If an amendment is not generally agreed, the member is expected to withdraw it and instead bring it at the next stage, Report Stage, when Lords will hear and vote on amendments. After that, it’s Third Reading in the Lords – a final opportunity for Lords to make amendments – and then back to the Commons for them to consider the Lords’ amendments. The Bill will then ‘ping pong’ between the two Houses until all amendments are agreed.
Committee will examine the Bill in the following order:
- Clauses 1 to 5: data protection definitions and lawfulness of processing
- Schedule 1: recognised legitimate interests
- Clause 6: purpose limitation
- Schedule 2: purpose limitation
- Clauses 7 to 14: processing and international law, and political opinions; data subjects’ rights; automated decision-making
- Schedule 3: automated decision-making
- Clauses 15 to 24: obligations of controllers and processors
- Schedule 4: obligations of controllers and processors
- Clause 25: transfers of data to third countries and international organisations
- Schedules 5 to 7: transfers of data to third countries
- Clauses 26 to 46: safeguards for processing for research; national security; intelligence services; the role of the Information Commissioner and enforcement
- Schedule 8: complaints
- Clauses 47 to 51: court procedures around subject access requests, Electronic Identification and Trust Services for Electronic Transactions Regulations; protection of prohibitions, restrictions and data subject’s rights; regulations under GDPR and minor amendments
- Schedule 9: data protection – minor amendments
- Clauses 52 to 117: digital verification services; customer data and business data; privacy and electronic communications
- Schedule 10: privacy and electronic communications
- Clauses 118 to 128: privacy and electronic communication – codes of conduct and consultation, trust services, information to improve public service delivery, law enforcement information-sharing agreements, information for social security services
- Schedule 11: power to require information for social security purposes
- Clauses 129 to 137: retention of information by providers of internet services, retention of biometric data, registers of births and deaths
- Schedule 12: registers of births and deaths
- Clause 138: National Underground Asset Register
- Schedule 13: National Underground Asset Register
- Clauses 139 to 142: National Underground Asset Register, information standards for health and social care
- Schedule 14: information standards for health and adult social care
- Clause 143: the Information Commission
- Schedule 15: the Information Commission
- Clauses 144 to 157: the Information Commission; final provisions
Almost every modern policy reform relies on new data systems, and increasingly involves technologies labelled as Artificial Intelligence (AI) to make and shape decisions – whether in education, health, benefits, housing, social care, work, policing, democratic engagement or a multitude of other areas.