As the Data Protection and Digital Information Bill continues its erratic legislative journey, we convened an open call for interested parties to share, discuss and maybe act together. In this open Zoom meeting, Gavin Freeguard chaired an open discussion of first reactions and thoughts in a confidential and trusted space.
In a press release, the government highlighted a few areas where the Bill has changed – including commercial activity coming under the definition of scientific research, some reporting and redress requirements (such as around Data Protection Impact Assessments - DPIAs) being relaxed further, and some drafting changes around automated decision making. But on the whole, in structure and substance the Bill is close to the original. This means the issues raised at our online workshop on the Bill last September are still of concern, with some in civil society fearing the Bill has become even more deregulatory.
Michelle Donelan laid out the Government’s intentions with the Bill in a recent Conservative Home article, saying that this version has been ‘co-created with business’ after further consultation.
The general position of the (more than 30) civil society attendees was that the Bill requires either significant amendment, splitting up or scrapping.
Concerns raised about the changes to the Bill included:
- Changed definitions of scientific research to include research by private companies
- Reduced reporting and record keeping requirements for all businesses
- Limitation of DPIAs in various areas including automated decision making (ADM)
- The naming of Direct Marketing as a legitimate interest
- The governance of the Information Commissioner’s Office (ICO) by a board appointed by the Secretary of State.
Many of the previous concerns around the Bill still remain, including:
- Concerns about EU adequacy
- Questionable costing claims for business and growth
- Expanding how data can be used for scientific research
- Weakening rights around consultation and redress
- Automated decision making, including allowing it without human involvement unless the processing is deemed ‘high risk’
- The removal of legitimate interest balancing tests in a number of areas with the Secretary of State able to add to this list
- Weakening the independence of the ICO
- The impact assessment for the Bill quantifying only the costs and benefits for business, not those for citizens or communities.
There was also concern about how these changes would interact with:
- other legislation, including the Online Safety Bill and the AI white paper, potentially producing unintended and/or confusing consequences
- an ICO perceived as already weak on enforcement, combined with increased discretion for businesses to make potentially damaging decisions about data processing
Breakout groups
Breakout discussions explored three areas in further depth: automated decision making; the widening grounds of data use; and rights, reporting and redress. The following sections list the main points made and concerns raised by civil society attendees within each.
Automated decision making
- The default of allowing ADM without human involvement unless the processing is high risk could weaken protections, especially for workers who are at risk of harmful impacts but would have no opportunity for redress.
- Secretary of State powers to expand where ADM could be used and decide what is meant by ‘meaningful human intervention’ add to existing concerns.
- Changes around profiling seem to muddy the waters further. The requirements only apply when a significant decision is made without meaningful human involvement, but the Bill doesn’t define what constitutes a significant decision or meaningful human involvement.
- These provisions anticipate the AI white paper, but it is very likely the two won’t be joined up and AI regulation won’t be backed by legislation. Regulators will be using existing powers and competencies to regulate AI; the provisions in the Bill weaken one of the few areas of legislation covering ADM.
- Existing rights around ADM and human intervention probably aren’t used much already, and these changes strip even more away - how likely is it that anyone would actually be able to exercise their rights?
- The chasm between what’s in the legislation and what an individual can do already is quite a challenge. How could you understand how a decision has been made about you, and challenge it?
- Explainability for many of these systems is difficult.
- It would be good to include a clause that when Secretary of State powers are used, they must go to public consultation first.
- Should the Bill include a role for humans (including trade union reps) in the process? ADM can have a real impact on pay rates.
Widening grounds of data use
- The definition of statistics and research has widened. Widening the definition of statistics might be even more risky than widening that of research. This may also put national statistics at risk.
- How is the legitimate interest change different to existing practice? (That said, organisations don’t follow recommendations, and the regulator does not enforce them.)
- The Bill removes consideration of fundamental rights.
- The Bill reduces accountability mechanisms for data subjects to challenge processing with wider society impact.
- Scrutiny is needed of the new legitimate interest proposal, and of how the new Article 8A purpose limitation sits with Convention 108 Article 5 and its obligations on safeguards.
- The combination of changes could have a big impact on how organisations handle data, and this is a real worry - taken together, they mean a bigger hit and potential unexpected consequences.
Rights, reporting and redress
- Regulation and enforcement are already weak in practice, even within current legal frameworks. With a reduction in the ICO’s legal remit, we can expect a further decline in de facto enforcement.
- There is a concern that the Bill reduces standards on what data controllers might do.
- There is a concern that the Bill leaves too much to further and expansive discretion of the Secretary of State and the regulator.
- The Bill narrows the ICO’s strategic priorities, with a reduced focus on data protection in favour of innovation.
- The approach to impact assessments is another example of moving from high standards to discretionary judgments.
- There is a concern that companies or public bodies will take advantage of weakened data subject rights by claiming that Subject Access Requests are ‘vexatious’ as a justification for not complying with requests.