What Should Change in the Data Protection and Digital Information Bill?

Jonathan Smith

On 30th September 2022, CONNECTED BY DATA organised a civil society workshop to explore the Data Protection and Digital Information Bill.

As we await the second reading of the Data Protection and Digital Information Bill, many civil society organisations hold different concerns and interests, and see different opportunities and risks.

Here at CONNECTED BY DATA, for example, we think the Bill presents a real opportunity to influence how data is democratically and participatively governed, to ensure it works for us all, as part of the UK’s post-GDPR future. However, we also fear that, in its present form, the Bill reduces individual citizens’ control and influence, removes transparency and key safeguards, limits opportunities for engagement and ignores evidence about the collective and equality impacts of data processing and AI.

The workshop brought together people from over thirty civil society organisations to share information, understand what these concerns and opportunities might be, and discuss how we might work together to influence the Bill and manifestos for the next general election. Together, we explored eight areas of concern:

Our data adequacy with the EU

To protect their citizens (and business interests!), countries have rules in place that limit when and how personal data can flow across national borders. Say an organisation in country A wants to access data about people in country B. If country B deems that data protection in country A is “adequate” – that is, that it meets the same minimum standard as country B’s own regime – then the organisation can simply transfer the data. If not, organisations in country A have to put extra legal measures in place to demonstrate they are protecting that data, which can be costly and time-consuming.
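To make that rule concrete, here is a minimal sketch in Python. The adequacy_decisions table and the outcomes are illustrative stand-ins only – real adequacy decisions are legal judgments, not lookups:

```python
# Toy model of the adequacy rule described above. The table below is
# illustrative only: real adequacy decisions are legal judgments,
# not lookups.
adequacy_decisions = {
    # (assessing country, assessed country): is the regime "adequate"?
    ("EU", "UK"): True,   # the EU's current positive decision for the UK
    ("EU", "US"): False,  # the EU does not consider the US adequate
}

def transfer_requirements(from_country: str, to_country: str) -> str:
    """What must an organisation do to move personal data
    from `from_country` to `to_country`?"""
    if adequacy_decisions.get((from_country, to_country), False):
        return "transfer freely"
    return "put extra legal safeguards in place (costly and time-consuming)"

print(transfer_requirements("EU", "UK"))  # transfer freely
print(transfer_requirements("EU", "US"))  # extra safeguards needed
```

The point of the sketch is simply that a single adequacy decision flips an entire class of transfers between “free” and “costly”.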

Countries within the EU are assumed to have data adequacy with each other because they share the same regulatory regime. Now the UK has left the EU, however, the EU gets to decide whether the UK’s data protection regime is adequate – up to the standard of the General Data Protection Regulation (GDPR) – and therefore whether businesses and other organisations can transfer personal data from the EU to the UK. One consequence of changes to the UK’s data protection regime could be that it loses data adequacy, which would put a whole set of additional requirements on businesses and other organisations in the UK.

The primary impact of losing adequacy would fall on UK businesses. The overall cost of losing data adequacy – based purely on additional legal and technical costs – has been estimated at £1 billion to £1.6 billion. There may be other costs to businesses, for example in reduced international trade, and to the economy as a whole (for example, if data-intensive businesses choose to locate in Europe to make it easier to access that larger market).

Losing data adequacy would have an impact on ordinary people too. Many of us have encountered US-based websites that we can no longer access since GDPR came into effect; this is because the US does not have data adequacy and some organisations respond to this by simply not allowing people from the EU or UK to access them.

Does the Bill threaten data adequacy? Possibly. The key areas of concern within the group were:

  • The UK’s own adequacy regime: For example, the EU does not think the US has an adequate data protection regime. If the UK grants the US data adequacy, then the EU would not consider the UK’s data protection regime adequate, because data could be transferred from the EU to the US via the UK without the additional protections the EU thinks are necessary.
  • Limitations to individual data rights: There are places in the Bill where individuals are losing data rights compared to GDPR, for example due to the redefinition of personal data, or the discouragement of subject access requests (where an individual can request the information an organisation holds on them) by labelling them as vexatious or excessive.
  • The list of “recognised legitimate interests”: This list effectively removes checks and balances around certain kinds of data processing, including for crime, safeguarding and in emergencies.
  • The role of the Secretary of State: In particular, the Bill gives the Secretary of State power to add things to the list of recognised legitimate interests through secondary legislation, which is rarely scrutinised by Parliament.

Ultimately, it is hard to tell whether the Bill will affect the UK’s data adequacy with the EU, because that decision rests with the EU (and may be partly political at the time it is made). But during its scrutiny of the Bill, Parliament should press the Government on its confidence in a continuing positive adequacy decision.

Business and growth

The ongoing cost of living crisis, and the recent market downturn following the Truss Government’s “mini-Budget”, mean that the impact of the Bill on business, productivity, and growth is important.

One of the stated aims of the Bill is to cut data protection compliance costs for business. The Government’s own impact assessment puts the overall economic benefits of the Bill at between £1.3 billion and £8.5 billion over 10 years, but if you dig into the rationale, many of those benefits do not accrue to businesses (for example, £7.5m-£153.3m/year is estimated to come from police officers not having to log justifications for access to personal data, compared to £32.6m-£155.6m/year arising from savings on business compliance costs). The group was generally sceptical about the estimates of business costs and benefits in that impact assessment.
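As a rough sanity check, the annual figures quoted above can be scaled to the impact assessment’s ten-year window and compared with the headline benefit. This is back-of-the-envelope arithmetic using only the numbers in this post, not the impact assessment’s own methodology:

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
# All values are in £ millions; each pair is a (low, high) estimate.
total_benefit_10yr = (1_300, 8_500)         # headline benefit: £1.3bn-£8.5bn over 10 years
police_logging_per_yr = (7.5, 153.3)        # police no longer logging access justifications
business_compliance_per_yr = (32.6, 155.6)  # savings on business compliance costs

# Scale the annual figures to the same 10-year window as the headline.
police_10yr = tuple(v * 10 for v in police_logging_per_yr)
business_10yr = tuple(v * 10 for v in business_compliance_per_yr)

print(f"Police logging savings, 10 years:      £{police_10yr[0]:,.0f}m-£{police_10yr[1]:,.0f}m")
print(f"Business compliance savings, 10 years: £{business_10yr[0]:,.0f}m-£{business_10yr[1]:,.0f}m")
print(f"Headline benefit claimed:              £{total_benefit_10yr[0]:,}m-£{total_benefit_10yr[1]:,}m")
```

On these numbers, the savings attributed to police record-keeping are of a similar order of magnitude to the savings attributed to business compliance, which supports the group’s point that much of the claimed benefit does not accrue to businesses.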

Members of the group doubted the Bill would bring business benefits, particularly for small businesses, and thought it could make things worse. In particular:

  • Businesses looking to trade with the EU, or that have EU citizens as customers, will still have to comply with the GDPR; in effect, the requirements of DPDIB will be layered over the GDPR rather than replacing it, adding to the compliance burden.
  • The powers given to the Executive to alter data protection legislation could give larger businesses the opportunity to lobby for legislative changes, but these are unlikely to favour smaller businesses.
  • These Executive powers lead to regulatory uncertainty, and there is also uncertainty about the EU’s data adequacy decision. Both these sources of uncertainty may lead to lower business confidence and investment.
  • Reducing regulation around data protection reduces trust in business use of data, which can reduce use of digital services and sharing of data.

It’s also worth noting that a large proportion of the estimated compliance cost savings for businesses arise from being able to refuse more subject access requests (where people ask for data that an organisation holds about them). But these are essential for people to be able to exercise their data rights (and diluting these rights could risk data adequacy with the EU).

The group also identified several areas of missed opportunities that could deliver business benefit, for example improvements to sharing of data for research purposes, and improvements to the effectiveness of the data portability right (whether as part of a Smart Data programme or outside it).

Overall, Parliament should scrutinise the claims of business benefits arising from the Bill, and examine which kinds of businesses are likely to see them.

Workers’ rights and the workplace

We are increasingly surveilled at work, and automated decision-making is used to make judgements about us during recruitment, appraisal, and beyond.

The TUC manifesto on Dignity at Work and the AI Revolution lays out several recommendations for how data and AI should be used in the workplace. USDAW – the UK’s fifth biggest trade union – has also recently published a manifesto addressing the impact of new technology, which discusses the collection and use of data.

The group highlighted that workers can be enthusiastic about the adoption of new technologies and the use of data:

  • if these uses are fair and non-discriminatory
  • if they improve, rather than damage, mental health and well-being
  • if workers are consulted and involved in their adoption

Areas in the Bill of particular concern to this group included:

  • The repeal of provisions encouraging employers to consult with employees when developing risk assessments around the collection and processing of data – encouragement the group wanted to see retained, but which the current Bill removes.
  • Places where the Bill makes it harder to access information, for example additional barriers around subject access requests, as these are frequently used by workers.
  • The use of automated decision-making, especially as this relates to decisions made about workers.

Science and research

Data is essential to science, innovation and research, and there are and have always been specific rules within data protection legislation that support its use.

The main area of concern for the group around research use of data is the broad definition of “research”. On the one hand, it is clear that useful research is not done only by academics – research is carried out by market research organisations, for example, or think tanks – but equally, the public tends to be concerned about commercial innovation using data to develop profit-generating applications, which might also be categorised as “research”.

There is a real risk that if the Bill encourages these grey area uses of data “for research”, and the public doesn’t view them as being for the public good, it will diminish trust in more obviously essential uses of data, such as national statistics or academic health research.

But pinning down good definitions of what “research” and “public good” actually mean is difficult. The group suggested that this could be defined through codes of practice rather than regulation, for example by the Office for Statistics Regulation for the generation of statistics, the Market Research Society for market research, or the Economic and Social Research Council (ESRC) for economic and social research.

Automated decision-making

Legislation around automated decision-making governs when decisions about people can be made using algorithms and AI, and what rights those people then have to object, ask for human review, or seek redress. In GDPR, this is done by Article 22.

Members of this group were fairly relieved that the Data Protection and Digital Information Bill hadn’t removed controls over automated decision-making altogether, but still had some significant concerns about the way in which Article 22 is rewritten in the Bill.

  • As written, the clauses are not sufficient to tackle harms arising from automated decision-making, including the use of unfair and discriminatory algorithms that widen inequalities. (It’s also worth noting that the potential for “Monitoring, detecting or correcting bias in relation to developing AI systems” to become a recognised legitimate interest, as proposed in the Data: a new direction consultation, is not reflected in the Bill.)
  • They also flip the original regulation in Article 22 on its head: instead of automated decision-making being prohibited except where it is safe, it is now permitted except in circumstances judged to be risky.
  • The Bill doesn’t make the distinction between data subjects (those who the data is about) and decision subjects (those who are affected by automated decision making). Sometimes these can be different (Jeni’s discussed this in her weeknotes).
  • The Bill puts particular protections in place around the use of special category data in algorithms, but recent EU case law has found that special category data can sometimes be derived from non-special category data, such as deriving someone’s sexuality from their spouse’s name, or race from the area where they live (a toy illustration of this kind of proxy inference appears after this list).
  • The Bill gives the Secretary of State the power to determine whether a given decision does or doesn’t have a “significant effect”. This introduces uncertainty for everyone involved.
  • It looks as though organisations could get around many of the protections the Bill does provide around automated decision making through a minimal inclusion of a person in the decision making process. The group discussed the limitations of ‘humans-in-the-loop’ as a mechanism for protecting against algorithmic harms, and thought restrictions should apply around automation in significant decisions even when people are involved.
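To illustrate that case-law point about derived data, here is a toy sketch of proxy inference. Everything in it – the postcode areas, the group labels, the records – is invented for illustration; the point is only that a protected attribute that is never collected at decision time can still be inferred from ordinary data:

```python
# Toy illustration of proxy inference: a protected attribute is never
# requested, yet can be guessed from ordinary data. The records below
# are entirely made up for illustration.
from collections import Counter

# Hypothetical training records: (postcode_area, self_reported_ethnicity)
records = [
    ("AB1", "group_x"), ("AB1", "group_x"), ("AB1", "group_y"),
    ("CD2", "group_y"), ("CD2", "group_y"), ("CD2", "group_y"),
]

def infer_ethnicity(postcode_area: str) -> str:
    """Guess the most common ethnicity for a postcode area.

    No special category data is asked for at inference time, but the
    output is effectively special category data all the same.
    """
    matches = [eth for pc, eth in records if pc == postcode_area]
    return Counter(matches).most_common(1)[0][0]

print(infer_ethnicity("CD2"))  # "group_y" - derived, not collected
```

A system like this never stores or requests special category data about the person being assessed, yet its output is effectively that data – which is why protections keyed only to the presence of special category data in the inputs can miss the harm.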

The group also discussed some of the regulations they wanted to see around automated decision-making, including:

  • a requirement to notify people when decisions about them are made in a fully or partially automated way
  • rights of appeal, redress and (human) review, and a requirement to publish information about such appeals
  • a meaningful approach to transparency and accountability, such as through independent algorithmic audits
  • rights to face-to-face human engagement around certain decisions

Digital identity

Digital verification services, as they are known in the Bill, are services that verify something about somebody, which usually requires checking their identity in some way. Examples include checks on a person’s right to work or right to rent.

Part 2 of the Bill focuses on these digital verification services. This legislation is the result of the Digital Identity and Attributes consultation. It will put into law the digital identity and trust framework developed by DCMS (not to be confused with GDS’s digital identity solution for government). The Bill also defines a register of organisations that provide accredited digital verification services.

The group had a number of concerns and criticisms about the Bill:

  • The Home Office, which runs some important digital verification services (not least those around the right to work and right to rent), has yet to be accredited under DCMS’s framework.
  • There is some scepticism about whether people will understand and accurately assess the trustworthiness of accredited (and non-accredited) services. Trust marks are only useful if people know not to trust services without them.
  • It’s not clear how and when anyone will check that services continue to adhere to the framework after they’ve been accredited.
  • The governance of the scheme isn’t specified in the Bill – the Government have announced that they will be setting up an Office for Digital Identity and Attributes but this isn’t mentioned in the Bill.

Overall, the group were concerned that the scheme will have a similar impact to the cookie law, with no one satisfied with the outcome.

Democratisation and participation in data governance

There aren’t many rights around democratisation and increasing participation in data governance in existing law. But it is an area that the group was interested in developing further. The group briefly touched on:

  • the Bill’s removal of the right to consultation during the development of a privacy impact assessment (now risk assessment)
  • the opportunity to re-introduce GDPR’s Article 80(2), which allows for organisations to represent the collective interests of data subjects, but was not transposed into the UK GDPR

One side of greater public participation in decisions about data is transparency. The group discussed journalistic exemptions that enable reuse of data, noting that the ICO currently has an open consultation on journalistic use of data.

The group also discussed whether requiring or encouraging public engagement when carrying out legitimate interests balancing tests might be worth arguing for in amendments to the Bill. These tests are currently carried out by the organisations that want to do the data processing – hardly independent arbiters of whether their requirements outweigh data subject rights and interests. There were concerns about making consultation with data subjects meaningful and proportionate – recognising that small organisations would not be able to afford large deliberative exercises – but the group also noted examples from research settings, where public involvement in research ethics boards is more standard practice.

The independence of the Information Commission(er) and power of the Government

One of the major concerns raised in responses to the Data: a new direction consultation was the impact on the independence of the Information Commissioner’s Office (ICO), which is responsible for regulating data and information under several pieces of law. Fortunately, the changes in the Bill do not have as broad an impact as those that were consulted on. However, there was still concern within the group that the ICO is being turned into an economic regulator rather than one focused on people’s rights.

As was touched on in other discussions, there was far greater concern about the multiple “Henry VIII clauses”, which give the Secretary of State powers to amend data protection law through secondary legislation with minimal Parliamentary scrutiny. These give the Secretary of State power to, for example:

  • add new “recognised legitimate interests”, which are essentially new circumstances or purposes for which processing data is lawful
  • determine whether or not a given use of automated decision-making has a significant effect, determining the level of regulation around it
  • add new circumstances where data originally collected for one purpose can be reused for a new purpose (purpose limitation)

While these make it easy to adapt legislation to a changing technological environment, they also make it easy to do so for politically motivated reasons, and without proper thought or consultation. This introduces greater risks of unintended consequences, and increases uncertainty about the future state of the law, which has impacts on business confidence and investment, particularly as future changes may affect the EU’s data adequacy decision.

The group recognised, however, that the use of Henry VIII powers is a more general problem, not one limited to this Bill.
