Giving us more information and controls is not fixing this problem

We want more information about data and more control over it

79%

±4%

of internet users in the US think tech companies do not provide them enough control over how information about their activities is tracked and used

Washington Post - Schar School

Results from a poll conducted Nov. 4-22, 2021, among a random national sample of US households, resulting in 1,122 adults including 1,058 internet users.

Companies tell us we own our data – they know we won’t use the controls we are given


Companies like to tell us that they “care” about our privacy or that our “privacy is important” to them, but the truth is that tech companies systematically co-opt both their employees and the law so that everyone – even those who consider themselves privacy advocates—and everything they do – even tasks that seem privacy protective – all end up serving their employers’ data extractive needs in the end.

Despite years of heavy criticism, privacy self-management (i.e., the principle that people individually manage their privacy via notice and choice) remains the standard of privacy protection throughout the Western world. Building on previous research, this article provides an overview and classification of the manifold obstacles that render privacy self-management largely useless in practice. People’s privacy choices are typically irrational, involuntary and/or circumventable due to human limitations, corporate tricks, legal loopholes and the complexities of modern data processing. Moreover, the self-management approach ignores the consequences that individual privacy choices have on other people and society at large. Regarding future research, we argue that the focus should not be on whether privacy self-management can be fixed by making it more user-friendly or efficient – it cannot. The concept is based on fundamentally wrong assumptions. To meaningfully address the potentials and dangers of personal data processing in the 21st century, a shift away from relying purely on individual control is inevitable. We discuss potential ways forward, stressing the need for government intervention to regulate the social impact of personal data processing.

Data use is complex – it is impossible to understand all the implications of our individual choices


Part of the problem with the ideal of individualized informed consent is that it assumes companies have the ability to inform us about the risks we are consenting to. They don’t. Strava surely did not intend to reveal the GPS coordinates of a possible Central Intelligence Agency annex in Mogadishu, Somalia — but it may have done just that. Even if all technology companies meant well and acted in good faith, they would not be in a position to let you know what exactly you were signing up for.

Because of increased technological complexities and multiple data-exploiting business practices, it is hard for consumers to gain control over their own personal data. Therefore, individual control over personal data has become an important subject in European privacy law. Compared to its predecessor, the General Data Protection Regulation (GDPR) addresses the need for more individual control over personal data more explicitly. With the introduction of several new principles that seem to empower individuals in gaining more control over their data, its changes relative to its predecessors are substantial. It appears, however, that, to increase individual control, data protection law relies on certain assumptions about human decision making. In this work, we challenge these assumptions and describe the actual mechanisms of human decision making in a personal data context. Further, we analyse the extent to which new provisions in the GDPR effectively enhance individual control through a behavioural lens. To guide our analysis, we identify three stages of data processing in the data economy: (1) the information receiving stage, (2) the approval and primary use stage, and (3) the secondary use (reuse) stage. For each stage, we identify the pitfalls of human decision-making that typically emerge and form a threat to individual control. Further, we discuss how the GDPR addresses these threats by means of several legal provisions. Finally, keeping in mind the pitfalls in human decision-making, we assess how effective the new legal provisions are in enhancing individual control. We end by concluding that these legal instruments seem to have made a step towards more individual control, but some threats to individual control remain entrenched in the GDPR.

Data from sensors and satellites is not personal data, but it is still about us and our communities

Our homes

Our neighbourhoods

Our land

Our electricity use

Our waste

Our connectivity

Data’s wider impact on our societies is a collective problem – it needs collective solutions

64%

±3.5%

of people in the US think the government should do more to regulate how internet companies handle privacy issues

Washington Post - Schar School

Results from a poll conducted Nov. 4-22, 2021, among a random national sample of US households, resulting in 1,122 adults including 1,058 internet users.


The harm to any one individual in a group that results from a violation of privacy rights might be relatively small or hard to pin down, but the harm to the group as a whole can be profound. Say Amazon uses its data on consumer behavior to figure out which products are worth copying and then undercuts the manufacturers of products it sells, like shoes or camera bags. Though the immediate harm is to the shoemaker or the camera-bag maker, the longer-term—and ultimately more lasting—harm is to consumers, who are robbed over the long run of the choices that come from transacting in a truly open and equitable marketplace. And whereas the shoemaker or camera-bag manufacturer can try to take legal action, it’s much tougher for consumers to demonstrate how Amazon’s practices harm them.

The challenges we face in changing our digital and data environment are similar to those we face in changing our natural environment. While we can take individual action, the scale of change we need requires collective action.

Do you collect, use or share data?

We can help you build trust with your customers, clients or citizens


Do you want data to be used in your community’s interests?

We can help you organise to ensure that data benefits your community
