European Voice debates
10 July 2014 - 18 July 2014
Technology: In the era of big data, relying on informed consent for data privacy is illusory.
Voting at a glance (% for the motion / % against)

  • Thu 10 Jul: 74% / 26%
  • Fri 11 Jul: 69% / 31%
  • Sat 12 Jul: 68% / 32%
  • Sun 13 Jul: 65% / 35%
  • Mon 14 Jul: 64% / 36%
  • Tue 15 Jul: 63% / 37%
  • Wed 16 Jul: 71% / 29%
  • Thu 17 Jul: 71% / 29%
  • Fri 18 Jul: 70% / 30%

Closing

Joe McNamee
For the motion
Executive director, European Digital Rights (EDRi)

Sandy Pentland
Against the motion
MIT Professor
Simon Taylor

The moderator's closing remarks

18 July 2014

The aim of this debate was to discuss whether relying on informed consent for data privacy is illusory in an era of big data. The concept of informed consent derives from the early years of the IT revolution, when individuals had a clear idea of what data they were being asked to share and what it would be used for. The era of big data means that disparate datasets can be almost infinitely re-used to produce new, valuable information. While it is possible to grant explicit consent to the use of personal data for medical research purposes, or to provide information about earnings and outgoings to a mortgage lender, the individual can no longer have the same clear understanding of the ultimate use of their data.

This is not necessarily a problem until data is abused and privacy is violated. Even where there is no evidence of clear harm, individuals can feel uneasy if the data they have shared is used to generate information about them in ways they do not understand. Jamie Bartlett of the Centre for the Analysis of Social Media at Demos highlights the recent case in which it emerged that Facebook had manipulated users' feeds to produce an emotional reaction. Although the actual harm was debatable, there was a backlash from users who were angry that what they saw as their private sphere was being violated. As Bartlett points out, Facebook is not a public utility: it is a private company that has the right to use your data under its terms and conditions.

One of the fundamental problems of the hyper-connected world that we enjoy and profit from is that we expect the digital services we use to be free. A whole ecosystem has grown up in which we do not pay for these services; the companies instead use the data that we provide to sell to advertisers. In this situation, users by definition do not have control over their data after the initial decision to share it, which makes the concept of informed consent hard to realise.

If it is not possible for traditional models of notice and consent to be carried over into the big data era, there is a clear need for a strong legal framework to accompany any model for granting consent, as many speakers have stressed. If there are abuses, then individuals need to be able to seek redress, and those guilty of abuses should face the threat of meaningful sanctions. Redress should accompany a system based on transparency and accountability, two crucial principles stressed by Sandy Pentland of MIT and Sir Nigel Shadbolt of the Open Data Institute. Data-processing organisations should be open about what they are collecting and how they are doing it, and individuals should be able to take down or amend incorrect information. Our two main speakers have disagreed about whether the right legal framework exists to provide adequate levels of protection. Sandy Pentland says that systems of informed consent have worked for the financial and medical research sectors, where they are backed up by robust checks and enforcement. This approach could be extended to the personal data sector, he argues.
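
To make those principles concrete, here is a minimal, purely illustrative sketch in Python of what transparency and rectification could look like at the level of a single record store. All names and data are invented for the example; it is not any organisation's actual system.

```python
# Illustrative sketch only: a toy record store showing the transparency and
# rectification principles discussed above. The data subject can see everything
# held about them, correct an inaccurate entry, or have the records removed.
from dataclasses import dataclass, field


@dataclass
class Record:
    purpose: str   # why the data was collected
    value: str     # the stored value
    source: str    # where it came from


@dataclass
class SubjectStore:
    records: dict = field(default_factory=dict)   # subject id -> list[Record]

    def disclose(self, subject_id: str) -> list:
        """Transparency: show the subject everything held about them."""
        return list(self.records.get(subject_id, []))

    def amend(self, subject_id: str, index: int, corrected_value: str) -> None:
        """Rectification: let the subject correct an inaccurate entry."""
        self.records[subject_id][index].value = corrected_value

    def erase(self, subject_id: str) -> None:
        """Take-down: remove everything held about the subject."""
        self.records.pop(subject_id, None)


store = SubjectStore()
store.records["alice"] = [Record("credit scoring", "salary: 3,100/month", "application form")]
print(store.disclose("alice"))                    # the subject can see what is held
store.amend("alice", 0, "salary: 3,400/month")    # and correct it
```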

Joe McNamee of European Digital Rights argues that legal protection from the risk of profiling is being watered down as the EU revises its data protection legislation. Sandy Pentland argues that the generation of additional data based on willingly-shared information is a normal part of commercial activity and praises EU legislators for keeping this possibility alive.

What is clear is that the legal framework should be able to defend data privacy and offer effective action against abuses. With the right framework in place it should be possible to develop new services and approaches which give individuals more understanding and control of their data and allow them to benefit from the value that the data can provide.

I would like to thank the two speakers and the guest commentators for their contribution to the debate as well as the many people who joined the debate via the comment section and by voting.

Joe McNamee

The proposer's closing remarks

18 July 2014

I find it sad that Sandy feels the need to demean himself and his institution by using such silly “straw man” arguments. Sandy's straw man is campaigning for a ban on all profiling. Sandy's straw man thinks that predictable, reasonable credit assessment should be banned. Sandy's straw man is a figment of Sandy's imagination.

Profiling that is based on objective criteria, that does not lead to discrimination and that is subject to enforceable legal safeguards is not a problem. Profiling that is arbitrary, discriminatory and left to the law of the jungle is objectionable, undermines trust and undermines the rights of individuals.

In the late eighties, insurance companies used to ask people whether they had ever taken an AIDS test. If they had, they were assumed to have engaged in some sort of risky sexual behaviour that led them to take the test. The insurance companies then adjusted their premiums (or rejected people as clients) accordingly. At that time, the East German authorities insisted on an AIDS test for all people planning to spend an extended period in the country. I was going there for a semester as a student.

My (informed) choice was to take up the exciting academic opportunity and risk being profiled (forever?) as someone who was engaged in risky sexual activity, paying the price in higher insurance premiums for years to come, all because a blood test had proven that I didn't have a virus. Or I could turn down the opportunity, which ended up being one of the most enriching of my life.

At least in my example the insurance companies were being honest about the fact that their guesses about your sexual activity were arbitrary and based on assumptions that they clearly knew (but did not care) would be wrong in a large proportion of cases. At least I could, in theory, argue my case, if I could find an insurance company that would be open to listening to me.

Why is informed consent not enough?

Because we need enforceable rules that require accountability when profiling generates new personal data that we do not have access to and when it leads to arbitrariness and discrimination.

Because we need data minimisation to prevent companies from creating unnecessary databases that create security risks.

Because we are only just beginning to get a grasp not only of the abuses of data on the basis of dubious catch-all “consent” clauses and the abuse of personal data by governments, but also of the scale of the security meltdown in business and government.

Sandy Pentland

The opposition's closing remarks

18 July 2014

So far Mr McNamee has bobbed and ducked, talked about other topics and repeated himself. What he has not done is to address the fact that in both Europe and elsewhere in the world there are long-standing examples of informed consent that work quite well despite complex subject matter and uncertain outcomes. It works. So why does he, and other members of the European elite, have such an unreasoned objection to informed consent?

I believe that it is because informed consent will cause the European elite to lose power. A concrete example: in many parts of Europe, loans are decided by the local banker, and by him alone. He can make his decision based on your clothes, your ethnic background, even the way you smell. He has power over you. But in other countries there is 'big data': data about how you have paid off credit cards and loans, and by law the banker must follow the facts as presented in that data.

Additionally there is transparency: you can see all of the data on which the decision is made. There is also accountability: a formal process to dispute erroneous data or decisions that don't fit the facts. In countries where decisions are made on data, the banker loses his power over you.
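
As a purely hypothetical illustration of the mechanism described above, the sketch below shows a decision made only from recorded facts, returned together with the inputs and rules used so the applicant can inspect them, and re-run when disputed data is corrected. The rules and thresholds are invented for illustration and are not any real lender's criteria.

```python
# Hypothetical sketch: a data-driven credit decision with transparency
# (the full basis of the decision is returned, not just the outcome) and
# accountability (a dispute over erroneous data forces the decision to be re-run).
from dataclasses import dataclass


@dataclass
class CreditFile:
    on_time_repayment_rate: float   # share of past instalments paid on time
    defaults: int                   # number of past defaults


def decide(file: CreditFile) -> dict:
    reasons = []
    if file.on_time_repayment_rate >= 0.95:
        reasons.append("repayment history >= 95% on time")
    if file.defaults == 0:
        reasons.append("no recorded defaults")
    approved = len(reasons) == 2
    # Return the inputs and the rules satisfied, so the applicant can see
    # exactly what the decision was based on.
    return {"approved": approved, "inputs": file, "rules_satisfied": reasons}


def dispute(original: CreditFile, corrected: CreditFile) -> dict:
    """If the applicant shows the recorded data was wrong, the decision is re-made."""
    return decide(corrected)


decision = decide(CreditFile(on_time_repayment_rate=0.90, defaults=0))
print(decision)                                            # applicant inspects the basis
print(dispute(decision["inputs"], CreditFile(0.97, 0)))    # corrected data, new outcome
```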

It seems to me that European elites are making a big noise about big data because if citizens are informed, and can independently decide to consent or to reject based on the facts, then the elites will lose power. If citizens can decide for themselves, then why do they need the elites? The experience of many countries where citizens are empowered to make decisions based on data shows that transparency and accountability are the main requirements for citizens to make good decisions and to be treated fairly. Unfortunately, as the poll in this debate shows, the elites have made ordinary people so scared of data that they may sign away their right to be informed and to decide for themselves whether to consent or reject. The result of stealing from citizens the right to informed consent will be that the bankers and other members of the elite will be free to make decisions without being constrained by facts.

This debate has now finished and voting is closed.


Comments from the floor

  • Stephan Engberg 18 July 2014 - 5:22pm

    The underlying issue here is whether fast and unsubstantiated arguments should allow governments to override individual control of data.

    If such legitimate security requirements (or even clear economic ones) existed, it might be acceptable for governments, on behalf of other citizens, to override the fundamental principle, as several comments assume.

    But my argument is that no such issues exist: the massive damage that flows from citizens losing control warrants a strong stand on building the principle into the technology, and better alternatives could handle the legitimate requirements without citizens losing control of their data.

    Regards

    Stephan Engberg

  • Stephan Engberg 18 July 2014 - 2:16pm

    Comment for Simon Taylor

    You say:
    "As Bartlett points out, Facebook is not a public utility: it is a private company that has the right to use your data under its terms and conditions."

    Legally, that holds only provided there is an agreement. There is not. It also holds only provided there is no fraudulent behaviour involved. There is.

    In theory, Google and Facebook are utility companies and should be treated as such, with strong neutrality requirements. Having infrastructure take control of market processes is as detrimental to society as roads discriminating over which cars go where. Having an infrastructure that conducts surveillance not only on every street corner but in your home and in all your devices is simply not sustainable.

    Facebook and Google may complain that their business models cannot work unless they are allowed to abuse network power, since they cannot provide a gratis service without payment. Fine: then charge on market terms, and provide the security that avoids the network effects, facilitates free markets and respects fundamental rights.

    The problem here is not that Europe has a strange, bureaucratic data protection regulation. The problem is that NEITHER the US NOR Europe facilitates security and basic market processes in a networked world.

    Informed consent is just an illusion when we are talking about identification in digital infrastructure and societal processes, public or private.

    Regards

    Stephan Engberg

  • Stephan Engberg 18 July 2014 - 1:16pm

    This discussion demonstrates how debate should not occur, as there is no convergence towards isolating the problems.

    Mr Pentland simply insists that informed consent has "worked quite well", ignoring all arguments to the contrary, including a vast array of examples of massive circumvention and of fundamental problems, just as he has ignored all the ways to make citizen control work for the sake of security and the economy.

    Surely, if Mr Pentland is so convinced that informed consent can work without enforceability or any chance for the data subject to understand the causal implications, he must be able to provide a better argument?

    The problem is not that the principle of informed consent is wrong, but the simple lack of enforcement, the circumvention, and the de facto lack of consent beyond a pseudo-claim that feeds lawyers and the media with excuses.

    The essence of the problem is, beyond the obvious lack of security, an increasingly dysfunctional economy in which free choice has less and less impact as consumers are managed, profiled, manipulated and discriminated against.

    We are facing nothing less than an institutional failure on a major scale, in which a few extremely powerful companies, in collusion with an increasingly Command & Control oriented bureaucracy, are taking control of citizens and of society's processes at the expense of everybody other than their own narrow interests in power and profit.

    The problem is not what is done to data after collection; the main problem is identification, which distorts all value processes and eliminates security in the first place.

    No mechanism can contain the rapidly escalating abuse once security has been breached by identifying the most vulnerable and critical element in any economy, the demand side, whose vital function is to enforce free choice on value systems so that they keep providing increasing value.

    The illusion of "free services" is the best proof of a market failure that distorts investment, competition and choice, as well as of the related network effects that create massive anti-trust market distortions.

    Rules alone simply cannot contain the problem without technically enforceable purpose specification and isolation through contextual identity (a sketch of what this could look like appears after the comments below).

    Regards

    Stephan Engberg

  • Richard Beaumont 18 July 2014 - 12:33pm

    We have a long way to go before we get this right, I think.

    The proposed EU Data Protection Regulation addresses many issues, but its effectiveness will also come down to enforcement - where Europe has typically lagged behind the US.

    One of the key issues for me is to move away from the unreadable legalese of current 'privacy notices' towards clear messages that people can both understand and act upon.

    It is obvious that the use of data which is most contentious and most widespread is profiling for commercial/advertising purposes. These are the practices that have the most impact on individual lives, and they are also the most obscure in terms of public understanding of how they work.

    For my money, if we can solve that one issue with genuine transparency and choice/control, then we will go a long way towards allaying suspicions, curbing unwanted commercial surveillance, and improving understanding of the underlying economics of the web.

    For example, even on this site, the information about the use of cookies and tracking technologies is both inadequate and difficult to understand for the average user. It also does not offer any choice - even though EU law already demands it.

    It just goes to show how far we have to go.

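Several comments above refer to "technically enforceable purpose specification and isolation through contextual identity". The sketch below is one possible, purely illustrative reading of that idea, not the commenter's own design: each service only ever sees a pseudonym derived for its own context, so identifiers cannot be linked across services, and stored data carries a purpose that is checked technically at access time. All names, keys and purposes are invented for the example.

```python
# Illustrative sketch: purpose-bound storage keyed by context-specific pseudonyms.
# The purpose binding is enforced technically, not merely promised in the terms
# and conditions, and pseudonyms derived for different contexts cannot be joined.
import hmac
import hashlib

MASTER_KEY = b"held-by-the-citizen-or-their-agent"   # placeholder secret


def contextual_id(real_id: str, context: str) -> str:
    """Derive a per-context pseudonym; different contexts cannot be linked."""
    return hmac.new(MASTER_KEY, f"{context}:{real_id}".encode(), hashlib.sha256).hexdigest()[:16]


class PurposeBoundStore:
    def __init__(self) -> None:
        self._items = {}   # (pseudonym, purpose) -> value

    def put(self, pseudonym: str, purpose: str, value: str) -> None:
        self._items[(pseudonym, purpose)] = value

    def get(self, pseudonym: str, purpose: str) -> str:
        # Access for any other purpose simply fails.
        if (pseudonym, purpose) not in self._items:
            raise PermissionError("no data held for this pseudonym and purpose")
        return self._items[(pseudonym, purpose)]


bank_id = contextual_id("alice", "bank")
clinic_id = contextual_id("alice", "clinic")
assert bank_id != clinic_id                        # isolation across contexts

store = PurposeBoundStore()
store.put(bank_id, "credit assessment", "repayment history: good")
print(store.get(bank_id, "credit assessment"))     # allowed
# store.get(bank_id, "marketing")                  # would raise PermissionError
```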