
Privacy’s Trust Gap

August 14, 2017

By Neil M. Richards (Washington University) & Woodrow Hartzog (Samford University)

Abstract: It can be easy to get depressed about the state of privacy these days. In an age of networked digital information, many of us feel disempowered by the various governments, companies, and criminals trying to peer into our lives to collect our digital data trails. When so much is in flux, the way we think about an issue matters a great deal. Yet while new technologies abound, our ideas and thinking, as well as our laws, have lagged in grappling with the new problems raised by the digital revolution. In their important new book, Obfuscation: A User’s Guide for Privacy and Protest (2016), Finn Brunton and Helen Nissenbaum offer a manifesto for the digitally weak and powerless, whether ordinary consumers or traditionally marginalized groups. They call for increased use of obfuscation, the deliberate addition of bad information to interfere with surveillance, a strategy that can be “good enough” to do the job for individuals much or even most of the time. Obfuscation is attractive because it promises to empower individuals against the shadowy government and corporate forces of surveillance in the new information society. While this concept represents an important contribution to the privacy debates, we argue in this essay that we should be hesitant to embrace obfuscation fully.

We argue instead that as a society we can and should do better than relying on individuals to protect themselves against powerful institutions. We must think about privacy instead as involving the increasing importance of information relationships in the digital age, and our need to rely on (and share information with) other people and institutions to live our lives. Good relationships rely upon trust, and the way we have traditionally thought about privacy in terms of individual protections creates a trust gap. Doubling down on obfuscation would risk deepening that trust gap. Instead, we believe that the best solution for problems of privacy in the digital society is to use law to create incentives to build sustainable, trust-promoting information relationships.

We offer an alternative frame for thinking about privacy problems in the digital age, and propose that a conceptual revolution based upon trust is a better path forward than one based on obfuscation. Drawing upon our prior work, as well as the growing community of scholars working at the intersection of privacy and trust, we offer a blueprint for trust in our digital society. It rests on four foundations of trust: the commitment to be honest about data practices, the importance of discretion in data usage, the need for protection of personal data against outsiders, and the overriding principle of loyalty to the people whose data is being used, so that it is data and not humans that become exploited. We argue that we must recognize the importance of information relationships in our networked, data-driven society. Substantial incentives already exist for digital intermediaries to build trust. But when incentives and markets fail, the obligation for trust-promotion must fall to law and policy. The first-best privacy future will remain one in which privacy is safeguarded by law, in addition to private ordering and self-help.

Full Article: Social Science Research Network