Answering Digital Reputation Challenges by Addressing Incomplete Notice and Choice Privacy Policy

Jonathan A. Obar (University of Ontario Institute of Technology/Michigan State University)

August 2016

Note: This submission was contributed by the author(s) to the Office of the Privacy Commissioner of Canada’s Consultation on Online Reputation.

Disclaimer: The opinions expressed in this document are those of the author(s) and do not necessarily reflect those of the Office of the Privacy Commissioner of Canada.

This document serves as a brief summary of my published and ongoing research into practical strategies for addressing digital reputation challenges. For an example of my work in the area, please review the abstract below from “Big Data and the Phantom Public: Walter Lippmann and the Fallacy of Data Privacy Self-Management,” published in Big Data & Society. The contents of this summary, and the example of my work provided, address the OPC’s question “(2) What practical, technical, policy or legal solutions should be considered to mitigate online reputational risks?”

Speaking about PIPEDA in 2013, OPC Commissioner Stoddart commented: “It is increasingly clear that the law is not up to the task of meeting the challenges of today — and certainly not those of tomorrow.”Footnote 1 The paper noted above expands on these concerns, introducing what I term ‘the fallacy of data privacy self-management’. The fallacy describes an unattainable ideal perpetuated by governments, including the Canadian government, that cling to flawed notice and choice privacy policy. The ideal suggests (incorrectly) that digital citizens have the time, resources and ability to read, understand and provide consent to all privacy and terms of service (TOS) policies they encounter (‘notice’), and to access, review, evaluate and manage a massive and constantly evolving mosaic of Big Data products and services operating within and across the global economy, increasingly linked to eligibility decisions (‘choice’).Footnote 2 Indeed, there are transparency challenges, as noted in the OPC’s position paper,Footnote 3 and as demonstrated by my OPC-funded (with Andrew Clement) research into the transparency of Canadian ISPs.Footnote 4 The question is: if given access to all relevant privacy and TOS policies, and access to all data products, even the products veiled by data brokers, what then?

A quantitative survey analysis (n=523) I recently completed (with Anne Oeldorf-Hirsch) provides an empirical glimpse into the challenges faced. The study assesses the extent to which individuals ignore the privacy and TOS policies of social networking services (SNS). The study includes an original assessment of participant interaction with the policies of NameDrop, a fake SNS created by the researchers. Based on average adult reading speed (200-300 words/minute), the privacy policy should have taken 26-40 minutes to read, and the TOS 14-22 minutes. Results suggest that 74% of participants skipped the privacy policy entirely, selecting the ‘quick join’ option common to SNS. Of those who read the policy, the average reading time was 73 seconds, with some participants reading for as little as three seconds. The average reading time for the TOS was 51 seconds, with a median of 14 seconds.

Indeed, transparency is a great place to start, as is notice and choice policy; however, all are terrible places to finish. They leave digital citizens with nothing more than an empty promise of protection, an impractical opportunity for data privacy self-management, and, as Daniel Solove analogizes, ‘too much homework’.Footnote 5 It should be acknowledged that the OPC correctly places specific emphasis on the digital reputation challenges unique to children. If current policy has yet to make data privacy self-management possible for adults, what hope is there for children? This question further emphasizes the importance of calling out the incompleteness of the unattainable ideal.

Moving forward, I recommend three strategies:

  1. For policymakers to recognize and acknowledge that notice and choice policy is a great place to start, but an unrealistic place to finish if we are ever to realize digital reputation outcomes that will empower and protect digital citizens,
  2. The development and support of representative data management services to act as infomediaries, giving digital citizens the opportunity to delegate responsibility via a principal-agent relationship. Services currently operating in the financial sector (e.g. Lifelock), and those targeting university students entering the job market (e.g. Rep'nUp), among others, ought to be the subject of various knowledge translation efforts, and
  3. To promote a form of data justice, a Rawlsian/Senian regulatory philosophy ought to be employed.Footnote 6 This would involve the development of policies ensuring representative data management services are targeted towards communities more likely to be the subject of Big Data-driven discrimination.

I am currently conducting research in each of these areas and would be very pleased to contribute my working and published papers to your deliberations.

I have read and understood the consultation procedures. These comments are meant to implicate the OPC and the Federal government.

Thank you for your consideration.


(Original signed by)

Jonathan A. Obar, PhD
Assistant Professor, University of Ontario Institute of Technology
Research Associate, Quello Centre for Telecommunications Management and Law, Michigan State University


Obar, J.A. (2015). Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self-management. Big Data & Society, 2(2), 1–16.

In 1927, Walter Lippmann published The Phantom Public, denouncing the ‘mystical fallacy of democracy.’ Decrying romantic democratic models that privilege self-governance, he writes: “I have not happened to meet anybody, from a President of the United States to a professor of political science, who came anywhere near to embodying the accepted ideal of the sovereign and omnicompetent citizen.” Almost 90 years later, Lippmann’s pragmatism is as relevant as ever, and should be applied in new contexts where similar self-governance concerns persist. This paper does just that, repurposing Lippmann’s argument in the context of the ongoing debate over the role of the digital citizen in Big Data management. It is argued that proposals by the Federal Trade Commission, the White House and the US Congress, championing failed notice and choice privacy policy, perpetuate a self-governance fallacy comparable to Lippmann’s, referred to here as the fallacy of data privacy self-management. Even if the digital citizen had the faculties and the system for data privacy self-management, the digital citizen has little time for data governance. We desire the freedom to pursue the ends of digital production, without being inhibited by the means. We want privacy and safety, but cannot complete all that is required for their protection. If it is true that the fallacy of democracy is similar to the fallacy of data privacy self-management, then perhaps the pragmatic solution is representative data management: a combination of non/for-profit digital dossier management via infomediaries that can ensure the protection of personal data, while freeing individuals from what Lippmann referred to as an ‘unattainable ideal.’