Submission to Consultation on Online Reputation (FIPA)
BC Freedom of Information and Privacy Association
Note: This submission was contributed by the author(s) to the Office of the Privacy Commissioner of Canada’s Consultation on Online Reputation.
Disclaimer: The opinions expressed in this document are those of the author(s) and do not necessarily reflect those of the Office of the Privacy Commissioner of Canada.
In this submission, FIPA considers what policies will best allow Canadians to have control of information about them, prevent and reduce information-based harm, and ensure reputational privacy enhances—and does not impede—free association and democratic free expression.
We look first at existing protections — from the law to social norms, and from market solutions to online architecture. We argue that existing privacy statutes, the common law and statutory torts, and Criminal Code provisions already go some way toward protecting individuals' ability to govern the use of their personal information and image, but that legislation cannot solve all of our societal problems.
We talk about normative constraints, reinforced largely through informal peer-to-peer education, that allow individuals to share information selectively in quasi-public settings. But those norms are still being developed and learned.
We discuss privacy protections in the architecture of social networks and other sites — but point out that privacy controls vary and can change, and can leave less technologically sophisticated users behind.
And we talk about market solutions that exist to solve reputational woes — but only for those who can pay.
To help fill some of the gaps in the existing protections, we propose (1) public education that takes a rights-based approach to online reputation; (2) higher standards for privacy controls, including privacy-protective default settings, that social networking companies could agree to; and (3) legal solutions that do not overreach, and that specifically legislate against unwanted behaviour.
This submission also discusses “obscurity” — defined as a lack of search visibility, unprotected access, identification, or clarity — and practical ways to make use of the concept, before delving into the question of a “right to be forgotten”.
We urge great caution in implementing the latter — we argue against intermediary liability, warn against creating tiered access, and urge safeguards against erroneous or malicious requests, among other things — and make four broad recommendations:
- That any measures taken to address online reputation concerns be handled by an appropriately resourced body that is accountable to the public.
- That any obscuring or takedown processes be relatively simple, have clear evaluation criteria, and involve the notification of content creators or hosts when appropriate.
- That public education efforts be made so that Internet users know that information may be omitted from their search results or browsing.
- That data be collected about online reputation’s effects on the online and offline lives of individuals.