Summary of reputation submissions

The OPC received a total of 28 submissions from industry, academics, civil society, lawyers and the general public. The submissions presented a broad range of proposals for protecting reputational privacy, from standardized takedown request forms and procedures to enhanced powers for the OPC. The “right to be forgotten” (RTBF) was referenced in over half of the submissions; most argued against the European model of RTBF, but many favoured the idea that individuals should have a right to have their personal information de-indexed in specific circumstances.

Some submissions focussed on solutions that would allow individuals to exercise better control over the extent and accessibility of the personal information they provide to online services. We heard that companies have a responsibility to create robust technical settings that increase user control, and to promote their use in order to lessen the risk of reputational harm. Specific proposals included creating more mechanisms for individuals to opt out of providing their personal information, implementing standardized privacy settings that limit sharing of personal information by default, and establishing a standardized process for removing or correcting personal information across platforms.

The lack of order-making power was seen as limiting the OPC’s effectiveness in addressing online reputation problems. Some proposed that the OPC advocate for stronger enforcement powers, including the power to levy administrative monetary penalties. It was also suggested that the OPC use its existing powers under section 5(3) of the Personal Information Protection and Electronic Documents Act (PIPEDA) to prohibit the collection and use of personal information in limited circumstances where reputational harm is particularly egregious, such as revenge websites.

Minors were identified by multiple submissions as being particularly vulnerable to reputational harm online. It was suggested that the OPC work to limit corporate collection, monitoring, and retention of young people’s data.

Many submissions recommended strengthening education initiatives around digital ethics and individual responsibility in posting personal information online – including recognition of the potential impacts of posting information about others. Some felt that companies should provide educational messaging to help users determine how much and with whom they want to share their information, and that parents should encourage their children to use online privacy tools. We heard that digital literacy should be a core component of school curricula, and this education should extend to teachers and parents. It was also suggested that government and industry should collaborate to develop and share educational resources.

Education was also repeatedly suggested as an effective mechanism for reducing the reputational harm faced by other vulnerable groups, such as women and minorities. Specifically, it was suggested that such individuals should be made aware of their rights in online spaces and empowered to build communities where their rights are respected. The OPC was encouraged to make a special effort to reach out to organizations that work with vulnerable individuals and marginalized communities in order to gain a better understanding of their specific needs and how to address them.

The RTBF was by far the most popular topic in the submissions, many of which rejected the European model of RTBF for practical, policy and legal reasons. Some objected to search engines having the responsibility for balancing freedom of expression with privacy rights, and stressed the need for clear criteria, transparency and oversight. It was suggested that, if an RTBF were to be implemented in Canada, content creators or hosts should be notified and given the opportunity to dispute any removal or obscurity requests based on their own rights and interests or a public interest.

Some felt that solutions already exist to the problems the RTBF is meant to address. These solutions include defamation law, privacy torts, website takedown policies, and PIPEDA’s framework for the management of personal information. It was suggested that laws that restrict the availability or use of personal information were worth exploring as a solution in some circumstances. The examples cited included clean slate laws that limit retention, such as those governing credit reporting and juvenile criminal records, as well as employment laws that prohibit employers from asking for social media passwords.

Charter issues were raised by many of those who objected to an RTBF. Frequently cited were the right of every Canadian to access relevant content on the internet via search engines, the media’s right to disseminate content online, and a search engine operator’s right to freedom of expression. Media organizations in particular saw the RTBF as a threat to freedom of expression and freedom of the press.

That said, some submissions were willing to entertain some de-indexing mechanism in certain circumstances, for instance, for intimate information that creates a risk of harm. For those submitters, having search engines create an effective and efficient challenge tool (as has happened in the EU) was an appropriate means of increasing access to justice; some submitters suggested that relying on a process involving a regulator, or the Court, for removal of information would put this remedy out of reach for individuals and/or overburden those systems.