Overview of Consent Submissions


In May 2016, the Office of the Privacy Commissioner of Canada (OPC) released a discussion paper on consent and a notice of consultation, inviting organizations, individuals, academics, advocacy groups, IT specialists, educators and other interested parties to comment. After discussing a number of “challenges” to meaningful consent, the paper requested comments on four questions:

  1. Of the solutions identified in this paper, which one(s) has/have the most merit and why?
  2. What solutions have we not identified that would be helpful in addressing consent challenges and why?
  3. What roles, responsibilities and authorities should the parties responsible for promoting the development and adoption of solutions have to produce the most effective system?
  4. What, if any, legislative changes are required?

The OPC received a total of 51 submissions. Roughly half of the submissions came from businesses or associations representing businesses. Four civil society groups made submissions. The balance of the submissions came from academics, the legal community, regulators and individuals. 

Overview summary

A large number of the submissions, representing different constituencies, acknowledged that the increasingly complex environment poses challenges for the protection of privacy and the consent model. However, there are significant differences among the submissions and the different stakeholders with respect to how these challenges can be or should be addressed.

The business submissions tended to emphasize and praise the technology-neutral and flexible nature of PIPEDA. As one company commented, “the current legislative framework still provides a workable regime and is flexible enough to deal with the challenges described in the paper.” Another company recommended that given PIPEDA’s “solid, yet flexible, framework … there is no need for a ground-up re-working of Canada’s privacy law framework or an overly-rigid approach to PIPEDA interpretation.”

Within the business community, many of the submissions suggested that there are ways to address some of the challenges to consent highlighted in the OPC discussion paper without resorting to legislation: for example, by allowing organizations greater freedom to rely on implied consent; by the OPC providing additional guidance; by broadening the concept of publicly available information; and by adding an EU-type “legitimate interests” provision to PIPEDA by means of a commentary on the term “legitimate purposes” in section 4.3.3 of Schedule 1 to PIPEDA.

While agreeing that most of the solutions discussed in the discussion paper can be accommodated within the existing legislative framework, a minority of the business submissions argued that a few amendments are needed. Several organizations suggested amending section 7 of PIPEDA to include a “legitimate business interest” exception. One submission recommended amending paragraph 7(2)(c) to provide more “legal certainty” by expanding the provision to expressly contemplate data analytics processing.

The advocacy community, the regulators and some of the academics generally recommended a broader range of “solutions” to address the perceived shortcomings of PIPEDA and the consent model, including giving the OPC stronger enforcement powers and urging the OPC to take a more active enforcement role. These submissions were also more likely to refer positively to European concepts such as “data portability” and the “right to be forgotten.” They also generally opposed measures that they perceived as weakening PIPEDA’s consent requirements.

With one exception, none of the business submissions recommended stronger enforcement powers.

A number of submissions expressed support for Privacy by Design, and several, primarily non-business, submissions proposed a variety of technical solutions, including machine-readable privacy policies that stick to data to define allowed usage and obligations as the data travels across multiple parties (“data tagging” or “smart data”); a privacy preferences system managed through a plug-in that would communicate a user’s privacy preferences to websites; and the use of consent receipts designed to make consent “transparent on scale, across jurisdictions, domains and the Internet.”

Two submissions indicated support for privacy by default, while one industry association suggested that privacy by default and privacy by design do not need formal integration as they are already recognized as best practices. 

The submissions from individuals also generally called for stronger enforcement powers. As one individual suggested, privacy commissioners should i) have increased powers, and ii) proactively audit privacy compliance rather than relying on complaints.

The submissions from the legal community generally argued that PIPEDA, together with evolving jurisprudence, is adequate to address the challenges referred to in the discussion paper without amendment.

Specific themes

The following summarizes the responses around several common themes. Note that not all of the submissions commented on every theme.

Simplifying/Standardizing Privacy Policies

Several submissions discussed ways to simplify privacy policies. One common suggestion was to omit information that the user can already be assumed to know. This would make policies shorter and allow organizations to give priority to “non-obvious and complex uses.” One submission recommended a risk-based approach to simplifying policies, focussing on the collection, use or disclosure of data that may present a risk of harm to the individual. This type of approach would help address what one academic submission referred to as the paradox of informed consent: “if the information provided is shorter, a person is not fully informed, but if the full information is provided, it is too long to reasonably expect a person to fully read and understand it.”

A few submissions went further and recommended the use of short, standardized privacy policies. One submission suggested creating “a standardized best practices privacy policy” that would describe common uses and practices. Organizations would then be able to state that they comply with this standard policy, but would be required to identify practices which “deviate from the standardized policy” and state whether or not those practices are necessary for the individual device, service, or app to function. Individuals would be able to opt in or out of collections or uses that are not necessary. A submission from the legal community recommended that the OPC be given the power to impose a simplified consent form.

Technical Solutions

Some of the solutions presented included “tagging” data and restricting collection, use, disclosure, and retention of both tagged and untagged data. Another concept that was referenced was the use of “consent receipts” to exert control over future choices. There were also recommendations for dashboard/portals to control and adjust privacy settings. One submission suggested that technical measures could be adopted to mask/conceal certain data elements, thereby protecting only that data which needs to be protected, and allowing other data elements to be freely used.
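
The “data tagging” and selective masking ideas above can be illustrated with a minimal sketch. The submissions describe these only at a conceptual level; the class, field names and policy vocabulary below (`TaggedValue`, `allowed_uses`, purposes like `"billing"`) are illustrative assumptions, not drawn from any submission.

```python
# Minimal sketch of "data tagging": policy metadata travels with each data
# element, and a value is revealed only when the requested purpose is one
# the tag permits; otherwise the element is masked. Hypothetical names only.
from dataclasses import dataclass, field

@dataclass
class TaggedValue:
    value: str
    allowed_uses: set = field(default_factory=set)  # purposes the tag permits
    mask: bool = False  # conceal the element when the purpose is not permitted

def read(item: TaggedValue, purpose: str) -> str:
    """Return the value for a permitted purpose; mask it otherwise."""
    if purpose in item.allowed_uses:
        return item.value
    return "***" if item.mask else item.value

# A tagged email address: usable for billing, masked for anything else.
email = TaggedValue("user@example.com", allowed_uses={"billing"}, mask=True)
print(read(email, "billing"))    # permitted use: value is released
print(read(email, "marketing"))  # not permitted: element is masked
```

In this sketch, protection attaches to individual data elements rather than to a whole record, which mirrors the submission’s point that only the data needing protection is concealed while other elements remain freely usable.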

OPC Guidance

Many submissions commented on the value of OPC guidance, and several recommended that the OPC issue additional guidance. A number of submissions specifically referred to the OPC’s OBA Guidance; one business association described it as “a prime example of the viability of PIPEDA’s current consent requirement within a complex data ecosystem…”. Several submissions, particularly but not solely from the business community, referred to the need for guidance on de-identification. One submission specifically commented on the need for guidance and tools for entrepreneurs and small and medium businesses.

A civil society organization recommended that the Commissioner be empowered to issue “comfort letters”, at a business’s expense, providing a preliminary opinion on whether a proposed practice would comply with PIPEDA. The same organization recommended specific guidance on the length of time data may be retained and on methods of disposal, enforced through audits of online data retention practices.


De-identification

The questions in the discussion paper on de-identification, contractual measures to protect de-identified data, and assessing the risks of re-identification prompted a large number of comments. As previously noted, several submissions suggested that the OPC should issue guidance on issues such as methods of de-identification and assessing the risk of re-identification. A few of the submissions referred favourably to the UK Information Commissioner’s “Anonymisation Code” as a good example of risk-based guidance.

Several business submissions, and some submissions from the legal community, argued that de-identified information is not personal information and thus does not fall within PIPEDA’s framework, and that consent is therefore not required. One submission recommended an amendment to PIPEDA (in order to remove any legal uncertainty) that would expressly authorize organizations to de-identify personal information without the necessity of obtaining consent to do so. 

While agreeing that de-identification and contractual backstops are useful strategies for minimizing the risks of authorized collection, use, and disclosure, one civil society organization argued these should not be considered as alternatives to consent: since processing data to de-identify it is itself a use, an individual’s consent is still required. Another submission made the point that “[e]ffective governance requires understanding the risks and benefits related to data application, whether the data is identifiable or not.”

Trustmarks/Codes of Practice

There was a lack of agreement across all stakeholder groups about the value of trustmarks and codes of practice. Some business submissions opposed both. One business suggested that “‘One-size-fits-all’ sectoral codes of practice, trustmarks, and privacy seals do not reflect the diversity of practices and needs of businesses in the digital economy.” Other business submissions expressed support for voluntary trustmark programs or for activity-based codes developed with input from the target industries.

One academic recommended that both codes of practice and trustmarks should be part of the “regulatory toolbox” while a member of the legal community did not think that either codes or trustmarks offer any significant benefits. A civil society organization rejected the voluntary, industry-driven trustmark model but expressed support for a trustmark overseen by a “credible organization independent of industry influence – either the Privacy Commissioner or an independent organization supervised by the Privacy Commissioner.” An individual also expressed caution about trustmarks developed and operated by industry bodies.

No-Go Zones

The business submissions rejected the concept of legislated or regulatory no-go zones suggesting that section 5(3) already ensures “responsible use of Personal Information”, and that no-go zones “may not afford the needed flexibility to deal with an ever-changing environment.”

Support among other stakeholders was mixed. The submissions from the legal community that mentioned the concept were opposed to it. One academic submission opposed the concept of no-go zones on the grounds that the problems they try to solve could be remedied better by re-establishing the meaningfulness of consent through increased information and voluntariness, rather than limiting each person’s freedom to choose altogether.

The strongest support for no-go zones came from a civil society organization, which offered some potential examples to explore:

  1. Recording sound from a user’s microphone or camera, except where the user is using the microphone or camera as part of obtaining services from the site;
  2. Publishing personal information for the purpose of incentivising individuals to pay for the removal of their information;
  3. Attempting to re-identify a user in anonymized data; and
  4. Discriminating against users on the basis of a prohibited ground.

One individual also expressed support for no-go zones.

Ethical Assessments

Overall, there was considerable support for ethical assessments and ethical frameworks, particularly in light of the growing importance of big data and the Internet of Things. 

One business association suggested that an ethical framework “may be useful in providing structure to the assessment of risks and rewards associated with the usage of personal information specifically in regard to big data.” An individual commented that “big data can be considered as a common good” and that “a public conversation is needed about the ethical use of big data, even in anonymized form.” An academic called for the development and application of broader surveillance and ethical assessment tools to help ensure that big data analysis is conducted with appropriate regard for privacy and other values.

The business submissions generally saw value in ethical assessments as one aspect of being an accountable organization, but not as a mandated activity carried out by a third party, such as an ethics board.
