

Public Consultation on Modernization of the Privacy Act

Submission of the Office of the Privacy Commissioner of Canada to the Minister of Justice and Attorney General of Canada



March 22, 2021

The Honourable David Lametti, P.C., M.P.
Minister of Justice and Attorney General of Canada
284 Wellington Street
Ottawa, Ontario  K1A 0H8

Minister Lametti,

I thank you for the opportunity to comment on the Public Consultation that Justice Canada launched in November 2020 on the modernization of the Privacy Act.

I strongly support the Department of Justice’s objective of modernizing Canada’s long outdated federal public sector privacy law, and I am pleased to participate in its consultations on achieving that objective. Attached you will find the Submission from my Office, which is organized under the following themes: (i) interpretations and definitions, (ii) institutional obligations, and (iii) maximizing regulatory effectiveness. The document includes our recommendations to further improve on the department’s efforts.

I look forward to continued collaboration with your department in this important initiative for Canadians and for a strong protection of their privacy rights.

Once again, I wish to thank you for the invitation to participate in the public consultation.


(Original signed by)

Daniel Therrien


c.c.: Carolyne Maynard, Information Commissioner

I - Introduction

We strongly support the Department of Justice’s (Justice) objective of modernizing Canada’s long outdated federal public sector privacy law, and are pleased to participate in its consultations on achieving that objective. We are encouraged by the openness of the latest step in the initiative, by which the broader public can respond to the proposals the government is now exploring around the Act’s reform.

We are also very encouraged by the thoughtful and comprehensive consultation paper Justice published last November, which demonstrates the seriousness of the government’s intent for meaningful reform.Footnote 1 The consultation document proposes substantive changes that represent significant strides toward a law in step with modern domestic and international norms, one that would help to foster public trust that our federal government is acting effectively in the public interest while ensuring the protection of privacy rights.

Key Proposals Leading to Substantive Improvement for Privacy

There are numerous proposals put forth in the Justice document Respect, Accountability, Adaptability: A discussion paper on the modernization of the Privacy Act that could significantly improve the positioning of Canada’s public sector privacy law to effectively address the modern, data-centric needs of Canadians.

These notably include:

A broadened purpose statement that incorporates key rights-based language. The paper is explicit that a reformed Privacy Act begins with a foundation of respect and recognition of rights.Footnote 2 It goes on to include some of the rights-based language we have suggested, in listing, among other key objectives for the Act, “protecting individuals’ human dignity, personal autonomy, and self-determination.”Footnote 3 Though not all of our suggestions on this front are reflected, we note that the proposed changes are substantive and go much further towards providing a critical rights-based foundation than does the draft Consumer Privacy Protection Act (CPPA) for the private sector.Footnote 4 We believe a legal framework that entrenches privacy as a human right, and as a safeguard for other democratic rights and citizens’ trust in government, is essential for modernized privacy laws in both the public and private sector.

Introduction of a principles-based approach. This includes, notably, the introduction of principles to the Privacy Act to better align with domestic and international regimes and support the regulation of new or unforeseen circumstances. Among these principles would be a new accountability principle supported by concrete requirements to demonstrate strong governance and oversight practices through program design with privacy protection in mind,Footnote 5 privacy impact assessment (PIA) obligationsFootnote 6 and new record-keeping obligations,Footnote 7 and a meaningful threshold for limiting collection, aligned with modern norms.

Measures to tackle technological opportunities and challenges head-on. This includes exploration of measures for addressing automated decision making, data integration, and evolving approaches to transparency, with a recognition of the complexities involved that will need to be assessed in further detail.Footnote 8

Enhanced emphasis on the protection of personal information. These proposals, necessary in light of the increased risks in our data-driven society, take the form of an expanded definition of personal information, requirements for protection of information, including obligations to safeguard personal information, and mandatory breach reporting to individuals and the Privacy Commissioner. Similarly, a requirement to maintain breach records to enable OPC oversight will further ensure that breaches are appropriately managed and risks to affected individuals mitigated.

Significant strides in measures to provide meaningful oversight and quick and effective remedies. This includes constructive proposals, which will need to be properly resourced, for our Office to take a more active guidance roleFootnote 9 through issuing advance opinions, overseeing pilot projects, and greater discretion to publish compliance outcomes. Finally, our Office supports Justice’s proposed enhanced compliance framework, which includes (i) expanded proactive audit powers,Footnote 10 accompanied by (ii) increased discretionFootnote 11 to focus our limited resources on the most pressing issues, along with (iii) much needed remedies in the form of order-making and expanded rights of recourse to Federal Court. While we continue to advocate for further enhancements to ensure effective and timely remedies consistent with domestic and international privacy norms, the proposed modifications represent an important rebalancing of the interplay between the Privacy Act and the recently modernized Access to Information Act (ATIA). We note that the proactive audit powers and discretion noted above go further than the measures proposed in Bill C-11 toward ensuring meaningful oversight on the most pressing issues. Further, the proposed order-making framework, while limited, is more streamlined (and therefore likely to be both more effective and timely for Canadians) than the framework proposed in the draft CPPA.

Review of the Privacy Act and the Access to Information Act

As mentioned in Justice’s discussion paper, the federal government launched a reviewFootnote 12 of the ATIA last year. Our Office provided the President of the Treasury Board with a submission in December 2020.Footnote 13 The ATIA and the Privacy Act share similar provisions throughout, with many concepts and interpretations that are relevant to both Acts. Both the ATIA and the Privacy Act play a central role in preserving the information rights of Canadians and modernization of both of these laws should occur in tandem. A more open, transparent government that is also a responsible steward of Canadians’ personal information is essential to upholding the foundations of our democracy.

We therefore look forward to the ongoing reviews of both the ATIA and the Privacy Act, and welcome the opportunity to work with both Ministers in addressing similar provisions between the Acts, as well as providing input on concurrent amendments.

Themes of our input

In the context of the substantial and constructive proposals for reform put forward in the consultation paper, the remainder of our response focuses on providing important nuances and raising a few key recommendations for consideration. These are organized under three themes: (i) interpretations and definitions, (ii) institutional obligations, and (iii) maximizing OPC’s effectiveness as a regulator.

Under the theme of interpretations and definitions, we first address questions posed in the consultation paper regarding the definition of personal information. In short, we support measures proposed by Justice for a more principled, inclusive definition. Second, we provide our views on the proposed public interest framework, signalling that we generally agree with the proposed “reasonably required” standard because of its similarity to the “necessity and proportionality” test, but suggesting that new statutory language be clear that the privacy impacts must be proportionate to the public interest at stake. Third, we raise certain concerns with respect to the proposed greater latitude for collecting ‘publicly available’ personal information. Fourth, we advocate for expanded measures to ensure that new proposed categories for disclosure (like de-identified information) are principled and supported by appropriate checks and balances.

Under the theme of institutional obligations, we first raise the need for clarity and precision around a threshold for government collection of personal information. Following this is commentary on new transparency measures proposed in the consultation paper. Thereafter, we focus on proposed additions to the breach-reporting requirements to ensure timely notifications to our Office and breach record-keeping obligations to ensure consistency between the Privacy Act and PIPEDA. This is followed by recommendations that Justice further explore measures to ensure accountability to individuals in automated decision-making, as informed by our Office’s recent consultation on artificial intelligence. Finally, we discuss new proposals for sharing personal information, requirements to ensure it is accurate, and organizational processes for receiving complaints.

Under the theme of maximizing regulatory effectiveness, while recognizing the significance of proposals made by Justice, we note that the proposed enhancements nonetheless lag behind the privacy laws of other jurisdictions, both domestic and international. Specifically, this includes limitations on individuals’ right of recourse to court, and a proposed order-making power that is limited to access issues rather than also covering collection, use and disclosure contraventions, as is the case for other data protection authorities. We conclude our submission with recommendations regarding specific modifications to minimize avoidable non-discretionary obligations for our Office that impinge on our ability to allocate our limited regulatory resources for maximum effect for Canadians.

II - Interpretations and definitions

Core concepts, such as the definition of personal information, identifiability, publicly available, and the public interest, go to the heart of the Act’s effectiveness in protecting citizens’ privacy rights. Since the Act first came into force, digital technologies have evolved rapidly and in ways we could not have foreseen, making it all the more important that the Act’s central concepts are modernized and that interpretations adapt to the times.

Definition of personal information

We support measures proposed by Justice to bring a more inclusive approach to the definition of personal information. Justice asks for feedback on three proposals to update the definition of personal information: a) clarifying when an individual is “identifiable,” b) simplifying the definition by removing the reference to “recorded,” and c) removing exemptions from within the definition itself. We address each of these below.

Clarifying when an individual is “identifiable”

Justice asks how sensitivity to context should factor into determining if an individual is identifiable, such as where identifiability may turn on access to confidential information. Context must clearly play a role, but we caution against overly narrow criteria for identifiability that would be out of step with rapidly evolving technology and well accepted jurisprudence. The definition of “personal information” deserves broad and expansive interpretation under the Privacy Act.Footnote 14 The quasi-constitutionality of the Privacy Act requires that rights be interpreted broadly and that exceptions to those rights be narrow and specific. Modern technologies have made it much easier to access information and link datasets together, potentially making information that may initially seem innocuous or non-personal actually very revealing. This is why we recommend the Act include a definition of “identifiability” according to the test set out by the Federal Court in Gordon v. Canada: “information will be about an ‘identifiable individual’ where there is a serious possibility that an individual could be identified through the use of that information, alone or in combination with other information.”Footnote 15 Codifying this test in the law would promote greater certainty regarding the interpretation of “identifiable” while allowing context to be taken into account. It would also be in keeping with recent developments in Ontario, where a definition of “de-identify” was added to the Personal Health Information Protection Act (PHIPA).Footnote 16


  • Amendments to the definition of “personal information” in the Act should include a test for “identifiability” which is consistent with Gordon v. Canada, and which can be applied contextually.

Removing “recorded” from the definition

Justice notes that stakeholders have recommended removing the Act’s current requirement that personal information be “recorded,” but asks for feedback on the practical benefits of such an approach.Footnote 17 In our view, there are two key benefits. First, it is more in keeping with a principle-based Act – consistent with Canadian jurisprudenceFootnote 18 and domestic and international privacy law norms.Footnote 19 Second, it ensures that important obligations under the Act are applied to the handling of unrecorded personal information.

It is true that certain rights and obligations in the Act, such as an individual’s right to access and correct their personal information, or retention obligations, could only be meaningfully applied to personal information that is ‘recorded’ in some form. However, a range of other important rights and obligations in the Act can and should apply to personal information, regardless of whether it is “recorded” or not. For instance, where an institution searches, views, monitors, accesses, or verbally communicates personal information about an individual, but stops short of creating a record, it should still be required to demonstrate that the collection is reasonably required and adhere to other applicable legal requirements.

To address the modern realities of the digital age, a definition of personal information must go beyond the concept of personal information being limited to a record. Nevertheless, a right of access is dependent on adequate documentation by institutions of key activities and decisions, as well as the retention of those documents. Therefore, it is equally important to ensure that institutions have record-keeping obligations when personal information is used or disclosed in a manner that directly affects the individual. To this extent, we are supportive of Justice’s proposed broadened scope of the definition of “administrative purpose” and recommend there be an associated record keeping obligation for these practices.


  • The current requirement that personal information be “recorded” should be removed from the definition of personal information.
  • There should be record-keeping obligations when personal information is used or disclosed in a manner that directly affects the individual, in line with Justice’s proposed broadened definition for “administrative purpose”.

Removing exemptions from the definition

The consultation paper proposes to remove the list of exemptions currently found in paragraphs (j) to (m) of the definition of personal information.Footnote 20 These exemptions are for information about a person relating to their position or function as a government employee, subcontractor, ministerial adviser or staff, or recipient of discretionary government benefits, as well as individuals deceased for more than 20 years.

We support removing these exemptions from the definition for both simplicity and on a principled basis, so long as any amendments made to sections 7, 8 and 26 to allow for use and disclosure of this information do not result in a broadening of these categories in a way that affects privacy rights. We would be interested in engaging in further discussions on any specific proposals to amend these provisions.

For similar reasons, we recommend that Justice not add an exemption to the definition to exclude business information as the discussion paper currently proposes.Footnote 21 In the case of sole proprietorships, contact information is often both business and personal information simultaneously, making it more difficult to distinguish business from personal information than for other business structures. Prior to 2015, PIPEDA similarly excluded business contact information from the definition of personal information. However, Parliament ultimately chose to remove this exclusion in 2015, replacing it with a much narrower clause under which PIPEDA does not apply only to “the business contact information of an individual that the organization collects, uses or discloses solely for the purpose of communicating or facilitating communication with the individual in relation to their employment, business or profession.”Footnote 22 We would advocate for consistency between laws on this point.


  • Business contact information should only be excluded from the definition of personal information to the extent it is collected, used or disclosed for the purpose of communicating or facilitating communication with the individual in relation to their employment, business or profession, as under PIPEDA.

New public interest provisions

The consultation paper proposes to eliminate the current 8(2)(m) provision under the Privacy Act and to replace it with a new framework allowing for use or disclosure of personal information when it is “reasonably required” in the public interest. Under current requirements, in order for an institution to exercise its discretion to use or disclose personal information pursuant to section 8(2)(m) of the Privacy Act, its head must hold the opinion that the public interest “clearly outweighs” any invasion of privacy or that the disclosure would “clearly benefit” the individual to whom it relates. In our view, this approach protects the privacy rights of individuals and generally strikes the right balance. Having said that, the OPC is open to the “reasonably required” standard proposed by Justice, with certain amendments.

We are generally in agreement with the list of criteria for determining what is “reasonably required” (subject to our comments in the sub-section Collection threshold: ‘reasonably required’), because this would bring the standard closer to a necessity and proportionality threshold. Clear, specific criteria in a new law, as proposed by Justice, would help to guide decision makers when considering making public interest disclosures and provide important safeguards against unwarranted disclosure of personal information. This is particularly important given the absence of legal definitions to draw on from the statute, other legislation, or Canadian case law interpreting the concept. The ‘public interest’ is obviously a very broad concept that can be difficult to define, particularly when left to individual opinion or discretion.

That said, and as elaborated upon later in this document, while the fourth criterion in the framework for “reasonably required” requires institutions to “consider the degree of intrusiveness as compared to the public interests at play”, simply listing factors to consider, without providing a clear threshold, may not be sufficiently instructive for a broad use and disclosure provision of this nature. In our view, a proportionality assessment should be a distinct requirement in determining when the threshold for use or disclosure in the public interest is met, and the Act should specify that the intrusiveness must be proportionate to the public interests at play.

We note that the paper asserts that the amended public interest provisions would maintain the exceptional aspect of the public interest authorities under the current Act, so that it could not be used on a systematic or routine basis. We support this and recommend that there should still be a presumption in favour of non-disclosure unless there are compelling arguments to the contrary.Footnote 23

Under the current rules, institutions report the total number of disclosures made pursuant to s. 8(2)(m) in their annual reports. It is unclear from the consultation paper whether this would continue to be a requirement. A similar requirement to report publicly on the use of the new public interest disclosure provision would be a welcome transparency measure.Footnote 24


  • While we believe the current s.8(2)(m) of the Privacy Act strikes the right balance, we are open to the proposal to replace it with a new framework allowing for use or disclosure of personal information when “reasonably required” in the public interest, subject to our recommendations outlined in the section on “collection threshold”. Specifically, we recommend the “reasonably required” framework include clearer language to specify that the impact on privacy must be proportionate to public interests at play. This would provide a clear threshold for use of the provision.
  • Requirements to report publicly on the use of new public interest authorities should be maintained.

Framework for ‘publicly available’ personal information

We support Justice’s proposal to more clearly define and regulate ‘publicly available’ personal information, rather than excluding it from the requirements of the Act. We also strongly support the approach of adding specialized rules that would ensure the public sector’s use of publicly available personal information is aligned with individuals’ reasonable expectations of privacy.Footnote 25

The concept of ‘publicly available’ personal information has evolved significantly since privacy laws in Canada were first enacted. We now live in a time when we share more data about ourselves and evolving technologies make it easier to collect, analyze, and disseminate information about us. Courts have recognized that the fact that information is publicly available does not necessarily mean that our reasonable expectations of privacy shrink.Footnote 26

As noted, a positive aspect of the proposed approach is that all the Act’s rules would apply to publicly available personal information, which is not currently the case. Notably, this is a clear acknowledgement that the ability to access personal information online does not render personal information non-personal, and that institutions cannot collect information, whether publicly available or not, in the absence of legislative authority and compliance with the collection threshold under the Act.

We agree with the approach taken by Justice to include a definition within the Privacy Act that takes into account the context in which the information is made public, and whether the individual has a reasonable expectation of privacy in the information irrespective of the fact that it is publicly available. This type of contextual approach is consistent with the Supreme Court of Canada’s approach in determining whether an individual has a reasonable expectation of privacy in public spaces in the criminal law context.Footnote 27

The consultation paper suggests a modernized Act could define personal information as being “publicly available” in three instances:

  1. When it has been made manifestly public by the individual the information relates to.
  2. When it is broadly and continuously available to all members of the public and the individual has no reasonable expectation of privacy in the information.
  3. When another act of Parliament or a regulation requires the information to be publicly available.

While Justice suggests that reasonable expectations of privacy should inform the second element of the definition, we recommend that an individual’s reasonable expectation of privacy be considered more broadly regardless of how the information was made publicly available. The reason for this recommendation is that even in instances where personal information is publicly available, an individual may still retain a privacy interest in their personal information, particularly when it comes to its collection and use by the federal government. A contextual assessment of an individual’s reasonable expectation of privacy should consider not only the specific factors about how information was publicized, under what circumstances, and by whom, but should also consider the intent behind its collection and use.

This is why it is so important that the Privacy Act include a robust framework for publicly available personal information. The definition proposed by Justice goes a long way to achieving this but could be enhanced by explicitly adding that publicly available personal information does not include information in respect of which an individual has a reasonable expectation of privacy. Alternatively, the “specialized rules” announced in the Justice proposals, “that would ensure the public sector’s use and disclosure of publicly available personal information is aligned with individuals’ reasonable expectations of privacy”, should be clear and effective in protecting these expectations.

The digital era is full of complexity that can result in individuals having little or no meaningful control over their personal information being in the public domain. For instance, an individual may not always have control over what information about them is made broadly and continuously available by others. Likewise, the complexities of modern technologies could result in individuals themselves making personal information available to a public audience without fully appreciating how broadly available it actually is, or the full extent of how that information could be collected or used by others. Even information that is required to be made public by law, such as court records, can include very sensitive information that is published for a very specific purpose, and individuals might not reasonably expect that this information could be collected, used or disclosed by government for unrelated purposes without special consideration for these circumstances.Footnote 28


  • The definition of “publicly available” should explicitly state that publicly available personal information does not include information in respect of which an individual has a reasonable expectation of privacy. Alternatively, any specialized rules pertaining to the public sector’s use of publicly available personal information should be clear and effective in protecting these expectations.

De-identified personal information

In principle, we are supportive of the proposed framework for de-identified information put forth by Justice.Footnote 29 The consultation paper proposes to define the concept, clarify that it remains under the Act while providing for flexibility in its use and disclosure, and to introduce an offence for re-identifying or attempting to re-identify de-identified information.

Treating de-identified information as “personal information,” as proposed by Justice, would prevent this information from falling outside the scope of the law. Given the ever-present potential for de-identified information to be re-identified, as well as the potential for such information to be used in ways which could have significant impacts on individuals’ rights, it is important that institutions clearly understand that, though additional flexibility is granted in certain situations, privacy legislation will remain in effect when de-identified information is used. De-identified information could be exempted from certain provisions of the Privacy Act, or the application of certain provisions may be relaxed, but other provisions should still apply.

As for the definition itself, as suggested in our recent AI-specific proposals for PIPEDA reform,Footnote 30 the Ontario PHIPA provides a potential definition of “de-identify”, though it would also need to account for the concept of “identifiability” as established in Gordon.


  • The Act should define de-identified personal information to allow for a more targeted and nuanced application of certain rules. For instance, while de-identified information might be exempted from certain provisions of the Privacy Act, or their application relaxed, other provisions should continue to apply.

III – Institutional obligations

Updating the Act to allow for flexibility and adaptation to modern digital realities, while also maintaining respect for privacy rights, is by no means an easy task. However, as Justice suggests, even with the many benefits of a modern digital government, Canadians still rightly expect that there should be good reasons for the federal government to access their data, limits on how it is used and shared, and clear protections in place for it.Footnote 31

The consultation paper makes a number of key proposals to update or add to the Act’s rights and obligations, and rules for collection, use and disclosure of personal information, in order to address the realities of the digital age. The following section will address a number of these proposals, specifically those related to the collection threshold, transparency and breach reporting obligations, amendments to the use and disclosure authorities in the Act, and will also draw on key recommendations for regulating artificial intelligence informed by our Office’s recent consultation on artificial intelligence in the private-sector context.Footnote 32

Collection threshold: ‘reasonably required’

One of the more fundamental changes Justice proposes is the inclusion of foundational and broadly recognized data protection principles to support rules that are more precise in the Act.Footnote 33 As it relates to collection, Justice proposes the Act could include a new “identifying purposes” principle, in addition to a “limiting collection” principle to restrict the types and amount of personal information federal public bodies may collect. Under the proposed collection threshold, a federal public body would be limited to collecting only the personal information “reasonably required” to achieve a purpose relating to its functions and activities, or where it is otherwise expressly authorized by another act of Parliament.

The widely accepted standard in Canada at the provincial level, as well as internationally (e.g., the GDPR and New ZealandFootnote 34), is that personal information collected by organizations must be “necessary” for the programs they administer and services they deliver.Footnote 35 According to the consultation paper, although the term “reasonably required” differs from what is found in other data protection instruments, in practice this collection standard would be “essentially equivalent to leading international standards.”Footnote 36 While it remains to be seen whether these two standards will be interpreted the same way in practice, we accept that a “reasonably required” standard is workable if the aim is to add clarity to the law while yielding results similar to the longstanding principles of necessity and proportionality.

Framework for assessing ‘Reasonably Required’

Justice proposes four key factors that institutions would have to take into account to determine if a collection is “reasonably required,” including:

  1. the specific purpose for the collection, particularly whether it was for law enforcement or national security purposes;
  2. the mechanisms or means employed to collect the information;
  3. whether there are less intrusive means of achieving the purpose at a comparable cost and with comparable benefits to the public; and
  4. the degree of intrusiveness of the collection compared to the public interests at play.

These factors are common to the “reasonably required” standard proposed for the contemplated public interest framework, except for the addition of a safeguarding criterion for public interest uses or disclosures. Below we provide comments on three of the proposed factors in this framework and recommend amendments, which are meant to apply equally to both the proposed collection and public interest provisions.

i) The specific purpose for the collection, particularly whether it was for law enforcement or national security purposes

The first factor deals with identifying the specific purpose for the collection. Identifying purposes is central to establishing what personal information is required in a given context. It will be important that the test in the law: 1) ensures that the public objective of the function or activity in question is defined with sufficient precision so that a meaningful assessment of what is reasonably required in a specific context can take place, and 2) ensures that the specific function or activity for which the collection is reasonably required has a clear lawful basis. An overly broad purpose can lead to over-collection. As the Supreme Court of Canada has noted, if the objective is defined too broadly, it risks inflating the importance of the objective and compromising the analysis.Footnote 37 As well, though it may be implied that the function or activity must have a lawful basis, as is the case with the current s. 4 of the Act in referring to an operating program or activity, we have seen this generously interpreted over the years. Requiring that identified purposes be clearly linked to a lawful authority would be a helpful prompt for institutions to ensure they have identified valid legal grounds for collecting personal information.

To illustrate this point, the framework we developed to help guide government institutions in their assessments of privacy-impactful initiatives implemented in response to the COVID-19 pandemicFootnote 38 specified that even in challenging circumstances government institutions should still ensure that their measures are necessary and proportionate: that is, evidence-based, necessary for the specific purpose identified, and not overbroad. In that context, we emphasized the importance of ensuring that a public health purpose underlying a potentially privacy-infringing measure was science-based and defined with some specificity, pointing out that it was not enough to simply state that a measure generally “supports public health.”

The first three elements of Article 5(1) of the GDPR provide a helpful model along these lines. Article 5(1)(a) requires that personal information be “processed lawfully, fairly, and in a transparent manner”; Article 5(1)(b) refers to “specified, explicit and legitimate purposes”; and Article 5(1)(c) requires that personal information be “adequate, relevant, and limited to what is necessary in relation to the purposes for which they are processed”.

ii) Whether there are less intrusive means of achieving the purpose at a comparable cost and with comparable benefits to the public.

The third factor in the proposed framework speaks to the existence of less intrusive means of achieving a purpose at a “comparable cost and with comparable benefits.” We would note that the notion that slightly higher costs might justify more privacy-intrusive measures is plainly inappropriate. Consideration of whether there are less intrusive means of achieving the same purpose is undeniably a relevant element of a proportionality assessment, and institutions may consider costs in this analysis; however, the requirement that the cost be “comparable” fails to recognize that there may be instances where increased costs are appropriate to ensure privacy rights are upheld. Cost is only one of a multitude of factors that should be considered when evaluating less intrusive means of achieving a purpose. Statistics Canada, for example, pointed to reduced data collection costs as a factor in support of its broad collection of personal information from a credit bureau and from financial institutions; however, our investigation found that it failed to demonstrate that the projects were proportionate to the invasion of privacy they entailed and that less invasive alternatives were not reasonably available.Footnote 39

iii) The degree of intrusiveness of the collection compared to the public interests at play.

The fourth factor in the proposed framework would require government institutions to consider the degree of intrusiveness of the collection compared to the public interests at play. This is indeed a key factor supporting a proportionality assessment that would help ensure the expected benefits of collection are balanced against the privacy intrusiveness. In that regard, we suggest that it would be clearer to include language specifying that the intrusiveness must be proportionate to the public interests at play. Introducing an explicit proportionality requirement to determine whether a collection is “reasonably required” would have the effect of limiting the risk of over-collection of personal information because government initiatives would have to be carefully evaluated for privacy risks at the outset and institutions would have to ensure that the collection is fair, not arbitrary, and proportionate in scope. It would also require an objective assessment of the importance of the specific public policy need behind the demand. The more severe the potential impact on privacy, the more pressing and substantial the public goal should be.

Lastly, we suggest including a new final factor (5) – ‘any other relevant factor(s) in the circumstances’ – simply to emphasize that the list of factors is not exhaustive and that all relevant conditions should be taken into consideration in determining whether a collection is reasonably required in the circumstances.


The factors to be taken into account in assessing ‘reasonably required’ should be amended to include consideration of:

  1. The specific, explicit and lawful purposes identified, and whether the personal information to be collected is limited to what is reasonably required in relation to those purposes, particularly whether the collection was for law enforcement or national security purposes;
  2. The mechanisms or means employed to collect the information;
  3. Whether there are less intrusive means of achieving the same purpose at a comparable cost and with comparable benefits to the public;
  4. Whether the loss of privacy or other fundamental rights and interests of the individual, given the degree of intrusiveness of the collection, is proportionate to the public interests at play;
  5. Any other relevant factors in the circumstances.

Transparency obligations

We support proposed enhancements to transparency as suggested in the consultation paper. Transparency is critical to empowering citizens with the knowledge needed to exercise their rights. It also requires the government to be accountable for its handling of personal information. These are critical aspects of a meaningful data protection framework.

However, notice to individuals about collections and their purpose is not currently required when institutions collect information about someone from an indirect source. Over the years, new technologies have made indirect collection of personal information increasingly attractive for reasons of efficiency, accuracy, cost and convenience for institutions and citizens. We have seen increases in indirect collection, including both targeted and mass collections, which are of potential concern to citizens.Footnote 40

The consultation paper proposes a number of ways in which the ability to collect information about individuals indirectly would be expanded – such as the ability to collect publicly available information, and to collect information from other federal institutions (such as under data integration initiatives). As the paper states, this is reasonably driven by the evolution of technology – such as the convenience benefits of a ‘tell us once’ approach. However, these increases in indirect collection risk eroding citizens’ meaningful knowledge of what, exactly, has been collected about them, when, and for what purpose, as well as their ability to hold government to account.

The consultation paper proposes that:

The Act could also include a right for individuals to be notified of when their personal information is collected by a federal public body. The Act could set out the minimal elements that would have to be included in a notice to individuals. However, the Act could also set out reasonable limits to this right, such as: where the individual already has been notified; where the federal public body is authorized to collect personal information from a source other than the individual; where the purpose of the collection relates to a law enforcement or national security matter; or where providing notice would be practically impossible or would defeat or prejudice the purpose of the collection or result in the collection of inaccurate information.

Removing the second and third exceptions above would significantly strengthen the transparency regime. At the same time, it would still allow for withholding information from citizens where notice would be impossible or would defeat or prejudice the purpose (which could include, in certain circumstances, law enforcement or national security matters). Enhanced ‘Personal Information Registries,’ and privacy notices posted on a website, no matter how well crafted, do not empower citizens with knowledge nearly as well as a detailed and contextualized understanding of what has been collected about them, when and for what purposes.

Other countries which have embraced greater indirect collection of personal information have recognized this challenge and adopted innovative approaches to counteract this ‘transparency gap’. We urge that further innovations be explored to bridge the gap in citizens’ knowledge inevitably created by increasing indirect collection.


  • Measures to further enhance transparency should be considered, including:
    • Limiting exceptions to the right of direct notification about collection (in particular, eliminating some of the exceptions to the right of notice for indirect collection);
    • Requiring that notices provide details of what information was collected, when, and in what context, as well as the purposes for which it may be used or disclosed; and
    • Exploring ways to enhance individuals’ direct access to what information departments have collected about them, when, and for what purposes it is used or disclosed.

Public sector breach reporting

We strongly support Justice’s proposals to introduce a “safeguarding” principle, a breach record-keeping requirement, and mandatory reporting to our Office and notification to affected individuals of a breach where there is a risk of significant harm to an individual.Footnote 41 These measures recognize the significant increases in the risks of unintended or unauthorized disclosure, loss, access, and theft of the government’s personal information holdings.

To ensure that the proposed regime for reporting privacy breaches to our Office provides meaningful oversight, we believe more prescriptive wording in the Act on the timeliness of reporting is required. Rather than ‘as soon as practically possible,’ we suggest institutions be required to report breaches to our Office without unreasonable delay and no later than seven calendar days after the institution becomes aware of the breach. Clear, short timelines for reporting would allow for an effective response, provide certainty to institutions about what is legally required, and be consistent with modern international norms.Footnote 42

Individuals should also be notified as quickly as possible, given that they are the directly impacted parties and need to know what (if any) corrective steps they can or should take. However, we recognize that imposing a specific maximum timeline for notifying individuals could result in the disclosure of preliminary and potentially confusing or incorrect information, and we thus recommend that such notifications be required to occur without unreasonable delay. Additionally, a clear and consistent definition of the threshold for reportable breaches in the legislation would ensure consistency with PIPEDA and the proposed CPPA. We therefore recommend use of ‘real risk of significant harm’ in both statutes.

Breach reports received by the OPC are at times incomplete. When seeking additional information, the OPC has at times found it difficult to gain access to internal and third-party breach investigation reports, over which organizations sometimes claim solicitor-client or litigation privilege. We therefore recommend adding a provision that would require institutions to share internal or third-party breach investigation reports with our Office, including records and reports over which solicitor-client privilege is claimed. This would substantially facilitate timely and resource-efficient oversight (avoiding the need for our Office to duplicate the efforts of such internal exercises). In keeping with Supreme Court jurisprudence, such an exception would need to be “sufficiently clear, explicit and unequivocal to evince legislative intent to set aside solicitor-client privilege”.Footnote 43 For added clarity, the provision could indicate that an institution would not be deemed to have waived privilege by providing such reports to the OPC.


  • The OPC should be provided with access to all reports prepared by or for institutions about the cause of a breach and related lessons learned.
  • Federal public bodies should notify the OPC without unreasonable delay, and no later than seven calendar days after the institution becomes aware of the breach.
  • The threshold for reporting breaches should be consistent with PIPEDA and the CPPA, that is, when there is a ‘real risk of significant harm’ to individuals.

Automated decision-making

The consultation paper states that certain rights and accountability requirements related to automated decision-making systems, such as AI tools, may be desirable in the law.Footnote 44 We believe this approach is warranted because of the unique privacy risks that such systems introduce, such as their potential ability to infer or predict individuals’ attributes, or to use such insights to automatically determine the allocation of certain credits or programs.

Automated decisions can raise issues of fairness, accuracy, and discrimination. The use of personal information through AI tools to influence individuals’ behaviour can have wider implications for freedom and democracy as well. The consultation paper is explicit that a reformed Privacy Act should be based on a foundation of respect and recognition of rights, which includes recognition in the purpose clause that an underlying objective of the Act is to protect individuals’ human dignity, personal autonomy, and self-determination.Footnote 45 Introducing rights specific to automated decision-making would better support this objective and prevent negative downstream impacts on individuals.

This approach would also ensure that the law remains technologically neutral by providing protection in relation to automated decision-making instead of any particular AI technology.

The consultation paper proposes that the Privacy Act could align with federal policy instruments on automated decision-making to help ensure individuals know when they are interacting with such systems, the types and sources of personal information used by them, and general information on how they function.Footnote 46 There is benefit to aligning the law with such elements of the Directive on Automated Decision-Making (DADM), so as to streamline requirements, promote compliance, and better recognize rights. As Teresa Scassa notes in a forthcoming publication, “the requirements to comply with directives are internal to government, as are the sanctions. Directives do not create actionable rights for individuals”.Footnote 47 Incorporating privacy obligations into the law instead of through a Directive can therefore better support the intended objective of recognizing rights. Establishing technologically neutral statutory rights related to automated decision-making would not preclude developing guidance to operationalize these rights. Quite the contrary, it would provide a useful framework for responding to new technology and novel uses of automated decision-making, and would therefore facilitate a consistent, principled approach.

The DADM includes specific obligations that are important in addressing privacy risks, including human intervention and meaningful explanations of decisions. These were also part of our recommendations for AI regulation under PIPEDA, which were the culmination of our public consultation on AI. While the specific risk-based manner in which these obligations are implemented under the DADM may not be suitable in law, we recommend that individuals be given the statutory right to contest automated decisions before a person, and the right to receive explanations. These rights are particularly important in the public sector context to respect natural justice and procedural fairness.

A right to a meaningful explanation would be similar to what is found in Articles 13, 14 and 15 of the GDPR, which require data controllers to provide individuals with “meaningful information about the logic involved” in any automated decision-making, including profiling.Footnote 48 We also recommend the law define a standard for the level of explanation required, such that individuals can understand (i) the nature of the decision to which they are subject and the relevant personal information relied upon, and (ii) the rules that define the processing and the decision’s principal characteristics. However, where trade secrets or security classification may prevent such an explanation from being provided, organizations could rely on the following three elements to provide an adequate explanation: (i) the type of personal information collected or used, (ii) why the information is relevant, and (iii) its likely impact on the individual.Footnote 49

The ultimate objective of a right to meaningful explanation is to address potential scenarios where black box algorithms and unknown personal information are used to automatically make decisions about an individual. It provides an avenue of recourse and respects basic human dignity by ensuring that the organization is able to explain the reasoning for the particular decision in understandable terms.Footnote 50

Demonstrable accountability for automated decision-making

Other proposed accountability measures include the obligation to undertake a privacy impact assessment (PIA) for automated or manual profiling activities that involve sensitive personal information, or other activities involving a high risk for personal information.Footnote 51 Although not specific to AI or automated decision-making, Justice also proposes adding an obligation to design programs and activities with the protection of personal information in mind (also known as “privacy by design”).Footnote 52 These obligations, along with a broadened scope of “administrative purpose” under the Act to ensure it applies to the design and development of artificial intelligence systems, are positive elements of a demonstrable accountability framework for AI.

The obligation to consider privacy by design would be consistent with modern legislative approaches, such as that found in the GDPR and Bill 64.Footnote 53 Moreover, PIAs are useful tools when designing for privacy and human rights in AI, and they support demonstrable accountability by allowing the privacy regulator to review the documented assessments, which can show the due diligence the organization took before implementing the AI activity.

More specifically in the context of automated decision-making, a requirement for what is known as “traceability” would allow a government department to locate the personal information that a machine used to reach a decision. This would support the previously mentioned rights to explanation and human intervention, and help guard against black-box decision-making. Such accountability is particularly important in a public-sector context. Quebec’s Bill 64 and recent amendments to Ontario’s PHIPA include requirements related to traceability. This recommendation also received wide multi-stakeholder support in the OPC’s public consultation on AI.

Inferences as a collection

We are also supportive of the proposal to make it clear that drawing inferences about an individual would qualify as a collection of personal information. This is key for protecting human rights because inferences can yield revelations of considerable depth, such as those relating to political affinity, interests, financial class, or race, and can often be drawn about an individual without their knowledge and used to make decisions about them. Like all other personal information, inferences would then be subject to the limiting collection threshold and its accompanying test to ensure that they are “reasonably required” for a given purpose and fair and proportionate in the context.

Overall, to mitigate the risks of privacy violations associated with automated decision-making, as well as give meaning to individual rights, the Act will need to define automated decision-making so that specific protections can apply to it.


  • The law should define automated decision-making.
  • The law should include a right to meaningful explanation and human intervention related to the use of automated decision-making, as currently supported by the TBS Directive on Automated Decision-Making.
  • A specific standard should be denoted for the level of explanation required, so as to allow individuals to understand: (i) the nature of the decision to which they are being subject and the relevant personal information relied upon, and (ii) the rules that define the processing and the decision’s principal characteristics.
  • Where trade secrets or security classification prevent such an explanation from being provided, the following should at least be provided: (i) the type of personal information collected or used, (ii) why the information is relevant, and (iii) its likely impact on the individual.
  • The law should contain an obligation for institutions to log and trace personal information used in automated decision-making.

Disclosures and information-sharing

There are presently more than a dozen provisions in section 8 of the Act that allow for sharing of personal information without the prior consent of the individual(s) in question. Some of these exceptions for sharing are broad, for example, for any other purpose set out in federal law or regulation (8(2)(b)). Others are narrow and specific, for example, the collection of a debt (8(2)(l)). Besides this wide variance in the scope of sharing, there are also considerable differences in the accountability and transparency mechanisms attached to these provisions.

The consultation paper proposes to add further exceptions, and to make modifications, which depending on specific wording and regulation, could have the effect of broadening existing provisions for disclosure under the Act.

Given the breadth of disclosures contemplated in the discussion paper, we have summarized our comments and recommendations in relation to a number of specific provisions:

Proposal: Consistent uses

OPC comment: Consistent use, under 8(2)(a) of the Act, is frequently cited to the OPC as an authority for disclosure or use. We have noted in the past that federal institutions can interpret the concept too broadly (i.e. purposes they deem to be consistent may, in fact, only be loosely related). The Justice proposal as framed is likely to bring more rigour to this provision, which is welcome. We are highly supportive of the addition of criteria to highlight appropriate considerations as proposed, including the link between the original and the updated purpose; the context in which the personal information was originally collected; the nature of the personal information; possible consequences and benefits for individuals; and the existence of appropriate safeguards or risk mitigation measures.

We believe that transforming existing notification requirements into a record-keeping requirement, with the Privacy Commissioner having the ability to proactively audit these records, as proposed by Justice, maintains appropriate demonstrable accountability for consistent uses.

Proposal: Disclosures for data integration

OPC comment: This proposed new exception for disclosures without consent is described as allowing the Government to share, link, and analyze data to obtain new insights to support service-delivery initiatives, policy development, system planning, resource allocation and performance monitoring. We agree this is a desirable objective. However, there should be appropriate constraints and safeguards for what could entail bulk disclosures of sensitive information and the aggregation of multifaceted ‘profiles’ on individuals.

The consultation paper refers to the data integration provisions of Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA), but does not indicate whether safeguards and oversight measures such as those set out in FIPPA would apply. We recommend that they do, given the depth and breadth of disclosures that could be involved.

Such safeguards include the adoption of data standards established with the data protection authority, annual reporting on data integration activities, the use of technical safeguards (including de-identification and encryption), data retention limits, and a specific defined role for the Privacy Commissioner, including enforcement powers.

The OPC supports making these requirements explicit preconditions in the Act (as is the case in s. 49.3(1) and (2) of FIPPA), rather than having them factor into whether data integration is “reasonably required”.

Proposal: Disclosures to next of kin in certain cases for compassionate reasons

OPC comment: We agree there is merit in adding this exception as suggested by Justice.

Proposal: Disclosures for emergencies or serious threats to public or individual safety

OPC comment: We believe that adding this exception would be justified given the extenuating circumstances in which the need to make such a disclosure may arise. However, similar provisions in analogous provincial legislation focus on specific threats to the health and safety of individuals, rather than on risk to the public generally, and contain language recognizing the importance of privacy despite the existence of a serious threat.Footnote 54 Ideally, there would be limiting language that specifies the exigent purpose of the disclosure while at the same time recognizing the importance of the privacy of individuals.

Proposal: Clarifying 8(2)(c) to “support courts and tribunal cases”

OPC comment: Under the current Act, s. 8(2)(c) authorizes the disclosure of information when required by subpoena, warrant or order. The addition of other judicial authorities does not change the nature of the provision, although it expands its scope. We generally support this provision, provided that disclosure remains limited to information relevant to the litigation.

Proposal: Replacing the investigative bodies regime under 8(2)(e) with a model like that under PIPEDA

OPC comment: We agree with the Department of Justice’s proposed elimination of pre-approval procedures for investigative bodies, but note that it could significantly increase the number of investigative bodies and the disclosure of personal information. It will therefore be important that the definition of “investigative body” remain narrow. If the pre-approval requirements are eliminated, we recommend that the new record-keeping requirements being considered for “consistent use” and “public interest” disclosures be expanded to apply to this provision (for both the disclosing and recipient institutions), and that annual reporting on the use of the provision be considered.

Proposal: Distinguishing information-sharing with foreign governments from other information-sharing under 8(2)(f)

OPC comment: Our Office supports the Justice proposal concerning section 8(2)(f), that disclosures to foreign governments be distinguished from information sharing with other bodies. We also agree that information sharing should be subject to written information sharing agreements with minimum base requirements. However, the consultation document does not mention review of, or consultation with the OPC on, information-sharing agreements.

As we have noted to the ETHI Committee, we continue to believe the Act should “require that all information sharing under paragraphs 8(2)(a) and (f) of the Privacy Act be governed by written agreements and that these agreements include specified elements. Further, all new or amended agreements should be submitted to the OPC for review, and existing agreements should be reviewable upon request”.Footnote 55

Proposal: New disclosure provision mirroring 8(2)(i) for Statistics Canada

OPC comment: This new provision is described as one that extends to Statistics Canada, for “statistical and research purposes,” the disclosure authority already granted to Library and Archives Canada for archival purposes. While the use of information for archival purposes entails a measure of privacy risk (through its prolonged retention), it is significantly different from the activities of Statistics Canada (creating data linkages for statistical purposes), which carry considerably greater privacy risks.Footnote 56 Also missing from the proposal are the protective measures provided for in the current research disclosure exception.Footnote 57

Under such an exception, there would seemingly be no limit to what personal information Statistics Canada could collect, irrespective of whether there is a defined statistical or research purpose. It is also unclear why current legal authorities are insufficient for allowing institutions to disclose personal information to Statistics Canada as required. For these reasons, we recommend that this proposed exception not be adopted.

Proposal: Clarifying 8(2)(j) disclosures for research and statistics

OPC comment: While we support a new requirement for the head of an institution to specify the security and confidentiality measures to be contained in disclosure agreements, it is unclear whether and how this provision may overlap with data integration units, whose primary function would appear to be to conduct research and statistical analysis.

Clarification would be helpful in understanding the relationship between this provision and the provision dealing with data integration services.


  • For disclosures to data integration units, we recommend protections as set out in Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA), including: data standards, annual reporting, technical safeguards (including de-identification), data retention limits, and a specific role for the OPC, including enhanced enforcement powers.
  • For disclosures for emergencies or serious threats to public or individual safety, we recommend that any such provision include limiting language which specifies the exigent purpose of the disclosure and which recognizes the importance of the privacy of individuals.
  • For disclosures to investigative bodies which would no longer be pre-approved, we recommend that the term “investigative body” be narrowly defined, that accountability be strengthened through a record keeping requirement for both the disclosing and recipient institutions, that there be review by the OPC, as well as the potential for annual reporting on the use of the provision.
  • For disclosures to foreign governments, we recommend there be a requirement for written information sharing agreements, that such agreements be required to contain minimum base requirements set out in the law, and that institutions be required to submit agreements to the OPC for review.
  • We recommend against a provision modeled on the current 8(2)(i) to allow for disclosures to Statistics Canada for statistical or research purposes.

Challenging compliance

We support the proposal to introduce a “challenging compliance” principle into the Act. That said, to fully realize the benefits of such a principle, we recommend it be framed, as in PIPEDA or the CPPA, as requiring the institution itself to implement mechanisms to respond to privacy complaints. The purpose would not be to add administrative burden, but to provide a more efficient and timely way, for both institutions and individuals, to address individuals’ concerns. The more intensive process of filing a formal complaint with our Office could thus be avoided in many cases (or expedited, if the issue remains unresolved and a formal complaint is filed). It would also strengthen the effectiveness of the accountability principle by increasing the direct accountability of institutions to individuals who raise concerns.

In line with encouraging more efficient means of dealing with privacy matters, the upcoming review of the access provisions under the Privacy Act could also consider incentives for proactive or informal disclosures. Consideration could be given to how such proactive and informal disclosures might be paired with a correspondingly reduced burden for formal access requests for the same information. We would be pleased to discuss this issue further.


  • As part of a principle on challenging compliance, institutions should implement mechanisms to respond to privacy complaints, in the interests of efficient and timely resolution for citizens’ concerns where possible.
  • The Act should incentivize the use of informal mechanisms to provide individuals with access to their personal information held by government institutions.

IV – Maximizing regulatory effectiveness

As noted in our introduction, we strongly support the constructive proposals in the consultation paper relating to effective oversight by our Office. These proposals represent important and pragmatic progress. Accordingly, we focus on two residual issues: the scope of order-making powers and the reduction of non-discretionary obligations for our Office.

Recourse to Federal Court

We welcome the proposal to expand the scope of the Federal Court’s current review authority to include complaints relating to the collection, use, disclosure, retention or safeguarding of personal information. This proposal recognizes the importance of individuals having a right to effective redress.

We believe this right to a remedy could be further enhanced if it were not limited, as proposed, to only those matters considered to be unresolved. This would provide broader access to justice for individuals by allowing them to pursue matters in a wider range of situations, such as where the OPC declines to investigate.

A potential model is the scheme in the Official Languages Act (OLA). Section 77 of the OLA provides that, after a person has made a complaint to the Commissioner, they may apply to the Court for a remedy in three circumstances:

  1. Where the complainant has received the results of an investigation.
  2. Where the complainant has been informed of the Commissioner’s decision to refuse or cease to investigate the complaint.
  3. Where the complainant has not been informed of the result of the investigation or of a decision within six months after the complaint is made.


  • An individual’s right of recourse to the Federal Court should not be limited to unresolved matters and should instead be modeled on section 77 of the Official Languages Act (OLA).

Appropriate order-making power

The consultation paper proposes providing the OPC with the power to issue orders similar to those of the Information Commissioner, specifically relating to complaints concerning refusals of access to personal information.

We welcome Justice’s proposal to provide us with order-making powers. However, limiting these powers to cases of denial of access requests (as with the Office of the Information Commissioner) overlooks the differing context. While such orders provide an effective remedy to almost all affected individuals in the cases reviewed by the Office of the Information Commissioner, they would provide such a remedy to only a small fraction of the individuals affected by the cases reviewed by the OPC. Indeed, access complaints affect only a few hundred Canadians each year, whereas our Office's investigations into non-compliance with the collection, use and disclosure provisions can affect millions of Canadians.Footnote 58

We believe that the most relevant comparison is not with the Office of the Information Commissioner, but instead with the scope of powers conferred on other privacy authorities in Canada and abroad. Elsewhere in the world, privacy regimes based on a legal model like ours (in the United Kingdom, Ireland, Australia and New Zealand) allow authorities similar to ours, independent of government, to issue orders against public sector organizations and government agencies. Within Canada, similar regimes exist in Quebec, Alberta, British Columbia and, more recently, Ontario – where, under the data integration framework in its recently amended FIPPA, the Commissioner may order the unit to discontinue, change, or implement a new practice or procedure, or order the destruction of personal information.


  • OPC’s ability to issue orders should not be limited to denial of access matters but should also include matters involving the collection, use and disclosure of personal information by government institutions.

Minimizing avoidable non-discretionary obligations

We welcome proposals in the consultation paper to enhance the role of the Privacy Commissioner from both a proactive, advisory perspective, as well as from an enforcement perspective. An effective regulator is one that helps regulated entities comply with the law, but which also has the ability to take meaningful enforcement action when that fails.

An effective regulator is also one that has the ability to make choices about its priorities, activities and the allocation of its resources. Adding new responsibilities to an already extensive list of obligations risks exhausting the OPC’s finite resources, which may hinder the Office’s ability to effectively fulfill its mandate, to the detriment of Canadians.

It is in this vein that we have identified a number of areas under both the current and proposed regime where enhancements could be made that would provide the OPC with more discretion to manage its limited resources to the overall end of being a responsive and effective regulator.

Collaboration with regulatory counterparts

The OPC is pleased to see that the consultation paper proposes to expand our ability to collaborate with regulatory counterparts, as we have consistently called for. The OPC has been at the forefront of the movement toward greater collaboration between oversight bodies over the past decade, noting the importance of networked oversight given how much intelligence is shared between departments and how often multiple agencies work together on group taskforces or as integrated enforcement teams.

That said, there is a crucial legal distinction between flexibility and latitude to consult when reasonably necessary, and the obligation to consult in all instances that the paper appears to propose. We are concerned that such a requirement would result in the disclosure of personal information without a valid reason. It would also add a logistical and operational burden, affecting not only the OPC but also (if the requirement were made reciprocal, as we assume it would be for coherence) all the other oversight bodies implicated. Oversight agencies need the discretion to share information to benefit from each other’s expertise, avoid duplication of work, collaborate when warranted, and move quickly and independently when an issue is squarely in their own orbit. It is equally important that each oversight body maintain its independence while benefiting from the specialized knowledge the others bring to bear. The OPC’s status as an independent, arm’s-length agent of Parliament is critical to maintaining its credibility.


  • The Privacy Act should enable consultation with relevant oversight bodies and other regulators without imposing mandatory consultation.

Oversight of response to vexatious personal information requests

Echoing recent ATIA amendments, the consultation paper proposes giving institutions discretion to decline to respond to vexatious or abusive personal information requests. We agree that this is a reasonable provision, though care must be taken that it is applied appropriately.Footnote 59 Currently, Justice proposes the relatively extraordinary oversight mechanism (not in place for any other provision in the Privacy Act) of requiring institutions to obtain direct OPC approval of each such decision. While we are not strongly opposed to this oversight regime, we would respectfully suggest that it may be more appropriate to apply the same oversight model as for other denials of access. Specifically, such decisions could be made by the responsible institutions but be subject to ex post facto oversight by the OPC through the standard complaint mechanisms, reinforced by the order-making powers for access matters proposed by Justice, to guard against potential abuse.

We note that the more extraordinary oversight regime proposed by Justice is similar to that introduced in the ATIA in response to concerns, raised by a range of stakeholders, that institutions could make extensive use (and potentially abuse) of this provision. However, we are not aware of any indication to date that this has materialized.Footnote 60 In Ontario, where similar decisions can be made by the heads of provincial and municipal institutions, subject to normal oversight by the Ontario OIPC, appeals relating to denial of access for ‘frivolous or vexatious’ requests under Ontario’s FIPPA and MFIPPA account for fewer than 2% of all access-related appeals.Footnote 61


  • Decisions to decline vexatious requests should be made by individual institutions, while remaining subject to OPC review via complaints from individuals or the Commissioner’s discretionary investigative and audit powers.

Ongoing obligation to review FINTRAC safeguards

A current problematic example of such an ongoing constraint on the OPC is section 72(2) of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (PCMLTFA), which requires the Privacy Commissioner to review, every two years, the measures taken by FINTRAC to protect the information it receives or collects. An effective regulator should not be auditing a very narrow slice of a single organization’s activities in this manner, as the OPC does with FINTRAC. We already have the authority under section 37 of the Privacy Act to review the practices of any government institution, which extends to FINTRAC; the provision in the PCMLTFA is therefore superfluous, and this duplication imposes an inefficient constraint on scarce resources.


  • An amendment to the PCMLTFA should be made, consequential to any new Privacy Act provisions, to remove the requirement for our Office to conduct biennial reviews of FINTRAC, on the basis that we already have authority under section 37 of the Privacy Act to review the practices of all institutions at our discretion.

OPC review of privacy impact assessments

We strongly support integrating PIA obligations into the Act, including the proposed obligation to share PIAs with the OPC for review and recommendations. However, given that the volume of PIAs produced by institutions is likely to be very significant, even with the proposed risk-based approach, requiring the OPC to provide recommendations within a specific timeline would be very resource intensive. In addition to appropriately resourcing this function, discretion will be important to ensure that reviewing PIAs on a timely basis does not overwhelm limited regulatory resources, and that the OPC is able to target its PIA reviews, on a priority basis, to matters of the highest risk to privacy.

It is important to note that under the current policy-level PIA framework, the OPC is not required to issue recommendations on all PIAs received. Nevertheless, all PIAs are reviewed for risk-based triaging and serve the valuable function of informing the OPC of government practices and associated privacy risks. Done correctly, PIAs serve as an institution’s internal risk identification and mitigation exercise and should not require OPC intervention in every case. They are also meant to be evergreen, meaning OPC comment on each iteration may not be necessary, despite the benefit of such updates in keeping the OPC apprised of government activities.

  • There should be a legal obligation for institutions to prepare and submit privacy impact assessments to the OPC with OPC retaining discretion on whether or not recommendations will be issued.

Legislative and regulatory consultation

Currently, there is a policy level requirement to notify the OPC of any planned initiatives, including legislation, regulations, policies, and programs that relate to the Act or that may have an impact on the privacy of Canadians.

Several provincial and international laws now set out an explicit requirement for institutions to consult their data protection authority as they prepare new bills. For example, in Newfoundland and Labrador, the Access to Information and Protection of Privacy Act requires consultation with the Commissioner on a proposed bill that could have implications for access to information or protection of privacy, as soon as possible before, and not later than, the date of notice to introduce the bill in the House of Assembly.Footnote 62 The Commissioner must advise the Minister as to whether the proposed bill has implications for access to information or protection of privacy, and can comment publicly on a draft bill. Similarly, the GDPR requires that “Member States shall consult the supervisory authority during the preparation of a proposal for a legislative measure to be adopted by a national parliament or of a regulatory measure based on such a legislative measure, which relates to the processing of personal data”.Footnote 63

In the same vein, we recommend that the OPC receive advance notice of, and be consulted on, any proposed legislation or regulation that may have privacy implications, prior to tabling.

Providing the Privacy Commissioner with discretion in its other functions would free resources to devote to privacy-impactful measures such as engaging in consultations of this nature.

  • Government institutions should be required to consult with OPC on draft legislation and regulations with privacy implications before they are tabled.