Privacy guidance on facial recognition for police agencies

May 2022

Overview

  1. Facial recognition (FR) has emerged as a powerful technology that can pose serious risks to privacy.
  2. Canada’s federal, provincial and territorial privacy commissioners are of the opinion that the current legislative context for police use of FR is insufficient. In the absence of a comprehensive legal framework, there remains significant uncertainty about the circumstances in which FR use by police is lawful.
  3. This guidance is issued jointly by the privacy protection authorities for each province and territory of Canada, and the Office of the Privacy Commissioner of Canada. The purpose of this guidance is to clarify police agencies’ privacy obligations relating to the use of FR under current laws, with a view to helping ensure that any use of the technology complies with the law, minimizes privacy risks, and respects privacy rights.
  4. This guidance should not be understood as opining on the legality of specific uses of FR by police. Each federal, provincial, and territorial privacy commissioner reserves the right to make determinations about what applications of FR are permissible within their jurisdiction and under what circumstances, in consideration of their respective laws.

Scope

  1. This guidance is for federal, provincial, regional and municipal police agencies. It was not written for other public organizations outside of the police that are involved in law enforcement activities (for example, border control), nor for private-sector organizations that carry out similar activities (for example, private security). However, these organizations must still ensure their compliance with all applicable laws, including privacy and human rights laws. Sections of this guidance may be helpful for that purpose.

Introduction

  1. FR technology has emerged as a tool of significant interest to police agencies. Used responsibly and in the right circumstances, FR may assist police agencies in carrying out a variety of public safety initiatives, including investigations into criminal wrongdoing and the search for missing persons.
  2. At the same time, FR has the potential to be a highly invasive surveillance technology.
  3. The use of FR involves the collection and processing of sensitive personal information: biometric facial data is unique to each individual, unlikely to vary significantly over time, and difficult to change in its underlying features. This information speaks to the very core of individual identity, and its collection and use by police supports the ability to identify and potentially surveil individuals.
  4. FR technology also scales easily, costs relatively little to use, and can be deployed as an add-on to existing surveillance infrastructure. This includes the capacity to automate extraction of identifying information from a wide range of sources, including virtually any source of digital imagery (such as video surveillance and body worn cameras), both online and off.
  5. The prospect of police agencies integrating FR technology into law enforcement initiatives thus raises the possibility of serious privacy harms unless appropriate privacy protections are put in place.
  6. The freedom to live and develop free from surveillance is a fundamental human right. In Canada, public sector statutory rights to privacy are recognized as quasi-constitutional in nature, and aspects of the right to privacy are protected by sections 7 and 8 of the Canadian Charter of Rights and Freedoms (the Charter). These rights dictate that individuals must be able to navigate public, semi-public, and private spaces without the risk of their activities being routinely identified, tracked and monitored. While certain intrusions on this right can be justified in specific circumstances, individuals do not forgo their right to privacy, including their anonymity, merely by participating in the world in ways that may reveal their face to others, or that may enable their image to be captured on camera.
  7. Privacy is also necessary for the realization of other fundamental rights that are protected by the Charter. Privacy is vital to dignity, autonomy, and personal growth, and it is a basic prerequisite to the free and open participation of individuals in democratic life. When surveillance increases, individuals can be deterred from exercising these rights and freedoms.
  8. Surveillance is also linked with systemic discrimination, including discrimination experienced by racialized communities. Longstanding concerns about the disproportionate policing of racialized communities raise serious questions about the privacy and human rights impact of applying FR technology to, for example, historical datasets such as mugshot databases. When considering the impact of FR technology on individual privacy then, police agencies must also account for and respect the right to the equal protection and equal benefit of the law without discrimination.
  9. If used inappropriately, FR technology may therefore have lasting and severe effects on privacy and other fundamental rights. This includes harms to specific individuals whose personal information may be collected, processed, or disclosed. It also includes harms to groups and communities and more general societal harms that flow from increases in the capacity of authorities to monitor the physical and virtual spaces in which we interact. Such increases can be difficult to contain once they are set in motion.
  10. The nature of these risks calls for collective reflection on the limits of acceptable FR use. These limits are defined not only by the risks associated with specific FR initiatives, but also by the aggregate effects of all such initiatives, taken together over time, on the general surveillance of public and private space. The question of where acceptable FR use begins and ends is in part, then, a question of the expectations we set now for the future protection of privacy in the face of ever-increasing technological capabilities to intrude on Canadians’ reasonable expectations of privacy.
  11. The process of establishing appropriate limits on FR use remains incomplete. Unlike the collection of other biometrics by police agencies, such as photographs, fingerprints or DNA profiles, FR use is not subject to any focused statutory rules. Instead, its use is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by FR. This creates room for uncertainty concerning what uses of FR may be acceptable, and under what circumstances.
  12. It is in this context that our offices issue the present guidance document. The guidance is meant to clarify legal responsibilities, as they currently stand, with a view to helping to ensure that any use of FR by police agencies complies with the law, minimizes privacy risks and respects the fundamental human right to privacy. This guidance should not be read as justifying, endorsing or approving the use of FR by police agencies. Nor does it replace the broader need for a more robust regulatory framework for FR.
  13. While this document addresses many legal requirements that pertain to FR use, it does not necessarily address all such requirements in every situation. Police agencies remain responsible for ensuring that any use of FR complies with all applicable legal requirements in any particular circumstances.

FR technology

  1. FR technology is a type of software that uses complex image processing techniques to detect and analyze the biometric features of an individual’s face in order to identify the individual or to verify (also known as “authenticate”) their identity. While early versions relied on humans to manually select and measure the landmarks of an individual’s face, today the process of creating a facial template or “faceprint” is fully automated. Using advanced “deep learning” algorithms trained on millions of examples, FR technology is able to create three-dimensional faceprints consisting of close to a hundred biometric features from two-dimensional images.

How is FR used?

  1. Identification and verification have specific meanings within the context of FR. Identification is used in the investigative sense of determining the identity of an otherwise unknown individual. Here, FR compares the image inputted into the system (also known as the “probe” image) against all other images in a database of pre-enrolled faces in an attempt to learn the individual’s identity. This is sometimes referred to as “1:N” matching.
  2. Verification is a special case of identification. It is used primarily for security purposes in cases where an identity claim is already attached to the probe image. Rather than comparing against multiple images, FR compares the probe image to the single image in the database corresponding to the claimed identity. If they match, the individual’s claimed identity is confirmed with a higher level of assurance. In contrast to identification, verification is sometimes referred to as “1:1” matching.
  3. This guidance is focussed primarily on FR use for the purposes of identification. While verification is a common use of FR in general (e.g., to unlock one’s phone), the mandate of police agencies aligns more closely with identification.
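To make the distinction concrete, the following simplified sketch illustrates the two modes in code. It is illustrative only and does not depict any actual FR product: faceprints are represented as plain numeric vectors, and the similarity function is a toy stand-in for a vendor’s comparison algorithm (similarity scores and thresholds are described further in the next section).

```python
from typing import Dict, List, Tuple

def similarity(a: List[float], b: List[float]) -> float:
    """Toy comparator: map the distance between two faceprint vectors
    onto a 0..1 score, where higher means more alike."""
    distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + distance)

def identify(probe: List[float], database: Dict[str, List[float]],
             threshold: float) -> List[Tuple[str, float]]:
    """1:N matching: compare the probe faceprint against every enrolled
    faceprint and return candidates meeting the threshold, best first."""
    scored = [(name, similarity(probe, fp)) for name, fp in database.items()]
    return sorted([c for c in scored if c[1] >= threshold],
                  key=lambda c: c[1], reverse=True)

def verify(probe: List[float], claimed_identity: str,
           database: Dict[str, List[float]], threshold: float) -> bool:
    """1:1 matching: compare the probe faceprint against the single
    faceprint enrolled under the claimed identity."""
    return similarity(probe, database[claimed_identity]) >= threshold
```

In both modes the underlying comparison is the same; what differs is the number of database entries consulted and whether an identity claim accompanies the probe.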

How does FR work?

  1. FR has a number of components that each play a role in determining how it functions in a particular set of circumstances. Depending on the FR product in use, some components may be configurable by the end user. However, in cases where FR is purchased from a vendor rather than built in-house, the functionality of some components may be hard-coded into the product itself and can only be changed by switching products or receiving an updated version.
  2. The following list provides a brief description of the key components police agencies should be aware of when deploying FR in a law enforcement context.
  3. Training data. The image processing algorithms that power FR are generated using machine learning methods that require a large number of labelled examples of individuals’ faces for training. This set of labelled examples is known as the “training data” of the algorithm. The machine learning process works by successively tuning the parameters of a mathematical model to best fit the structure of the training data. This reliance on training data has both advantages and disadvantages. On the one hand, the FR algorithm is able to “learn” to detect the distinguishable features of human faces, without the need for explicit programming. On the other hand, any systemic flaws or deficiencies in the training data may be learned and ultimately reproduced in the algorithm.
  4. Algorithms. FR works by performing a series of discrete tasks. There are four key tasks to be aware of, each of which is automated using an algorithm. However, taken together, they form one overarching algorithm for the system. Their work may be described as follows:
    • A face detector scans an image and picks out the faces in it
    • A faceprint generator takes an image of a face and creates a faceprint of it
    • A faceprint comparator compares two faceprints and returns a similarity score
    • A faceprint matcher searches a database of faces and (using the faceprint comparator) returns a list of candidates whose similarity score is at or above a given threshold
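Purely as an illustration of how these tasks compose, the sketch below extends the previous one (reusing its toy similarity and identify functions); the detector and generator are placeholders, since real implementations are trained models that cannot meaningfully be reproduced here.

```python
from typing import Dict, List

def detect_faces(image) -> List[object]:
    """Face detector: return the face regions found in an image.
    Placeholder: a real detector is a trained model."""
    raise NotImplementedError("stand-in for a vendor face-detection model")

def generate_faceprint(face) -> List[float]:
    """Faceprint generator: encode a detected face as a numeric template.
    Placeholder: a real generator is a trained model."""
    raise NotImplementedError("stand-in for a vendor faceprint model")

def search_image(image, database: Dict[str, List[float]], threshold: float):
    """Overarching algorithm: detect each face, generate its faceprint,
    then run the faceprint matcher (which in turn relies on the
    comparator) to return a candidate list per detected face."""
    return [identify(generate_faceprint(face), database, threshold)
            for face in detect_faces(image)]
```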
  5. Probe image. FR takes as input one or more images of individuals whose identities it then attempts to discover or verify. This image is known as a “probe” image. The manner in which a probe image is inputted into a FR system for identification may vary. When there is a significant delay between the initial collection and subsequent comparison of the image, this is referred to as “post” FR. When the collection and comparison happen instantaneously, or close to it, this is referred to as “live” or “real-time” FR. A combined, special case is also possible: there is a significant delay between collection and comparison, but the input into the system is nonetheless done automatically, without the exercise of human discretion. This may be referred to as “quasi-live” or “quasi-real-time” FR.
  6. Face database. To identify or verify the identity of an individual in a probe image, FR must have access to a database of identified faces against which to match the image of the individual in question. Usually, the face database in a FR initiative is provided by the end user. In the law enforcement context, examples may include a mugshot database or missing persons database. However, some FR vendors have attempted to compile their own databases, typically of images taken from the Internet, and include use of them as part of their product, the legal basis for which is far less clear.Footnote 1
  7. Faceprint. After detecting the various features of an individual’s face, FR measures them and encodes the result in a template of numerical values called a “faceprint.” A faceprint is a biometric, similar to a fingerprint template—a set of unique physical characteristics inherent to an individual that cannot be easily altered. Examples of biometric features encoded in a faceprint may include:
    • Distance between eyes
    • Width of nose
    • Distance between nose and lips
    • Depth of eye sockets
    • Shape of cheekbones
    • Length of jaw line
  8. Similarity score. Faces vary widely: two faces may share virtually no features, while others may be similar or even identical in some respects but not in others. Even the same face may look different depending on the circumstances, such as lighting, orientation angle or the amount of time that has passed between images. To express these degrees of resemblance, FR calculates a “similarity score,” also sometimes referred to as a “confidence score.” This is a numerical value representing the degree of similarity between two faceprints, based on the biometric features encoded in them. A lower value indicates less similarity; a higher value, more.
  9. Threshold. Even though two faceprints may have a positive similarity score, only those that meet or exceed a given threshold are considered potential matches. Some FR products allow the end user to set the threshold; others do not. How the threshold is set directly affects the number of results returned in a given search, with implications for the accuracy, including error rates, of the FR algorithm. Depending on the circumstances, some implementations may require higher thresholds than others.
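The effect of the threshold can be seen in a small worked example. The scores and threshold values below are invented for illustration and do not reflect any real FR product:

```python
# Invented scores from comparing one probe faceprint against five
# enrolled faceprints.
scores = {"A": 0.91, "B": 0.64, "C": 0.43, "D": 0.38, "E": 0.12}

for threshold in (0.4, 0.6, 0.9):
    candidates = {name: s for name, s in scores.items() if s >= threshold}
    print(f"threshold={threshold}: {len(candidates)} candidate(s) {candidates}")

# Raising the threshold returns fewer, higher-scoring candidates (fewer
# false positives) at the cost of missing weaker genuine matches (more
# false negatives); lowering it does the opposite.
```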
  10. Additional FR components not mentioned in the list above include quality assessment and impersonation detection.Footnote 2

Privacy framework

  1. There is a wide range of potential uses and applications of FR, whose risks fall along a spectrum that extends to extremely serious intrusions on privacy. Many of these risks may be difficult to mitigate, and can cause significant harm to individuals and groups.
  2. When considering the use of FR technology, it is therefore imperative that police agencies not only ensure they have lawful authority for the proposed use, but also that they apply standards of privacy protection that are proportionate to the potential harms involved. In some cases, potential harms may be so extreme that no amount of protections can be applied to adequately reduce the privacy risk. In other cases, it may be possible to appropriately manage risks through careful planning and diligent application of privacy protections.
  3. The framework outlined below is intended to assist police agencies in ensuring FR use is legal and developed with privacy protections that are proportionate to the specific risks involved. It is based on the application of internationally-accepted privacy principles, many of which are reflected in privacy laws. While specific legal obligations may vary by jurisdiction, we expect all police agencies to comply with the law and recommend they follow the best practices included in this framework given the high risk of harm that can result from the inappropriate use of FR technology.
  4. Police agencies – alongside their policing oversight bodies (e.g. police services boards or police commissioners) – are ultimately responsible for ensuring any use of FR technology is lawful and that privacy risks are managed appropriately. This guidance provides a baseline from which to build privacy protections into FR initiatives; police agencies may need to implement additional privacy protections depending on the nature and scope of risks to privacy posed by a specific initiative, and in accordance with the laws of the applicable jurisdiction.

Lawful authority

  1. Police agencies must have the legal authority to use FR, and use it in a manner that respects the privacy rights of Canadians. This section discusses both potential sources of legal authority for the use of FR by police agencies, as well as limits on those potential uses.
  2. Police agencies should obtain a legal opinion on whether they have the authority to implement or operate a proposed FR program, and on whether such a program adequately respects individuals’ rights. A proposed program cannot proceed unless both conditions are met: the agency has lawful authority, and the program adequately respects individuals’ rights.
  3. Canadian jurisdictions do not yet have legislation specifically addressing FR technology, with the exception of Quebec, which has enacted legislation governing biometrics.Footnote 3
  4. Since FR involves the collection and use of personal information, it is subject to applicable privacy legislation. Police agencies must also determine whether FR is compliant with the Charter and human rights laws.Footnote 4 The extent to which these laws permit police use of FR is unclear.

Sources of legal authority

  1. There is no specific legal framework for FR use in Canada. Rather, the legal framework comprises a patchwork of statutes and the common law. These include federal and provincial privacy laws, statutes regulating police powers and activities,Footnote 5 and Charter jurisprudence.
  2. As described in the previous section, FR requires that personal information be collected and used at multiple stages, such as: training a FR algorithm, creating a face database, collecting probe image(s) to be compared against that database, and possibly others. Lawful authority must exist for all steps that implicate personal information. Additionally, where police use vendors or third parties to supply FR services, including FR databases, police must ensure that these suppliers have the lawful authority to collect and use the personal information contained in their services.
  3. Sources of legal authority can include statutes or common law. Please note that the following discussion is primarily for illustrative purposes, and should not be interpreted as a comment on the validity or scope of potential legal authorities.

Judicial authorization

  1. Police agencies may seek and obtain judicial authorization to collect and use faceprints in circumstances that merit such action. Section 487.01 of the Criminal Code provides for warrants that permit intrusion on individuals’ privacy where a judge is satisfied that there are reasonable grounds to believe that an offence has been or will be committed and that information concerning the offence will be obtained through the use of the technique or device; that it is in the best interests of the administration of justice to issue the warrant; and that there is no other statutory provision that would authorize the technique or device.Footnote 6 These authorizations are subject to the usual requirements for obtaining a warrant, in addition to any conditions or limitations imposed by courts when granting them.

Statutory authority

  1. Police agencies may also find authority for their actions in specific statutes. For instance, the Identification of Criminals Act allows police agencies to fingerprint or photograph individuals charged with, or convicted of, certain crimes for the purposes of identification.Footnote 7 It also permits these identifiers to be published for the purposes of providing information to officers and others engaged in the administration or execution of the law. The Identification of Criminals Act does not, however, authorize the indiscriminate collection of photographs of other individuals at the broader population level. Legal advice would be required to determine if, and in what circumstances, this Act provides a legal basis for a specific use of FR, including as applied to existing mugshot databases associated with the Act.

Common law authority

  1. Police agencies have a crucial role in furthering public interests such as the preservation of peace, the prevention of crimes, and the administration of justice.Footnote 8 The common law, like statutory authorities, can authorize police actions that infringe on individual liberties in the pursuit of these societal goals. The term “liberty” in discussion of common law police powers encompasses constitutional rights and freedoms such as privacy, discussed further below.Footnote 9 For instance, courts have recognized the authority of police at common law to conduct warrantless searches in exigent circumstances.Footnote 10
  2. Canadian courts have set out limitations on police powers provided by common law.Footnote 11 In order for a police action to be authorized by common law it must:
    1. Fall within the general scope of a statutory or common law police duty; and
    2. Involve a justifiable exercise of police powers associated with that duty.Footnote 12
  3. The second requirement entails an assessment of whether the police action “is reasonably necessary for the fulfillment of the duty” and considers three factors:
    1. The importance of the performance of the duty to the public good;
    2. The necessity of the interference with individual liberty for the performance of the duty; and
    3. The extent of the interference with individual liberty.Footnote 13
  4. These factors require that the interference with liberty be necessary given the extent of the risk and the liberty at stake, and no more intrusive to liberty than reasonably necessary to address the risk.Footnote 14
  5. Judicial consideration of police use of FR has so far been limited, and Canadian courts have not had an opportunity to determine whether FR use is permitted by common law.Footnote 15 If FR use interferes with individuals’ reasonable expectations of privacy (“REP”) and is not enabled by a statute or the common law, authorization under section 487.01 of the Criminal Code will generally be required for its use.

Respecting Canadians’ rights

  1. While police agencies require legal authority to proceed with a FR program, they must also protect the rights of individuals. Protections can be found in the Charter, as well as federal and provincial privacy legislation.

The Canadian Charter of Rights and Freedoms

  1. The Charter gives individuals the right to be secure against unreasonable searches and seizures by police.Footnote 16
  2. To determine whether police action constitutes an unreasonable search, a court first considers whether a search occurred. This depends on whether an individual had a REP within the context of the possible search, with consideration given to the totality of circumstances. The analysis of whether a search occurred is based on a number of interrelated factors such as the subject matter of the search and the nature of the individual’s privacy interests therein.
  3. In the context of FR, both the collection of a probe image and the collection of images for a face database must comply with the Charter. Given that these collections may raise different considerations, they may need to be assessed separately with respect to whether they engage a REP.
  4. In general, the REP assessment in the context of FR will likely need to consider, among other things, an individual’s privacy interests in the source photographs, the manner and circumstances in which they were collected, and the purposes for which they were collected. The assessment should also take into account the faceprints police generate or use and any additional information about an individual’s actions and whereabouts that may be revealed when connecting personal identity, video footage, metadata (such as date and time stamps) and other identifying information.Footnote 17 The existence of a REP also depends on the context in which FR use occurs, which informs whether an individual holds a subjective expectation of privacy. Courts additionally consider whether that expectation is objectively reasonable, informed by the levels of privacy individuals ought to expect in a free and open society; expectations of privacy are thus not just descriptive, but also normative in nature.Footnote 18
  5. While the assessment of whether a particular use of FR engages a REP will depend on the totality of the circumstances, it is clear that FR uses sensitive personal information that generally cannot be changed, and can be used to identify people for various privacy sensitive purposes. Furthermore, individuals’ faces being publicly visible does not automatically negate a REP. Individuals do not expect to be the subject of surveillance when going about their normal and lawful activities, and generally maintain some degree of a REP even when in public spaces.Footnote 19 Conversely, even if an individual might expect FR to be used, greater privacy invasions do not become socially acceptable simply due to technological advancements or evolving police practices.Footnote 20
  6. If a search is found to have occurred, a court will then determine whether the search was reasonable. In order for a search to be reasonable, it must be authorized by a reasonable law, and be conducted in a reasonable manner.

Privacy legislation

  1. Privacy legislation sets out the conditions under which public agencies may collect, use, disclose, and retain individuals’ personal information. Public institutions, including police agencies, are generally permitted by privacy legislation to collect personal information for valid purposes.Footnote 21 For example, some provincial public sector privacy legislation permits the collection of personal information for “law enforcement” purposes.Footnote 22 Police agencies would need to determine whether the collection of personal information by FR falls within the scope of “law enforcement” as defined by the legislation, or is otherwise within the scope of one of the other permitted purposes for collection of personal information set out in the legislation. Under federal legislation, the collection of personal information must relate directly to an operating program or activity of the federal institution collecting the personal information.Footnote 23 This means that federal institutions must ensure that they have parliamentary authority for the program or activity for which the information is being collected.Footnote 24
  2. While the requirements for valid collection of personal information vary by jurisdiction, the privacy principles discussed below are found in many privacy laws that will apply. Once collected, police agencies may generally only use personal information for the purpose for which the information was collected or compiled, or for a use consistent with that purpose, unless otherwise authorized. Compliance with privacy statutes does not necessarily cure any legal defect that may exist under the Charter.Footnote 25

Necessity and proportionality

  1. The privacy principles of necessity and proportionality ensure that privacy-invasive practices are carried out for a sufficiently important objective, and that they are narrowly tailored so as not to intrude on privacy rights more than is necessary. In the case of law enforcement, there is a clear public interest in ensuring public safety, but also in protecting individuals’ fundamental right to privacy. While the right to privacy is not absolute, neither can the pursuit of public safety justify any form of rights violation. Therefore, police may only use means justifiable in a free and democratic society.
  2. The principles of necessity and proportionality exist in varying degrees in privacy laws, the common law test for police powers,Footnote 26 and the Charter.Footnote 27 In some jurisdictions, privacy laws require that public bodies, including police agencies, only collect personal information if it is “necessary” or “reasonably necessary” for their purposes.Footnote 28 In other jurisdictions, necessity and proportionality are considered a policy requirement or best practice under applicable privacy laws.Footnote 29 Whether as a strict legal requirement or as a best practice, in the context of FR use, necessity and proportionality will generally require an assessment of the following:
  3. Necessary to meet a specific need: Rights are not absolute, and can be limited where necessary to achieve a sufficiently important objective.Footnote 30 Necessary means more than useful. Significantly, the objective of an FR program must be defined with some specificity. It is not enough to rely on general public safety objectives to justify such an intrusive technology as FR. Police agencies should demonstrate the pressing and substantial nature of the specific objective through evidence. Further, the personal information collected should not be overbroad; it should be tailored and necessary to achieve the specific goal.
  4. Effectiveness: Police must be able to demonstrate that collection of personal information actually serves the purpose of the objective. Police should provide evidence that demonstrates why the specific use of FR being proposed will likely be effective in meeting specific program objectives. This demonstration of effectiveness should take into account any known accuracy issues associated with the specific use.
  5. Minimal Impairment: Police agencies’ intrusion on individuals’ privacy must not extend beyond what is reasonably necessary to achieve the state’s legitimate objective.Footnote 31 The scope of a program should be as narrow as possible. If using FR, police agencies should be able to demonstrate that there are no other less privacy invasive means that will reasonably achieve the objective,Footnote 32 and be able to show evidence as to why less privacy invasive measures are not used.Footnote 33
  6. Proportionality: This stage requires an assessment of whether the intrusion on privacy caused by the program is proportional to the benefit gained.Footnote 34 Police agencies should therefore seek, first, to identify how their specific use of FR will impact the privacy of individuals, having regard to general factors such as those mentioned in the introduction, but also specific impacts of their intended use of FR, for instance on certain groups. Then, police agencies should consider whether these intrusions on privacy are justified by the benefits of the specific deployment of FR. Inherent in this step is the recognition that not all objectives carry the same weight. For example, preventing a known terrorist plot would justify a greater privacy intrusion than catching someone who committed a minor act of vandalism. In assessing proportionality, police agencies must be open to the possibility that, in a free and democratic society, a proposed FR system which has a substantial impact on privacy (such as via mass surveillance) may never be proportional to the benefits gained. Where the impact is substantial, police agencies should be particularly cautious about proceeding in the absence of clear, comprehensive legal safeguards and controls capable of protecting the privacy and human rights of members of the general public. Seeking warrants and court authorizations can assist with ensuring that a proposed FR use meets the proportionality standard.

Designing for privacy

  1. It is important to design initiatives with privacy protections built in from the outset. Doing so will help to ensure that privacy protection is an essential component of any FR initiative or system. To be most effective, such protections must be incorporated during initial conception and planning, following through to execution, deployment, and beyond.Footnote 35
  2. Building privacy into the design of initiatives means police agencies must formally integrate privacy protections before engaging in any use of FR technology. Privacy protections must also be designed to protect all personal information involved in an initiative, including training data, faceprints, source images, face databases, and intelligence inferred from FR searches, in addition to any other personal information that may be collected, used, disclosed, or retained.

Privacy impact assessments

  1. A helpful way to design for privacy at the outset of an initiative is to conduct a privacy impact assessment (PIA). A PIA is a widely accepted tool for analyzing and addressing the impacts of initiatives on privacy. When used properly, PIAs help to ensure that programs and activities meet legal requirements and mitigate privacy risks.
  2. Police agencies should conduct a PIA before implementing or making substantial modifications to initiatives that involve the collection, use, or disclosure of personal information, including pilot projects. In some Canadian jurisdictions, government institutions are required to conduct PIAs by policy or legislation. Police should conduct the PIA in accordance with any applicable legislative and policy requirements in their jurisdiction.
  3. To conduct the PIA effectively, police should:
    • Document the PIA process in a PIA report
    • Follow any guidance issued by the relevant privacy commissioner on the PIA processFootnote 36
    • Mitigate risks raised in the PIA process
    • Designate an individual responsible for managing residual risks
    • Conduct a new PIA (or, if appropriate, amend the existing PIA) if major changes are made to the initiative that could impact the collection, use, disclosure, or retention of personal information
  4. To assist in understanding privacy risks, obligations, and mitigation measures relating to the proposed initiative, police agencies should consult relevant privacy experts and stakeholder groups, beginning early in the PIA process. Consultations should include:
    • The appropriate privacy commissioner’s office, in jurisdictions where these offices offer advisory services
    • The appropriate human rights commissioner’s office, given the strong connection between privacy rights and broader human rights, including the right to be free from discrimination
    • Representatives of equity-seeking groups, including local Indigenous groups and communitiesFootnote 37
    • Other relevant advisory boards and expert groups representing individuals impacted by the initiative, where available for consultation
  5. Risk assessment is an integral part of the PIA process. Police agencies should consider all relevant risks when conducting the PIA, including potential impacts on:
    • Individuals
    • Communities in which FR may be deployed
    • Groups that may be disproportionately harmed by privacy incursions
    • Public confidence in the collection and use of personal information by police agencies
    • Human and democratic rights, including rights to privacy, equality, peaceful assembly and free expression
  6. After assessing these potential impacts, police agencies should not proceed with further planning and implementation of the initiative unless they can clearly explain:Footnote 38
    1. Why the proposed use of FR is necessary to meet a specific need that is rationally connected to a pressing or substantial public goal
    2. What the expected benefits of the initiative consist of, and how they are proportionate to the risks involved
    3. Why other less intrusive measures are not sufficient
    4. How risks will be minimized during implementation of the initiative
  7. Police agencies should document their risk assessment, including explanations for the above, in the PIA report. To assist in doing so effectively, police agencies should ensure the PIA is conducted by individuals with appropriate competencies in identifying and analyzing associated risks. This may involve input from:
    • Legal counsel
    • Privacy staff
    • Program staff (those who will administer the initiative)
    • Stakeholder groups
    • Technical experts
    • Management
    • Third party service providers

Monitoring and re-assessment

  1. Privacy risk analysis is an ongoing process that does not stop with the conduct of a PIA or the deployment of an initiative. PIAs should be evergreen and can help to ensure the ongoing management of privacy risks as part of a police agency’s overall risk management strategy.
  2. Police agencies should monitor and re-assess privacy risks, and the effectiveness of privacy protections, on an ongoing basis. To help achieve this, police should implement the following best practices:
    • Annual compliance audits. Internal or external individuals with appropriate expertise should conduct compliance audits annually and should assess:
      • Ongoing compliance with legal requirements, including lawful authority of the initiative
      • Ongoing compliance with the overall privacy management program of the initiative, including policies, procedures and protocols related to accountability, transparency, data security, disclosure, retention, and purpose limitation
      • The initiative’s data holdings and retention system, to ensure personal information is retained and destroyed in accordance with the initiative’s retention procedures
      • Third party compliance with the privacy obligations of the initiative, including compliance with procurement requirements and information sharing agreements (see recommendations in the Accountability and Purpose limitation sections below)
    • Annual program review. To ensure program effectiveness, reviews should be conducted annually, and should assess the extent to which program activities are achieving the goals of the initiative. Assessment should be done using demonstrable criteria (for example, number of arrests, charges and convictions resulting from the use of FR).
  3. Police should continue to consult relevant stakeholders, including representatives of groups affected by FR use, throughout deployment. Such consultations can be a valuable source of feedback on the privacy impacts of initiatives.
  4. Police should update policies, procedures, and written agreements, as needed and as appropriate to ensure continued compliance with privacy responsibilities. For instance, agencies may need to adjust policies, procedures, and agreements in light of audits, program reviews, breaches, law reform, new guidance or technological developments. These changes should be communicated to relevant personnel and documented in the PIA.

Accuracy

  1. Police agencies must ensure that personal information collected and used as part of a FR initiative is sufficiently accurate and up to date. The accuracy of FR software cannot be taken for granted given the serious risks to individuals’ rights posed by the collection and use of inaccurate information.
  2. Fulfilling accuracy obligations within the context of a FR initiative requires consideration of the FR system as a whole. FR comprises a number of components, each of which raises unique considerations. Only when the constituent parts of a FR system process personal information accurately and fairly can the system as a whole be said to do the same.
  3. With respect to training data, one of the main considerations is the role it may play in contributing to bias in the FR system. If the training data used to generate a FR algorithm lacks sufficient representation of faces from certain demographics, the algorithm will likely produce disparate accuracy metrics across groups. It is possible for a FR algorithm to produce flawed results, particularly where it has been trained on non-representative or otherwise biased data. Studies have demonstrated considerable variation in FR algorithms with respect to the error rates they produce for faces of individuals from different racial backgrounds and across genders,Footnote 39 with other research showing that a lack of diverse and high quality training data is the main culprit.Footnote 40
  4. Regarding the FR algorithm, there are three key considerations to be aware of with respect to accuracy. The first is that accuracy is understood statistically. The output of a FR algorithm is a probabilistic inference as to the likelihood that two images are of the same person. It is not a verified fact about the individual. As such, accuracy is not a binary “true/false” measure, but rather is computed based on the observed error rates of the algorithm across searches. There are two types of errors to consider:
    1. False positives (also known as “type I” errors) where the algorithm returns a candidate match in the face database that is not of the individual in the probe image; and
    2. False negatives (also known as “type II” errors) where the algorithm fails to return a genuine match in the face database even though the database contains one.
  5. The second consideration is that there is generally a trade-off between the false positive and false negative rates of a FR algorithm. The reason for this has to do with another component: the threshold for a probable match. Depending on how high (or low) the threshold is set, a FR algorithm will generally return fewer (or more) candidate matches. However, how many results the algorithm returns has implications for its error rates. While a higher threshold will return only higher probability candidates and lead to fewer false positives, the same threshold will in general make the algorithm more likely to miss lower probability matches, potentially leading to more false negatives.
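This trade-off can be made concrete with a short sketch. The similarity scores and labels below are fabricated for illustration; a real evaluation would draw them from a labelled ground-truth test set:

```python
# Fabricated evaluation data: (similarity_score, is_same_person) pairs.
results = [(0.95, True), (0.88, True), (0.74, False), (0.70, True),
           (0.62, False), (0.55, True), (0.41, False), (0.33, False)]

def error_rates(results, threshold):
    """False positive and false negative rates at a given threshold."""
    fp = sum(1 for score, same in results if score >= threshold and not same)
    fn = sum(1 for score, same in results if score < threshold and same)
    negatives = sum(1 for _, same in results if not same)
    positives = sum(1 for _, same in results if same)
    return fp / negatives, fn / positives

# As the threshold rises, the false positive rate falls while the
# false negative rate climbs.
for t in (0.5, 0.7, 0.9):
    fpr, fnr = error_rates(results, t)
    print(f"threshold={t}: false positive rate={fpr:.2f}, "
          f"false negative rate={fnr:.2f}")
```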
  6. Lastly, it is important to consider that the determination of an appropriate threshold will depend on the nature, scope, context and purpose of the FR initiative, taking into account the risks to the rights and freedoms of individuals. Strictly speaking, there is no single appropriate threshold. It is a matter of prioritizing the reduction of certain types of errors based on the nature and severity of risks they pose to individuals, while at the same time ensuring the overall effectiveness of the FR system.
  7. The face database and probe images are two other components that raise important issues regarding accuracy and fairness. One consideration is the quality and/or age of the images and the effects these may have on the accuracy of the FR system. For example, studies have shown that lower quality images lead to declines in accuracy, and that longer intervals between images of the same individual increase false negative rates.Footnote 41 However, it is also important to consider the demographics of FR images, in particular, who is in the face database and whether the disproportionate representation of certain groups may lead to adverse effects. A FR system may be susceptible to a “feedback loop” where the makeup of individuals in a face database leads police to repeatedly cast suspicion on them, their associates or their community, thereby increasing the disproportionality of their demographic representation over time.
  8. A final component to mention is human review. While human review serves as an important mitigation measure to reduce the risks of inaccuracy and bias, it may also inadvertently reintroduce those very same risks into the FR system. Unlike computers, humans can be overwhelmed with too much information. Effective human review requires training in forensic face comparison as well as the functioning of the FR system, including explanations for system outputs. In addition, reviewers must be granted a sufficient amount of time proportionate to the number of candidate matches they are expected to adjudicate. However, even with sufficient training and time, human review may be unduly influenced by the level of statistical precision in the FR system. It is important to avoid “automation bias” or the tendency to over-rely on automated systems when making human decisions. Just because the FR system uses mathematical calculations does not necessarily mean that its predictions are accurate or fair.
  9. Given the above, it is imperative that police agencies take steps to minimize inaccuracy and bias in any deployment of FR technology. These steps should include the following best practices:
  10. Police agencies should require FR vendors to:
    • Make their FR algorithms available for independent external testingFootnote 42
      • The testing should include an evaluation of the accuracy of the algorithm as well as its performance across distinct socio-demographic populations (for example, racial, gender and age groups)
    • Indicate, in the results of each FR search, the similarity score; that is, an estimate of the probability that a given match is accurate (for example, as a percentage)
  11. Police agencies should also:
    • Determine an appropriate threshold so as to prioritize reducing certain types of error based on the nature and severity of risks, while ensuring the overall effectiveness of the FR system
    • Internally test for bias and inaccuracy in the performance of the FR system as a whole before deployment, and routinely during deployment (an illustrative sketch of such a test appears after this list)
    • Ensure testing is carried out by individuals or organizations qualified to independently assess the performance of FR systems
    • Ensure testing follows recognized standards and technical specifications, including for performance metrics, recording of test data, reporting of test results, test protocols and methodologies as well as demographic variationsFootnote 43
    • Include the similarity score, as indicated in FR search results, when recording or disclosing information about a match
    • Use only high-quality, timely images in the face database and as probe images
    • Consider using an image quality assessment algorithm to assist in determining the level of quality of images inputted into the FR systemFootnote 44
    • Keep their FR systems up to date as the technology for FR algorithms improves
    • Discontinue FR use if internal or external testing reveals:
      • insufficient statistical accuracy in the FR system, or
      • meaningful variation in error rates across socio-demographic populations
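As an illustration of the kind of internal testing contemplated in the list above, error rates can be computed separately for each socio-demographic group and compared. The groups, scores, threshold and disparity tolerance below are all invented for the example:

```python
# Fabricated per-group evaluation data: (group, score, is_same_person).
results = [("group_1", 0.92, True), ("group_1", 0.48, False),
           ("group_1", 0.81, True), ("group_1", 0.35, False),
           ("group_2", 0.69, True), ("group_2", 0.77, False),
           ("group_2", 0.58, True), ("group_2", 0.44, False)]

THRESHOLD = 0.7
MAX_DISPARITY = 0.1  # illustrative tolerance for cross-group variation

def group_error_rates(group):
    """False positive and false negative rates for one group."""
    rows = [(s, same) for g, s, same in results if g == group]
    fp = sum(1 for s, same in rows if s >= THRESHOLD and not same)
    fn = sum(1 for s, same in rows if s < THRESHOLD and same)
    negatives = sum(1 for _, same in rows if not same)
    positives = sum(1 for _, same in rows if same)
    return fp / negatives, fn / positives

rates = {g: group_error_rates(g) for g in ("group_1", "group_2")}
fnr_values = [fnr for _, fnr in rates.values()]
if max(fnr_values) - min(fnr_values) > MAX_DISPARITY:
    print("Meaningful variation in false negative rates across groups;",
          "per this guidance, FR use should be discontinued pending review.")
```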
  12. Police agencies should not:
    • Fully automate administrative decisions based on the results of FR matching procedures
      • In other words, decisions that affect legal rights, privileges, or interests should be made by humans, including, for example, decisions to detain, investigate or charge individuals in relation to a crime
    • Act on a FR match unless that match has been reviewed within a suitable timeframe by an officer trained in FR identification technology, procedures, outputs and limitations
      • In addition, there should be a separation of roles between the reviewing officer and the requesting investigator or field agent to protect against confirmation bias

Accountability

  1. Police agencies are responsible for personal information under their control and should be able to demonstrate their compliance with legal requirements.

Privacy management program

  1. An accountable police agency should have a privacy management program (PMP) in place, with clear structures, policies, systems and procedures to distribute privacy responsibilities, coordinate privacy work, manage privacy risks and ensure compliance with privacy laws.
  2. Essential components of a PMP include:
    • Policies and procedures specifying the circumstances and methods of collecting, using, retaining, and disclosing personal information in relation to the initiative
    • Protocols specifying the circumstances under which system users are authorized to perform a FR search
    • A standard operating procedure for use of the FR system, including instructions on how to perform a FR search
    • A system for documenting FR searches, including justification for initiating the search
    • A system for authorizing use of, and access to, the FR system
    • Protocols for detecting and redressing non-compliance with the policies and procedures of the initiative, including annual compliance audits (see Monitoring and re-assessment section above), and other mechanisms to hold individuals meaningfully accountable for misuse of the FR system
    • Recurrent privacy training and education on program policies, procedures, protocols and controls, to be revisited and updated when necessary (see Training, below)
  3. The PMP should also establish a clear reporting structure for the initiative that designates an individual responsible for managing the privacy program and overseeing compliance with privacy obligations, including access requests. Further, it should include a mechanism for officers and other personnel to inform senior management about any incidents or issues that arise, or any new investigative tools that may involve the collection or use of biometric information. While management approval should be required before using new investigative tools, approval does not replace applicable privacy requirements, including the conduct of a PIA.
  4. Police agencies should also implement a system for logging all uses of FR software, including all searches and the credentials of individuals who performed each search. The logging procedure should be automated, beyond the control of individuals who access and use the system to perform FR searches, and included in annual audits.
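A minimal sketch of such an automated search log follows. The record fields and the hash-chaining scheme are hypothetical, offered only to illustrate the principle that entries should capture the searcher’s credentials and justification and be tamper-evident to system users:

```python
import hashlib
import json
from datetime import datetime, timezone

class SearchAuditLog:
    """Append-only log of FR searches. Illustrative only: the field
    names and chaining scheme are hypothetical, not a standard."""

    def __init__(self, path: str):
        self.path = path

    def record_search(self, operator_id: str, case_id: str,
                      justification: str, probe_image_ref: str) -> None:
        """Write one tamper-evident entry per FR search performed."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "operator_id": operator_id,      # credentials of the searcher
            "case_id": case_id,              # supporting documentation
            "justification": justification,  # reason for initiating search
            "probe_image_ref": probe_image_ref,
            "prev_hash": self._last_entry_hash(),  # chains entries together
        }
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry, sort_keys=True) + "\n")

    def _last_entry_hash(self) -> str:
        """Hash of the most recent entry, so that deleting or editing an
        earlier entry is detectable during annual compliance audits."""
        try:
            with open(self.path, "rb") as f:
                lines = f.read().splitlines()
        except FileNotFoundError:
            return ""
        return hashlib.sha256(lines[-1]).hexdigest() if lines else ""
```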
  5. Similarly, police agencies should ensure that all disclosures of personal information are logged as well. These logs should include a record of the authority under which the information was disclosed (including reference to the governing contract or information sharing agreement (ISA)), to whom the information was disclosed, the means of disclosure, any conditions attached to the disclosure, and the identity of the program operator who authorized the disclosure. Police agencies should make access and disclosure logs available to oversight bodies for inspection on request.
  6. Police agencies should also implement any additional administrative, technological, and physical controls necessary to ensure that individuals can access and use the FR system only as necessary to fulfill the objectives of the initiative. These controls should be sufficient to prevent use of FR software, or other novel biometric technologies, outside of a lawful initiative that is approved and overseen by senior management.

Procurement

  1. Police agencies are responsible for ensuring third parties involved in policing initiatives comply with the privacy obligations of the initiative.
  2. Police agencies should use the procurement process to help ensure third party compliance by establishing contractual requirements for vendors of FR software and services (see Purpose limitation section below). Appropriate procurement requirements may vary by initiative, but in general they should include, at a minimum:
    • Demonstrable compliance of the vendor with Canadian privacy law, including collection and use of personal information for training purposes
    • Transparency concerning the source of the vendor’s training data
    • Requirements to ensure training data is demographically representative
    • Performance requirements assessed through independent testing (for example, percentile ranking of the vendor in NIST testing)
    • Performance requirements to be assessed through internal testing by police agencies
    • Prohibitions on retention, disclosure, and secondary use of personal information supplied to the vendor by police
    • Ability to audit vendors for compliance with contractual obligations
  3. Police agencies can also use the procurement process to help ensure compliance with internal policies and procedures. For example, police agencies can require prospective vendors to supply products that:
    • Include strong authentication procedures
    • Automatically log uses of FR software
    • Require supervisory sign-off to initiate a search
    • Require supporting documentation (for example, case ID) to initiate a search

Training

  1. Police agencies should establish a specialized training program for initiatives that involve FR technology. They should ensure individuals complete the program successfully before gaining access to the FR system. Doing so can help to ensure the agency adheres to the initiative’s policies and procedures for collecting, storing, using and disclosing personal information.
  2. Training for individuals who are authorized to use FR technology should ensure those individuals have a comprehensive understanding of the following:
    • Policies and procedures of the initiative relating to access, use, disclosure, and retention of personal information and FR systems
    • Legal responsibilities for handling personal information in the context of the initiative
    • Procedures for human review and forensic face comparison techniques
    • The functioning of FR technology, generally and as applied to the initiative, including system outputs
    • The potential for bias in FR matching procedures on the basis of race, gender and other relevant demographic characteristics
    • The potential for errors due to low quality probe images or outdated images in the face database
  3. Police agencies should update training materials as needed to ensure FR system users always have sufficient knowledge, competencies and experience to fulfill their duties in compliance with legal requirements.

Data minimization

  1. Police agencies should limit the collection of personal information to that which is directly relevant and necessary for the specific objectives of a FR initiative.
  2. Police agencies should implement protocols specifying the circumstances under which faceprint and related data may be collected, used, retained, and disclosed. These protocols should include a set of clear, specific, and objective criteria used to determine which images may be included in a face database. The criteria should be established at the outset of the initiative, and should be sufficiently limiting to ensure the initiative meets the principles of necessity and proportionality.
  3. Police agencies should also take any additional steps needed to minimize the collection and use of personal information that is not necessary for the initiative. For instance, when performing a FR search, police agencies should ensure that images are cropped such that only the specific individual of interest is searched. Police should avoid cross-linking personal information between the FR initiative and other databases, except as necessary to lawfully fulfill the purposes of the initiative. To the greatest extent possible, information collected or used in an FR initiative should be stored separately from other personal information, and isolated from other networks.

Purpose limitation

  1. Police agencies must ensure that personal information is only used for the purpose for which it was collected, or a purpose consistent with that purpose. They must also ensure that each use of personal information falls within the scope of the initiative’s lawful authority.
  2. To assist in meeting these requirements, police agencies should implement a comprehensive set of administrative, technical, and physical controls for managing access to, and use of, data and software used in FR initiatives (see Accountability section for further details).
  3. Police agencies must also ensure that third parties engaged on their behalf do not use personal information transferred to them during an initiative for purposes other than those for which the information was originally transferred. For example, if a police agency transfers an image to a third party FR software vendor for identification purposes, the police agency must ensure the vendor does not use the image (or associated faceprint data) as training data, or enter that data into a face database for comparison in future searches. Training of the algorithm should be conducted by the vendor on their own lawfully obtained dataset(s), which should not include information shared by police.
  4. To help meet this requirement, police agencies should use contracts to codify the terms and conditions under which personal information is disclosed to FR software vendors or other third parties as part of an initiative, as well as other privacy protections.Footnote 45 This includes any disclosures of images, databases, results of FR searches, or other personal information.
  5. When sharing personal information with other law enforcement or public bodies, police agencies should use written information sharing agreements (ISAs) codifying the above terms and conditions, and other privacy protections, as indicated in paragraph 111.
  6. Subject to specific legal requirements in each jurisdiction, ISAs should specify, at a minimum (an illustrative sketch follows this list):
    • The lawful authority under which information is to be disclosed
    • The specific elements of personal information being disclosed
    • The specific purposes for the disclosure
    • Limits on use and onward transfer
    • Specific safeguarding measures
    • Data localization requirements, if any
    • Data breach procedures
    • Retention periods and data destruction requirements
    • Accountability measures, including compliance monitoring
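
As an illustration only, the minimum ISA elements above could be captured in a structured record so that agreements can be checked for completeness before they are executed. The sketch below uses Python; all field names are hypothetical, and agencies' actual ISA templates will reflect the legal requirements of their jurisdiction.

    # Minimal sketch: represent the minimum ISA elements as a structured
    # record and flag any element left unspecified before signing.
    from dataclasses import dataclass, fields

    @dataclass
    class InformationSharingAgreement:
        lawful_authority: str = ""
        data_elements_disclosed: str = ""
        purposes_for_disclosure: str = ""
        use_and_onward_transfer_limits: str = ""
        safeguarding_measures: str = ""
        data_localization_requirements: str = ""  # may read "none" where no requirement applies
        breach_procedures: str = ""
        retention_and_destruction: str = ""
        accountability_measures: str = ""

    def missing_isa_elements(isa: InformationSharingAgreement) -> list:
        """Return the names of any minimum elements that are still blank."""
        return [f.name for f in fields(isa) if not getattr(isa, f.name)]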

Retention

  1. Government institutions, including police agencies, should not retain personal information for longer than necessary to fulfill the purposes of an initiative. In the context of policing, these purposes may include retaining information as evidence in court or administrative proceedings.
  2. To help meet these requirements, police agencies should implement retention protocols that account for considerations applicable to different components of the FR system.

Face databases

  1. As noted under Data minimization, police agencies should use a set of clear, specific, and objective criteria to determine which images may be included in a face database. These criteria should be established at the outset of the initiative, and should be sufficiently limiting to ensure the initiative meets the principles of necessity and proportionality.
  2. Images should be removed from the face database and destroyed immediately if, at any time, they do not meet all criteria for inclusion. For example, if an individual’s image is retained in a face database on the basis of a criminal charge or criminal conviction, then the image should be removed from the face database and destroyed upon acquittal or the receipt of another non-conviction disposition, or the granting of a pardon. If a face database contains images that are lawfully retained as part of a separate initiative, those images may continue to be retained, for use as part of that separate initiative only, in accordance with the retention schedule applicable to that initiative. In such cases, images should be rendered inaccessible to the FR system once they no longer meet criteria for inclusion in the FR initiative.
  3. In general, the process for removing, destroying, and rendering images inaccessible to FR systems should not require any action on the part of individuals whose images are retained. Images in a face database that register a match during a lawful FR search may be retained separately from the face database, in accordance with any applicable requirements to retain evidence in policing initiatives.
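
To illustrate how such a retention protocol might operate in practice, the following minimal sketch (Python) performs a periodic sweep of a face database. All names (records, meets_criteria, destroy_image, detach_from_fr) are hypothetical hooks, not a prescribed design; the point is that removal is automatic and requires no action by the individual.

    # Minimal sketch: periodic sweep that removes images no longer meeting
    # the inclusion criteria, without requiring action by the individual.
    def sweep_face_database(records, meets_criteria, destroy_image, detach_from_fr):
        for record in list(records):
            if meets_criteria(record):
                continue  # still eligible for inclusion
            if record.get("lawfully_retained_elsewhere"):
                # Retained for the separate initiative only; rendered
                # inaccessible to (not searchable by) the FR system.
                detach_from_fr(record)
            else:
                destroy_image(record)  # removed and destroyed immediately
            records.remove(record)  # no longer part of the face database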

Probe images

  1. Police should ensure that the FR system does not automatically save, store, or otherwise retain probe images after a FR search is conducted.
  2. In general, probe images stored outside the FR system should be destroyed if they do not register a match when searched against a face database. Probe images that consist of evidence collected lawfully in the course of a policing initiative may be retained, as needed and separately from the FR system, in accordance with retention requirements applicable to that evidence. However, images of innocent bystanders, or other individuals who are not involved in an investigation, should not be retained.
  3. Probe images that register a match when searched against a FR system may be retained in accordance with any reasonable retention requirements applicable in the circumstances, for example as evidence in a criminal proceeding or as information used for an administrative purpose.Footnote 46
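
The following minimal sketch (Python) illustrates one way to structure a search so that probe images are never automatically retained. Here, search_fn and evidence_store are hypothetical interfaces; the sketch simply shows the intended data flow, not an actual FR system.

    # Minimal sketch: the probe image is held in memory for the comparison only.
    def run_fr_search(probe_bytes, search_fn, evidence_store):
        matches = search_fn(probe_bytes)  # comparison only; nothing is written by default
        if matches:
            # Retention occurs only where a match justifies it, under the
            # evidence retention rules applicable to the initiative.
            evidence_store.save(probe_bytes, matches)
        # On no match, probe_bytes simply goes out of scope; it is not
        # saved, stored, or otherwise retained by the system.
        return matches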

Testing

  1. It may sometimes be necessary to retain probe images for longer than would otherwise be appropriate, in order to conduct internal testing of the FR system’s performance. Any retention of probe images for testing purposes should be limited to that which is strictly necessary to meet accuracy or other performance requirements for the specific initiative in which FR is being deployed. Images retained for testing purposes should be destroyed immediately once relevant testing procedures are complete.

Data security

  1. Personal information must be protected by appropriate security measures relative to the sensitivity of the information.
  2. Given the high sensitivity of facial data, police agencies are expected to implement highly protective security measures as part of FR initiatives. At a minimum, these should include the following measures, although they may not be sufficient in all cases (a simplified sketch follows at the end of this section):
    • Use encryption and other digital safeguarding tools to secure data while in storage and while in transit between databases, servers and end-user devices
    • Ensure records and equipment, including memory disks, servers and end-user devices, are used and stored only in secure physical locations
    • Log all access to, and use of, FR software and databases
    • Review and update security measures regularly to address evolving security threats and vulnerabilities
    • Use contracts and ISAs to help ensure any third parties involved in the initiative follow relevant best practices for data security
  3. Data security requirements may also require police to ensure that personal information they collect or create during a FR initiative resides within the geographic boundaries of Canada. In some Canadian jurisdictions, police agencies may be explicitly required to do so by law or policy instruments.Footnote 47 In others, they must ensure that personal information released outside of their jurisdiction will receive equivalent protection.Footnote 48
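
As a simplified illustration of the first and third measures above (encryption of stored data and logging of access), the sketch below uses Python with the widely available cryptography package. Key handling is reduced to a single line for brevity only; in practice keys belong in a managed key store, transport encryption (e.g., TLS) is handled separately, and all function and file names here are hypothetical.

    # Minimal sketch: encrypt faceprint data at rest and log each access.
    import logging
    from cryptography.fernet import Fernet  # assumes the `cryptography` package

    logging.basicConfig(filename="fr_access.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")

    key = Fernet.generate_key()  # illustration only; use a managed key store
    cipher = Fernet(key)

    def store_faceprint(user: str, record_id: str, faceprint: bytes) -> bytes:
        logging.info("user=%s action=store record=%s", user, record_id)
        return cipher.encrypt(faceprint)  # encrypted before it reaches storage

    def read_faceprint(user: str, record_id: str, token: bytes) -> bytes:
        logging.info("user=%s action=read record=%s", user, record_id)
        return cipher.decrypt(token)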

Openness, transparency and individual access

  1. Wherever possible, individuals and the public must be informed of the purpose of the collection of their personal information, including how the information may be used or disclosed.
  2. In the context of policing initiatives, however, it may not always be possible to provide individuals with access to complete information about the collection of their personal information, for example when individuals are the subject of an ongoing investigation.
  3. In general, police agencies should provide notice to individuals if their image is entered into a face database, unless doing so would defeat the purpose or prejudice the use for which the image was collected. The notice should indicate:
    • The criteria used to justify inclusion of the image in the face database
    • The conditions under which the individual’s image will be removed from the database
    • The personal information bank (PIB) applicable to the collection
  4. Similarly, police agencies should post public notice of FR use if FR technology is used in public spaces, unless doing so would defeat the purpose or prejudice the use for which facial images are collected. Any such notice should:
    • Be prominently visible within the affected space
    • Clearly indicate the use of FR in the affected space
    • Contain links or references to further information on the initiative (for example, a QR code or web link to the PIA summary)
  5. While providing individuals with direct notice will not always be possible, it is important for police agencies to implement transparency measures at the program level for FR initiatives. Doing so can help to inform the public about FR initiatives, and can increase public confidence that such initiatives are being implemented responsibly. The following should be made public via the agency’s public website, in addition to any other appropriate communications channel(s):
    • A formal policy on the agency’s use of FR that sets out the circumstances in which the agency will and will not engage in FR use and how personal information will be handled
    • A plain-language explanation of the initiative, including its objectives, sources of facial images, choice of threshold value, expected duration, and general methods of operation
    • A link to the PIA summary
    • Results of any testing for accuracy or bias performed in relation to the initiative, including a description of the testing methodology
    • Details relating to the procurement of the FR system, including information on vendors
    • Information about any ISAs in place
    • Results or summaries of stakeholder consultations
    • Annual program summaries (see the illustrative sketch at the end of this section), indicating:
      • The number of searches performed over the past year
      • Metrics concerning the effectiveness of the initiative (for example, use in making arrests or securing convictions)
      • Statistics on the size and demographic makeup of the face database
      • Any significant changes to the initiative
  6. Individuals also generally have a right of access to and correction of their personal information, and may have a right to removal in some circumstances. Police agencies should ensure a system is in place to fulfill such requests from individuals. The system should ensure access requests are responded to within applicable legal timelines.
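
As an illustration of the annual program summary items above, the following minimal sketch (Python) compiles the suggested figures from a hypothetical search log and set of database records; all field names are assumptions for this sketch.

    # Minimal sketch: compile annual program summary figures from logs.
    from collections import Counter

    def annual_summary(search_log, database_records):
        """search_log and database_records are hypothetical lists of dicts."""
        return {
            "searches_performed": len(search_log),
            "searches_contributing_to_arrest": sum(
                1 for entry in search_log if entry.get("led_to_arrest")),
            "face_database_size": len(database_records),
            "demographic_makeup": dict(Counter(
                record.get("demographic_category", "unreported")
                for record in database_records)),
        }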

Conclusion

  1. Police agencies are responsible for assessing the risks associated with any contemplated use of FR and for mitigating potential harms by building appropriate privacy protections into the design of proposed initiatives. If police agencies proceed with FR use, they must follow through on privacy protection over the lifespan of the initiative. While specific legal requirements vary by jurisdiction, the recommendations in this guidance document can help to ensure proposed uses of FR meet legal requirements, minimize privacy risks, and respect Canadians’ fundamental right to privacy.

Summary of Recommendations

Below is an abbreviated summary of key recommendations made in this guidance document. The summary is compiled for reference purposes only; please consult the guidance document for full recommendations.

When proposing, developing, and implementing initiatives involving the use of FR technology, we recommend police agencies:Footnote 49

  • Ensure lawful authority exists for each collection, use, retention, and disclosure of personal information.
    • Lawful authority must exist for all steps of FR use, including training a FR algorithm, creating a face database, and collecting probe images.
    • Ensure any third parties involved in the collection or use of personal information operate with lawful authority.
  • Design for privacy
    • Integrate privacy protections into proposed initiatives before using FR technology.
    • Conduct a PIA to help ensure FR programs meet legal requirements and to identify and mitigate privacy risks.
    • Monitor and re-assess privacy risks and the effectiveness of privacy protections on an ongoing basis.
  • Ensure personal information is accurate and up to date.
    • Test and configure data and systems as appropriate to identify and reduce inaccuracy and bias.
    • Keep a “human in the loop” to review FR matches.
  • Minimize collection of personal information to that which is directly relevant and necessary based on the objectives of the initiative.
  • Ensure personal information is used only for the purpose for which it was collected, or a purpose consistent with that purpose.
    • Implement administrative, technical, and physical controls for managing access to, and use of, FR data and software.
    • Use contracts and information sharing agreements to limit use of personal information disclosed to third parties.
  • Protect personal information using security measures that are appropriate given the sensitivity of the information, and use contracts and ISAs to ensure third parties are required to do the same.
  • Retain personal information for no longer than is necessary to fulfill the purposes of the initiative.
    • Implement retention protocols that account for considerations applicable to different components of the FR system, including face databases, probe images, and training data.
    • Ensure images are removed, destroyed, and/or rendered inaccessible to the FR system in accordance with retention protocols, including upon acquittal or the receipt of another non-conviction disposition, or the granting of a pardon.
    • Do not retain images of innocent bystanders or other individuals who are not involved in an investigation.
  • Implement openness and transparency measures as appropriate to allow individuals and the public to be informed about the initiative.
    • Implement transparency measures at the program level to help increase public confidence that the initiative is being implemented responsibly.
    • Implement policies and procedures to accommodate access requests whenever possible.
  • Implement effective accountability measures.
    • Implement a privacy management program with clear structures, policies, systems and procedures to distribute privacy responsibilities, coordinate privacy work, manage privacy risks and ensure compliance with privacy laws.
    • Use the procurement process to establish contractual requirements for vendors of FR software and services.
    • Log all uses of FR, including all disclosures of personal information outside the organization.
    • Ensure all personnel with access to FR systems are appropriately trained.

This guidance document was prepared jointly by:

Office of the Privacy Commissioner of Canada
Office of the Information and Privacy Commissioner of Ontario
Commission d’accès à l’information du Québec
Office of the Information and Privacy Commissioner for Nova Scotia
New Brunswick Ombud’s Office
Office of the Manitoba Ombudsman
Office of the Information and Privacy Commissioner for British Columbia
Office of the Information and Privacy Commissioner for Prince Edward Island
Office of the Saskatchewan Information and Privacy Commissioner
Office of the Information and Privacy Commissioner of Alberta
Office of the Information and Privacy Commissioner of Newfoundland and Labrador
Office of the Information and Privacy Commissioner of the Northwest Territories
Office of the Yukon Information and Privacy Commissioner
Office of the Information and Privacy Commissioner of Nunavut
