Draft privacy guidance on facial recognition for police agencies


  1. Facial recognition (FR) has emerged as a powerful technology that can pose serious risks to privacy. The purpose of this guidance is to clarify police agencies’ privacy obligations relating to the use of this technology, with a view to ensuring any use of FR complies with the law, minimizes privacy risks, and respects privacy rights.
  2. This guidance is issued jointly by the privacy protection authorities for each province and territory of Canada, and the Office of the Privacy Commissioner of Canada.


  1. This guidance is for federal, provincial, regional and municipal police agencies. It was not written for other public organizations outside of the police that are involved in law enforcement activities (for example, border control), nor for private-sector organizations that carry out similar activities (for example, private security). However, these organizations must still ensure their compliance with all applicable laws, including privacy and human rights laws. Sections of this guidance may be helpful for that purpose.


  1. FR technology has emerged as a tool of significant interest to law enforcement. Used responsibly and in the right circumstances, FR may assist police agencies in carrying out a variety of public safety initiatives, including investigations into criminal wrongdoing and the search for missing persons.
  2. At the same time, FR has the potential to be a highly invasive surveillance technology.
  3. The use of FR involves the collection and processing of sensitive personal information: biometric facial data is unique to each individual, unlikely to vary significantly over time, and difficult to change in its underlying features. This information speaks to the very core of individual identity, and its collection and use by police supports the ability to identify and potentially surveil individuals.
  4. FR technology also scales easily, costs relatively little to use, and can be deployed as an add-on to existing surveillance infrastructure. This includes the capacity to automate extraction of identifying information from a wide range of sources, including virtually any source of digital imagery, both online and off.
  5. The prospect of police agencies integrating FR technology into law enforcement initiatives thus raises the possibility of serious privacy harms unless appropriate privacy protections are put in place.
  6. The freedom to live and develop free from surveillance is a fundamental human right. In Canada, public sector statutory rights to privacy are recognized as quasi-constitutional in nature, and aspects of the right to privacy are protected by the Canadian Charter of Rights and Freedoms (the Charter). These rights dictate that individuals must be able to navigate public, semi-public, and private spaces without the risk of their activities being routinely identified, tracked and monitored. While certain intrusions on this right can be justified in specific circumstances, individuals do not forgo their right to privacy merely by participating in the world in ways that may reveal their face to others, or that may enable their image to be captured on camera.
  7. Privacy is also necessary for the realization of other fundamental rights that are protected by the Charter. Privacy is vital to dignity, autonomy, and personal growth, and it is a basic prerequisite to the free and open participation of individuals in democratic life. When surveillance increases, individuals can be deterred from exercising these rights and freedoms.
  8. Surveillance is also linked with systemic discrimination, including discrimination experienced by racialized communities. Longstanding concerns about the disproportionate policing of racialized communities raise serious questions about the privacy and human rights impact of applying FR technology to, for example, historical datasets such as mugshot databases. When considering the impact of FR technology on individual privacy, then, law enforcement agencies must also account for and respect the right to the equal protection and equal benefit of the law without discrimination.
  9. If used inappropriately, FR technology may therefore have lasting and severe effects on privacy and other fundamental rights. This includes not only harms to specific individuals whose personal information may be collected, processed, or disclosed, but also more general societal harms that flow from increases in the capacity of authorities to monitor the physical and virtual spaces in which we interact. Such increases can be difficult to contain once they are set in motion.
  10. The nature of these risks calls for collective reflection on the limits of acceptable FR use. These limits are defined not only by the risks associated with specific FR initiatives, but also by the aggregate effects of all such initiatives, taken together over time, on the general surveillance of public and private space. The question of where acceptable FR use begins and ends is in part, then, a question of the expectations we set now for the future protection of privacy in the face of ever-increasing technological capabilities to intrude on Canadians’ reasonable expectations of privacy.
  11. The process of establishing appropriate limits on FR use remains incomplete. Unlike other forms of biometrics collected by law enforcement such as photographs, fingerprints or DNA profiles, FR use is not subject to a clear and comprehensive set of rules. Instead, its use is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by FR. This creates room for uncertainty concerning what uses of FR may be acceptable, and under what circumstances.
  12. It is in this context that our offices issue the present guidance document. The guidance is meant to clarify legal responsibilities, as they currently stand, with a view to ensuring any use of FR by police agencies complies with the law, minimizes privacy risks and respects the fundamental human right to privacy. This guidance should not be read as justifying, endorsing or approving the use of FR by police agencies. Nor does it replace the broader need for a more robust regulatory framework for FR.
  13. While this document addresses many legal requirements that pertain to FR use, it does not necessarily address all such requirements. Police agencies remain responsible for ensuring that any use of FR complies with all applicable legal requirements.

FR technology

  1. FR technology is a type of software that uses complex image processing techniques to detect and analyze the biometric features of an individual’s face for the purposes of identification or verification (also known as “authentication”) of an individual’s identity. While early versions relied on humans to manually select and measure the landmarks of an individual’s face, today the process of creating a facial template or “faceprint” is fully automated. Using advanced, “deep learning” algorithms trained on millions of examples, FR technology is able to create three-dimensional faceprints consisting of close to a hundred biometric features from two-dimensional images.

How is FR used?

  1. Identification and verification have specific meanings within the context of FR. Identification is used in the investigative sense of determining the identity of an otherwise unknown individual. Here, FR compares the image inputted into the system (also known as the “probe” image) against all other images in a database of pre-enrolled faces in an attempt to learn the individual’s identity. This is sometimes referred to as “1:N” matching.
  2. Verification is a special case of identification. It is used primarily for security in cases where an identity is already attached to the probe image. Rather than multiple images, FR compares the probe image to the one image in the database corresponding to the identity claim. If they match, the individual’s identity is proven to a higher level of assurance. In contrast to identification, verification is sometimes referred to as “1:1” matching.
  3. This guidance is focussed primarily on FR use for the purposes of identification. While verification is a common use of FR in general (e.g., to unlock one’s phone), the mandate of law enforcement organizations aligns more closely with identification.
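The contrast between the two matching modes can be illustrated with a short, hypothetical sketch. This is not any vendor’s API: the faceprints, names and the `compare` function below are invented stand-ins for the biometric comparison described in the next section.

```python
# Illustrative contrast between 1:N identification and 1:1 verification.
# Faceprints are toy tuples of numbers; real systems use high-dimensional
# biometric vectors and carefully tuned comparison algorithms.

def compare(fp_a, fp_b):
    """Toy similarity score: the fraction of features that match exactly."""
    matches = sum(1 for a, b in zip(fp_a, fp_b) if a == b)
    return matches / len(fp_a)

def identify(probe, enrolled, threshold):
    """1:N matching: search the whole database for candidate identities."""
    return [name for name, fp in enrolled.items()
            if compare(probe, fp) >= threshold]

def verify(probe, enrolled, claimed_name, threshold):
    """1:1 matching: compare only against the claimed identity's record."""
    return compare(probe, enrolled[claimed_name]) >= threshold
```

The key structural difference is visible in the sketch: identification searches every enrolled record, which is why the composition of the face database and the choice of threshold matter so much, while verification consults only the single record attached to the identity claim.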

How does FR work?

  1. FR has a number of components that each play a role in determining how it functions in a particular set of circumstances. Depending on the FR product in use, some components may be configurable by the end user. However, in cases where FR is purchased from a vendor rather than built in-house, the functionality of some components will be hard-coded into the product itself and can only be changed through switching products or by receiving an updated version.
  2. The following list provides a brief description of the key components police agencies should be aware of when deploying FR in a law enforcement context.
  3. Training data. The image processing algorithms that power FR are generated using machine learning methods that take labelled examples of individuals’ faces as their input. This input is known as the training data of the algorithm. By tuning the parameters of a model to best fit this data, FR is able to “learn” to detect the distinguishable features of human faces, without the need for explicit programming by its developers.
  4. Algorithms. FR works by performing a series of discrete tasks. There are four key ones to be aware of. Each is automated using an algorithm. However, taken together, they form one overarching algorithm for the system. Their work may be described as follows:
    • A face detector scans an image and picks out the faces in it
    • A faceprint generator takes an image of a face and creates a faceprint of it
    • A faceprint comparator compares two faceprints and returns a similarity score, and
    • A faceprint matcher searches a database of faces and (using the faceprint comparator) returns a list of candidates whose similarity score is at or above a given threshold
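Taken together, the four components form a pipeline. The following sketch wires them together in simplified form; the function bodies are illustrative stand-ins (an “image” here is just a list of toy feature vectors), not real computer-vision code.

```python
# Schematic wiring of the four FR components described above. All data
# structures are simplified stand-ins: an "image" is a list of toy face
# feature vectors, whereas a real system would operate on pixel data.

def face_detector(image):
    # Scan an image and pick out the faces in it (here: pass through).
    return image

def faceprint_generator(face):
    # Measure the face's features and encode them as a faceprint vector.
    return tuple(face)

def faceprint_comparator(fp1, fp2):
    # Return a similarity score; 1.0 means identical faceprints.
    distance = sum((a - b) ** 2 for a, b in zip(fp1, fp2)) ** 0.5
    return 1.0 / (1.0 + distance)

def faceprint_matcher(probe_fp, database, threshold):
    # Search the database and return candidates whose similarity score
    # is at or above the threshold, highest score first.
    scored = [(name, faceprint_comparator(probe_fp, fp))
              for name, fp in database.items()]
    return sorted((s for s in scored if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

def search(image, database, threshold):
    # End-to-end: detect faces, generate faceprints, match each one.
    return [faceprint_matcher(faceprint_generator(face), database, threshold)
            for face in face_detector(image)]
```

The sketch also shows why configurability matters: changes to any one stage (for example, the comparator or the threshold passed to the matcher) change the candidate lists the overall system produces.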
  5. Face database. To identify or verify the identity of an individual, FR must have access to a database of identified faces against which to match an image of the individual in question. Usually, the face database in a FR initiative is provided by the end user. In the law enforcement context, examples may include a mugshot database or missing persons database. However, some FR vendors have attempted to compile their own databases, typically of images taken from the Internet, and include use of them as part of their product, the legal basis for which is far less clear.Footnote 1
  6. Faceprint. After detecting the various features of an individual’s face, FR measures them and encodes the result in a vector of numerical values called a faceprint. A faceprint is a biometric, similar to a fingerprint—a set of unique physical characteristics inherent to an individual that cannot be easily altered. Examples of biometric features encoded in a faceprint may include:
    • distance between eyes
    • width of nose
    • distance between nose and lips
    • depth of eye sockets
    • shape of cheekbones
    • length of jaw line
  7. Similarity score. Faces exhibit a wide range of variability. Some pairs of faces share virtually no similarities; others are similar, or even identical, in some respects but not in others. Even the same face may look different depending on the circumstances, such as the level of light, the angle of orientation or the amount of time that has passed between images. To express the different ways faces may be similar or different, FR calculates a “similarity score,” also sometimes referred to as a “confidence score.” This is a numerical value representing the degree of similarity between two faceprints based on the biometric features encoded in them. A lower value indicates less similarity; a higher value, more.
  8. Threshold. Even though two faceprints may have a positive similarity score, only those that meet or exceed a given threshold are considered potential matches. Some FR products allow the end user to set the threshold; others do not. How the threshold is set directly affects the number of results returned in a given search, with implications for the accuracy, including error rates, of the FR algorithm. Depending on the circumstances, some implementations may require higher thresholds than others.
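The effect of the threshold on the candidate list can be seen in a small, hypothetical example. The names and scores below are invented for illustration; real similarity scores depend on the algorithm in use and are not comparable across products.

```python
# How the threshold affects the number of candidates returned.
# Each entry is (candidate_name, similarity_score); values are invented.

scored_candidates = [
    ("candidate_1", 0.97),
    ("candidate_2", 0.88),
    ("candidate_3", 0.71),
    ("candidate_4", 0.52),
]

def candidates_at(threshold, scored):
    """Return only the candidates meeting or exceeding the threshold."""
    return [name for name, score in scored if score >= threshold]

# A lower threshold returns more candidates (more potential matches, but
# also a greater chance of false positives); a higher threshold returns
# fewer (fewer false positives, but a greater chance of missing the
# true match).
broad_list = candidates_at(0.60, scored_candidates)   # 3 candidates
strict_list = candidates_at(0.90, scored_candidates)  # 1 candidate
```

This trade-off between false positives and false negatives is one reason why, as noted above, the appropriate threshold depends on the circumstances of a given implementation.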
  9. Additional FR components not mentioned in the list above include quality assessment and impersonation detection.

Privacy framework

  1. As explained in the introduction, the use of FR technology can pose extremely serious privacy risks. Many of these risks may be difficult to mitigate, and can cause significant harm to individuals and groups.
  2. When considering the use of FR technology, it is therefore imperative that police agencies not only ensure they have lawful authority for the proposed use, but also that they apply standards of privacy protection that are proportionate to the potential harms involved. In some cases, potential harms may be so extreme that no amount of protections can be applied to adequately reduce the privacy risk. In other cases, it may be possible to appropriately manage risks through careful planning and diligent application of privacy protections.
  3. The framework outlined below is intended to assist police agencies in ensuring FR use is legal and developed with privacy protections that are proportionate to the specific risks involved. It is based on the application of internationally-accepted privacy principles, many of which are reflected in privacy laws. While specific legal obligations may vary by jurisdiction, we expect all police agencies to comply with the law and recommend they follow the best practices included in this framework given the high risk of harm that can result from the inappropriate use of FR technology.
  4. Police agencies are ultimately responsible for ensuring any use of FR technology is lawfully authorized and that privacy risks are managed appropriately. This guidance provides a baseline from which to build privacy protections into FR initiatives; police agencies may need to implement additional privacy protections depending on the nature and scope of risks to privacy posed by a specific initiative.

Lawful authority

  1. Police agencies must have the legal authority to use FR, and use it in a manner that respects the privacy rights of Canadians. This section discusses both potential sources of legal authority for the use of FR by police agencies, as well as limits on those potential uses.
  2. Police agencies should obtain a legal opinion on whether they have the authority to implement or operate a proposed FR program, and on whether such a program adequately respects individuals’ rights. Unless both conditions are met, the proposed program should not proceed.
  3. Canadian jurisdictions do not yet have legislation specifically addressing FR technology, with the exception of Quebec, which has enacted legislation governing biometrics.Footnote 2
  4. Since FR involves the collection and use of personal information, it is subject to applicable privacy legislation. Law enforcement must also determine whether FR is compliant with the Charter and human rights laws.Footnote 3 The extent to which these laws permit police use of FR is unclear.

Sources of legal authority

  1. There is no specific legal framework for FR use in Canada. Rather, the legal framework comprises a patchwork of statutes and the common law. Federal and provincial privacy laws provide a starting point for understanding the existing framework by requiring police agencies – or anyone acting on their behalf – to have lawful authority in order to collect and use the personal information of individuals.
  2. As described in the previous section, FR requires that personal information be collected and used at multiple stages, such as: training a FR algorithm, creating a face database, collecting image(s) to be compared against that database, and possibly others. Lawful authority must exist for all steps that implicate personal information. Additionally, where police use vendors or third parties to supply FR services, including FR databases, police must ensure that these suppliers have the lawful authority to collect and use the personal information contained in their services.
  3. Sources of legal authority can include statutes or common law. Please note that the following discussion is primarily for illustrative purposes, and should not be interpreted as a comment on the validity or scope of potential legal authorities.

Judicial authorization

  1. Police agencies may seek and obtain judicial authorization to collect and use faceprints in circumstances that merit such action. The Criminal Code provides for warrants that permit intrusion on individuals’ privacy when a judge is satisfied that there are reasonable grounds to believe that an offence has been or will be committed and that information concerning the offence will be obtained through the use of the technique or device; that it is in the best interests of the administration of justice to do so; and in instances where there is otherwise no statutory basis for doing so.Footnote 4 These authorizations are subject to the usual requirements for obtaining a warrant, in addition to any conditions or limitations imposed by courts when granting them.

Statutory authority

  1. The Identification of Criminals Act allows police agencies to fingerprint or photograph individuals charged with, or convicted of, certain crimes.Footnote 5 It permits these identifiers to be used for the purposes of identifying criminals, and providing information to officers and others engaged in administering and executing the law. The Identification of Criminals Act does not, however, authorize the indiscriminate collection of photographs of other individuals at the broader population level. Legal advice would be required to determine if, and in what circumstances, this Act provides a legal basis for a specific use of FR applied to existing mugshot databases collected under this authority.

Common law authority

  1. Police agencies have a crucial role in furthering public interests such as the preservation of peace, the prevention of crimes, and the administration of justice.Footnote 6 The common law, like statutory authorities, can authorize police actions that infringe on individual liberties in the pursuit of these societal goals. The term “liberty” in discussion of common law police powers encompasses constitutional rights and freedoms such as privacy, discussed further below.Footnote 7
  2. Canadian courts have set out limitations on police powers provided by common law.Footnote 8 In order for a police action to be authorized by common law it must:
    1. fall within the general scope of a statutory or common law police duty; and
    2. involve a justifiable exercise of police powers associated with that duty.Footnote 9
  3. The second requirement entails an assessment of whether the police action “is reasonably necessary for the fulfillment of the duty” and considers three factors:
    1. the importance of the performance of the duty to the public good;
    2. the necessity of the interference with individual liberty for the performance of the duty; and
    3. the extent of the interference with individual liberty.Footnote 10
  4. These factors require that the interference with liberty be necessary given the extent of the risk and the liberty at stake, and no more intrusive to liberty than reasonably necessary to address the risk.Footnote 11
  5. Judicial consideration of police use of FR has so far been limited, and Canadian courts have not had an opportunity to determine whether FR use is permitted by common law.Footnote 12 If FR use interferes with individuals’ reasonable expectations of privacy (“REP”) and is not enabled by a statute or common law, authorization under section 487.01 of the Criminal Code will generally be required for its use.

Respecting Canadians’ rights

  1. While police agencies require legal authority to proceed with a FR program, they must also protect the rights of individuals. Protections can be found in the Charter, as well as federal and provincial privacy legislation.

Privacy legislation

  1. Privacy legislation sets out the conditions under which public agencies may collect, use, disclose, and retain individuals’ personal information. Public institutions, including police agencies, are generally authorized by privacy legislation to collect personal information for valid purposes. For example, some provincial public sector privacy legislation permits the collection of personal information for “law enforcement” purposes.Footnote 13 Police agencies would need to determine whether the collection of personal information by FR falls within the scope of “law enforcement” as defined by the legislation, or is otherwise authorized by one of the other permitted purposes for collection of personal information set out in the legislation. Under federal legislation, the collection of personal information must relate directly to an operating program or activity of the federal institution collecting the personal information.Footnote 14 This means that federal institutions must ensure that they have parliamentary authority for the program or activity for which the information is being collected.Footnote 15
  2. While the requirements for valid collection of personal information vary by jurisdiction, the privacy principles discussed below are found in many privacy laws that will apply. Once collected, police agencies may generally only use personal information for the purpose for which the information was collected or compiled, or for a use consistent with that purpose, unless otherwise authorized. Compliance with privacy statutes does not necessarily cure any legal defect that may exist under the Charter.Footnote 16

The Charter

  1. In addition to a general interest in being left alone and free from police interference,Footnote 17 the Charter gives individuals the right to be secure against unreasonable searches and seizures by police.Footnote 18
  2. To determine whether a police action constitutes an unreasonable search, a court first considers whether a search occurred. This depends on whether an individual had a REP within the context of the possible search, with consideration given to the totality of the circumstances. The analysis is based on a number of interrelated factors, such as the subject matter of the search and the nature of the individual’s privacy interests therein. In the FR context, the subject matter includes not only the source photographs and faceprints themselves, but also any additional information about an individual’s actions and whereabouts that may be revealed by connecting personal identity with video footage, metadata (such as date and time stamps) and other identifying information.Footnote 19 The existence of a REP also depends on the context in which FR use occurs, which may give rise to an individual’s subjective expectation of privacy. Additionally, courts consider whether that expectation is objectively reasonable, informed by the levels of privacy individuals ought to expect in a free and open society; expectations of privacy are, in this sense, not just descriptive but normative in nature.Footnote 20
  3. While citizens’ reasonable expectations regarding the use of FR are not yet clearly established, it is clear that FR uses sensitive personal information that generally cannot be changed, and can be used to identify people for various privacy sensitive purposes. We therefore presume that FR will generally raise a REP, whether online or in person, despite faces being publicly visible. Individuals do not expect to be the subject of surveillance when going about their normal and lawful activities, and generally maintain some degree of a REP even when in public spaces.Footnote 21 Conversely, even if an individual might expect FR to be used, greater privacy invasions do not become socially acceptable simply due to technological advancements or evolving police practices.Footnote 22
  4. If a search occurs in a context where the concerned individual has a REP, a court will then determine whether the search was reasonable. In order for a search to be reasonable, it must be authorized by a reasonable law, and be conducted in a reasonable manner. When a police action is found to be authorized by the common law, it will generally be considered Charter compliant as the tests for common law and Charter compliance are similar.Footnote 23

Necessity and proportionality

  1. The privacy principles of necessity and proportionality ensure that privacy-invasive practices are carried out for a sufficiently important objective, and that they are narrowly tailored so as not to intrude on privacy rights more than is necessary. In the case of law enforcement, there is a clear public interest in ensuring public safety, but also in protecting individuals’ fundamental right to privacy. While the right to privacy is not absolute, neither can the pursuit of public safety justify any form of rights violation. Therefore, police may only use means justifiable in a free and democratic society.
  2. As noted above, necessity and proportionality exist in varying degrees in privacy laws, the common law, and the Charter.Footnote 24 In the context of FR use, necessity and proportionality will generally require an assessment of the following:
  3. Necessary to meet a specific need: Rights are not absolute, and can be limited where necessary to achieve a sufficiently important objective.Footnote 25 “Necessary,” of course, means more than merely useful. Significantly, the objective of an FR program must be defined with some specificity. It is not enough to rely on general public safety objectives to justify such an intrusive technology as FR. Police agencies should demonstrate the pressing and substantial nature of the specific objective through evidence. Further, the personal information collected should not be overbroad; it should be tailored and necessary to achieve the specific goal.
  4. Effectiveness: Police must be able to demonstrate that collection of personal information actually serves the purpose of the objective. Police should provide evidence that demonstrates why the specific use of FR being proposed will likely be effective in meeting specific program objectives. This demonstration of effectiveness should take into account any known accuracy issues associated with the specific use.
  5. Minimal Impairment: Police agencies’ intrusion on individuals’ privacy must not extend beyond what is reasonably necessary to achieve the state’s legitimate objective.Footnote 26 The scope of a program should be as narrow as possible. If using FR, police agencies should be able to demonstrate that there are no other less privacy invasive means that will reasonably achieve the objective,Footnote 27 and be able to show evidence as to why less privacy invasive measures are not used.Footnote 28
  6. Proportionality: This stage requires an assessment of whether the intrusion on privacy caused by the program is proportional to the benefit gained.Footnote 29 Police agencies should therefore seek, first, to identify how their specific use of FR will impact the privacy of individuals, having regard to general factors such as those mentioned in the introduction, but also impacts specific to their intended use of FR, for instance on certain groups. Then, police agencies should consider whether these intrusions on privacy are justified by the benefits of the specific deployment of FR. Inherent in this step is the recognition that not all objectives carry the same weight. For example, preventing a known terrorist plot would justify a greater privacy intrusion than catching someone who committed a minor act of vandalism. In making this assessment, law enforcement agencies must be open to the possibility that, in a free and democratic society, a proposed FR system which has a substantial impact on privacy (such as via mass surveillance) may never be proportional to the benefits gained. Where the impact is substantial, law enforcement agencies should be particularly cautious about proceeding in the absence of clear, comprehensive legal safeguards and controls capable of protecting the privacy and human rights of members of the general public. Seeking warrants and court authorizations can assist with ensuring that a proposed FR use meets the proportionality standard.
  7. Recall that these privacy principles repeat and overlap with legal authorities as well as individuals’ privacy rights. These recurrent themes reinforce the need for police agencies to respect the limits of law enforcement powers, and to ensure that the goals of public safety and respect for privacy are achieved together.

Designing for privacy

  1. It is important to design initiatives with privacy protections built in from the outset. This is commonly referred to as “privacy by design”. Following a privacy by design approach will help to ensure that privacy protection is an essential component of any FR initiative or system. To be most effective, such protections must be incorporated during initial conception and planning, following through to execution, deployment, and beyond.
  2. Implementing privacy by design means police agencies must formally integrate privacy protections before engaging in any use of FR technology. Privacy protections must also be designed to protect all personal information involved in an initiative, including training data, faceprints, source images, face databases, and intelligence inferred from FR searches, in addition to any other personal information that may be collected, used, disclosed, or retained.

Privacy impact assessments

  1. A key element of putting privacy by design into practice is to conduct a privacy impact assessment (PIA). A PIA is a widely accepted tool for analyzing and addressing the impacts of initiatives on privacy. When used properly, PIAs help to ensure that programs and activities meet legal requirements and mitigate privacy risks.
  2. Police agencies should conduct a PIA before implementing or making substantial modifications to initiatives that involve the collection, use, or disclosure of personal information, including pilot projects. In some Canadian jurisdictions, government institutions are required to conduct PIAs by policy or legislation.
  3. When conducting a PIA, police agencies are expected to:
    • Conduct the PIA in accordance with any applicable legislative and policy requirements
    • Follow any guidance issued by the relevant privacy commissioner on the PIA processFootnote 30
      • Where no such guidance exists, police agencies may consult with the oversight body in their jurisdiction
    • Document the PIA process in a PIA report
    • Mitigate any risks raised in the PIA and designate an individual responsible for managing residual risks
    • Publish a summary of the completed PIA report before deploying FR, and update it if planning and implementation of the initiative evolve
    • Conduct a new PIA (or, if appropriate, amend the existing PIA) if major changes are made to the initiative that could impact the collection, use, disclosure, or retention of personal information
  4. As police agencies assess privacy risks through the PIA process, they should consider all relevant privacy risks. This includes assessing the potential impacts of the initiative on:
    • Individuals
    • Communities in which FR may be deployed
    • Groups that may be disproportionately harmed by privacy incursions
    • Public confidence in the collection and use of personal information by law enforcement agencies
    • Human and democratic rights, including rights to privacy, equality, peaceful assembly and free expression
  5. After assessing these potential impacts, police agencies should not proceed with further planning and implementation of the initiative unless they can clearly explain:
    1. Why the proposed use of FR is necessary to meet a specific need that is rationally connected to a pressing or substantial public goal
    2. What the expected benefits of the initiative consist of, and how they are proportionate to the risks involved
    3. Why other less intrusive measures are not sufficient
    4. How risks will be minimized during implementation of the initiative
  6. Police agencies should document explanations for the above along with their risk assessment in the PIA report. To assist in doing so effectively, police agencies should:
    • Engage with relevant stakeholders and privacy experts about the potential privacy impacts of the proposed initiative
    • Consult their privacy commissioner’s office early in the planning phase of the initiative, in jurisdictions where these offices offer advisory services
    • Consult their human rights commissioner’s office when planning the initiative, given the strong connection between privacy rights and broader human rights, including the right to be free from discrimination
    • Ensure the PIA is conducted by individuals with appropriate competencies in identifying and analyzing privacy risks. This process should involve key parties in the initiative, which may include:
      • legal counsel
      • privacy staff
      • program staff (those who will administer the initiative)
      • stakeholder groups (those who may be affected by the initiative)
      • technical experts
      • management
      • third party service providers

Monitoring and re-assessment

  1. Privacy risk analysis is an ongoing process that does not end with the completion of a PIA or the deployment of an initiative. PIAs should be treated as evergreen documents that support the ongoing management of privacy risks as part of a police agency’s overall risk management strategy.
  2. Police agencies should monitor and re-assess privacy risks, and the effectiveness of privacy protections, on an ongoing basis. This can be done by implementing the following best practices as FR initiatives are deployed:
    • Audit the initiative at regular intervals
      • Audits should address both the initiative’s compliance with legal requirements, and program operators’ adherence to the policies and procedures set out for the initiative (see also recommendations in the Accountability section below)
      • Audits should also address compliance by third parties with the terms and conditions of information sharing and service agreements
    • Conduct regular reviews (for example, annually) of program effectiveness
      • Reviews should assess the extent to which program activities are achieving the goals of the initiative, using demonstrable criteria (for example, the number of arrests or convictions resulting from the program)
    • Review and update safeguards, policies, and procedures as needed and as appropriate to ensure continued compliance with privacy responsibilities
      • For instance, agencies may need to adjust policies in light of audits, program reviews, breaches, law reform, new guidance or technological developments
    • Review and, where necessary, update information sharing and service agreements with third parties
    • Review and update training procedures as needed and as appropriate
      • Ensure that changes to policies and procedures are communicated promptly to relevant personnel
    • Monitor information holdings regularly (for example, databases of personal information) to ensure records are being retained and destroyed in accordance with established policies and procedures
    • Document any changes to the program in the PIA
    • Engage with external stakeholders throughout deployment (for example, privacy experts, community groups, civil society organizations)
      • Stakeholders can be a valuable source of feedback about the privacy impacts of initiatives


Accuracy

  1. Police agencies must ensure that personal information collected and used as part of a FR initiative is sufficiently accurate and up to date. The accuracy of FR software cannot be taken for granted, given the serious risks to individuals’ rights posed by the collection and use of inaccurate information.
  2. Fulfilling accuracy obligations within the context of a FR initiative requires consideration of the FR system as a whole. A FR system comprises a number of components, each of which raises unique considerations. Only when the constituent parts of a FR system process personal information accurately and fairly can the system as a whole be said to do the same.
  3. With respect to training data, one of the main considerations is the role it may play in contributing to bias in the FR system. If the training data used to generate a FR algorithm lacks sufficient representation of faces from certain demographics, the algorithm will likely produce disparate accuracy rates across those groups; more generally, an algorithm trained on non-representative or otherwise biased data can produce flawed results. Studies have demonstrated considerable variation in FR algorithms with respect to the error rates they produce for faces of individuals from different racial backgrounds and across genders,Footnote 31 with other research showing that a lack of diverse, high-quality training data is the main culprit.Footnote 32
  4. Regarding the FR algorithm, there are three key considerations to be aware of with respect to accuracy. The first is that accuracy is understood statistically. The output of a FR algorithm is a probabilistic inference as to the likelihood that two images are of the same person. It is not a verified fact about the individual. As such, accuracy is not a binary “true/false” measure, but rather is computed based on the observed error rates of the algorithm across searches. There are two types of errors to consider:
    1. false positives (also known as “type I” errors) where the algorithm returns a candidate match in the face database that is not of the individual in the probe image; and
    2. false negatives (also known as “type II” errors) where the algorithm fails to return a genuine match in the face database even though the database contains one.
  5. The second consideration is that there is generally a trade-off between the false positive rate and the false negative rate of a FR algorithm. The reason relates to another component: the threshold for a probable match. Depending on how high (or low) the threshold is set, a FR algorithm will generally return fewer (or more) candidate matches, and the number of results returned has implications for its error rates. A higher threshold returns only higher-probability candidates and so leads to fewer false positives, but it also makes the algorithm more likely to miss genuine lower-probability matches, potentially leading to more false negatives.
  6. Lastly, it is important to consider that the determination of an appropriate threshold will depend on the nature, scope, context and purpose of the FR initiative, taking into account the risks to the rights and freedoms of individuals. Strictly speaking, there is no single appropriate threshold. It is a matter of prioritizing the reduction of certain types of errors based on the nature and severity of risks they pose to individuals, while at the same time ensuring the overall effectiveness of the FR system.
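The threshold trade-off described above can be illustrated with a small sketch. The similarity scores below are invented for illustration; a real FR system would compute error rates over large benchmark datasets rather than a handful of pairs.

```python
# Illustrative sketch only: the similarity scores below are invented, and a
# real FR system would compute error rates over large benchmark datasets.

# Hypothetical similarity scores (0 to 1) for pairs that ARE the same
# person (genuine pairs) and pairs that are NOT (impostor pairs).
genuine = [0.91, 0.84, 0.78, 0.66, 0.58]
impostor = [0.72, 0.61, 0.45, 0.33, 0.20]

def error_rates(threshold):
    """Observed error rates when a match requires score >= threshold."""
    # False positive: an impostor pair scores at or above the threshold.
    fp = sum(s >= threshold for s in impostor) / len(impostor)
    # False negative: a genuine pair scores below the threshold.
    fn = sum(s < threshold for s in genuine) / len(genuine)
    return fp, fn

# Raising the threshold trades false positives for false negatives.
for t in (0.5, 0.7, 0.9):
    fp, fn = error_rates(t)
    print(f"threshold={t:.1f}  false positives={fp:.2f}  false negatives={fn:.2f}")
```

Note how no single threshold minimizes both error types at once, which is why the next paragraph frames threshold selection as a context-dependent prioritization exercise.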
  7. The face database is another component that raises important issues regarding accuracy and fairness. One consideration is the quality and age of the images and the effects these may have on the accuracy of the FR system. For example, studies have shown that false negative rates increase with the time elapsed between two images of the same individual.Footnote 33 It is equally important to consider the demographics of who is in the face database and whether the disproportionate representation of certain groups may lead to adverse effects. A FR system may be susceptible to a “feedback loop” in which the makeup of individuals in a face database leads police to repeatedly cast suspicion on them, their associates or their community, thereby increasing the disproportionality of their demographic representation over time.
  8. A final component to mention is human review. While human review serves as an important mitigation measure to reduce the risks of inaccuracy and bias, it may also inadvertently reintroduce those very risks into the FR system. Unlike computers, humans can be overwhelmed by too much information. Effective human review requires appropriate training, and reviewers must be given sufficient, unrushed time proportionate to the number of candidate matches they are expected to adjudicate. Even with sufficient training and time, however, human review may be unduly influenced by the apparent statistical precision of the FR system. It is important to avoid “automation bias”, the tendency to over-rely on automated systems when making human decisions. The fact that a FR system uses mathematical calculations does not mean that its predictions are accurate or fair.
  9. Given the above, it is imperative that police agencies take steps to minimize inaccuracy and bias in any deployment of FR technology. These steps should include the following best practices:
  10. Police agencies should require FR vendors to:
    • Make their FR algorithms available for independent external testing
      • The testing should include an evaluation of the accuracy of the algorithm as well as its performance across distinct socio-demographic populations (for example, racial, gender and age groups)
    • Indicate, in the results of each FR search, the similarity score; that is, an estimate of the probability that a given match is accurate (for example, as a percentage).
  11. Police agencies should also:
    • Determine an appropriate threshold so as to prioritize reducing certain types of error based on the nature and severity of risks, while ensuring the overall effectiveness of the FR system
    • Internally test for bias and inaccuracy in the performance of the FR system as a whole before deployment, and routinely during deployment
    • Ensure testing is carried out by individuals or organizations qualified to independently assess the performance of FR systems
    • Include the similarity score, as indicated in FR search results, when recording or disclosing information about a match
    • Discontinue FR use if internal or external testing reveals:
      • insufficient statistical accuracy in the FR system, or
      • meaningful variation in error rates across socio-demographic populations
  12. Police agencies should not:
    • Fully automate administrative decisions based on the results of FR matching procedures
      • In other words, decisions that affect legal rights, privileges, or interests should be made by humans, including, for example, decisions to detain, investigate or charge individuals in relation to a crime
    • Act on a FR match unless that match has been reviewed within a suitable timeframe by an officer trained in FR identification procedures and limitations
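As a rough illustration of the disaggregated testing recommended above, the following sketch compares false positive rates across demographic groups and flags meaningful variation. The group labels, counts and tolerance ratio are hypothetical assumptions, not thresholds set by this guidance.

```python
# Hypothetical sketch of a disaggregated error-rate check. The group labels,
# counts, and the tolerance ratio are invented for illustration.

# Per-group counts of false positives out of total impostor comparisons,
# as might be produced by internal testing of a FR system.
results = {
    "group_a": {"false_positives": 4, "impostor_comparisons": 1000},
    "group_b": {"false_positives": 21, "impostor_comparisons": 1000},
}

def disparity_check(results, max_ratio=1.5):
    """Flag meaningful variation: worst group error rate vs. best group."""
    rates = {g: r["false_positives"] / r["impostor_comparisons"]
             for g, r in results.items()}
    worst, best = max(rates.values()), min(rates.values())
    # Guard against division by zero when the best group has no errors.
    ratio = worst / best if best > 0 else float("inf")
    return rates, ratio, ratio > max_ratio

rates, ratio, flagged = disparity_check(results)
print(rates)           # per-group false positive rates
print(ratio, flagged)  # ratio exceeds the tolerance, so variation is flagged
```

Under the guidance above, a flagged result of this kind would weigh in favour of discontinuing FR use until the variation is addressed.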

Data minimization

  1. Police agencies must limit the collection of personal information to that which is directly relevant and necessary for the specific objectives of a FR initiative.
  2. To assist in doing so, police agencies should implement the following practices for data minimization:
    • Limit the personal information collected and used for each task to what is needed to complete that task
      • Images used to perform a FR search should be cropped to prevent identification of individuals in the image who are not the subject of the search.
    • Promptly and irretrievably discard personal information that falls outside the scope of the initiative, including personal information collected inadvertently during the initiative
    • Include a policy framework supported by mechanisms to systematically verify that data collected by the initiative falls within the initiative’s lawful authority to collect
  3. When collecting and storing personal information in FR initiatives, police agencies should:
    • Not cross-link the information with personal information contained in other databases, except to the extent that doing so is necessary to fulfill the lawful objectives of the initiative
    • To the greatest extent possible, store the information in separate databases from other personal information, and isolate those databases from other networks
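The cropping practice described above can be sketched in a few lines. The pixel grid and bounding box below are synthetic stand-ins for a real image file and a detected face region.

```python
# Library-free sketch of cropping a probe image before a FR search. The
# "image" here is a grid of pixels (nested lists); a real deployment would
# crop actual image files, but the minimization principle is the same.

def crop(image, left, upper, right, lower):
    """Keep only the region needed for the FR search."""
    return [row[left:right] for row in image[upper:lower]]

# Synthetic 8x8 "frame" that may show bystanders as well as the subject.
frame = [[(x, y) for x in range(8)] for y in range(8)]

# Hypothetical bounding box around the search subject's face only.
probe = crop(frame, left=2, upper=1, right=6, lower=5)

# Only the 4x4 face region is retained; the rest of the frame, which could
# identify uninvolved individuals, is never submitted for matching.
print(len(probe), len(probe[0]))
```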

Purpose limitation

  1. Police agencies must ensure that personal information is only used for the purpose for which it was collected, or a purpose consistent with that purpose. They must also ensure that each use of personal information falls within the scope of the initiative’s lawful authority.
  2. To assist in meeting these requirements, police agencies should implement a comprehensive set of administrative, technical, and physical controls for managing access to, and use of, data and software used in FR initiatives.
  3. These controls should include:
    • A mechanism for officers to inform senior management about new investigative tools that may involve the collection or use of biometric information
      • While management approval should be required before using new investigative tools, approval does not replace applicable privacy requirements, including the conduct of a PIA
    • A system for authorizing individuals to access and use FR software and related databases only as necessary to fulfill objectives of the initiative
    • A protocol that specifies the circumstances under which officers are authorized to perform a FR search
    • A system for documenting decisions to initiate a FR search
    • Standard operating procedures for performing a FR search, including instructions specifying what data may be used and how the search is to be performed
    • A mechanism that holds authorized individuals meaningfully accountable for misuse of FR software or related data, whether intentional or accidental
  4. These controls should prevent:
    • Access to FR software and data by unauthorized individuals
    • Use of FR software or data for unauthorized purposes
    • Use of any FR software outside of a lawful initiative that is approved and overseen by senior management
    • Experimentation with new biometric technologies on real-world data, outside of lawful initiatives that are approved and overseen by senior management
  5. Police agencies must also ensure that third parties engaged on their behalf do not use personal information transferred to them during an initiative for purposes other than those that are consistent with the original purpose of collection. For example, if a police agency transfers an image to a third party FR software vendor for identification purposes, the police agency must take reasonable steps to ensure the vendor does not use the image (or associated faceprint data) as training data, or enter that data into a face database for comparison in future searches.
  6. To help meet this requirement, police agencies should use information sharing agreements (ISAs) to codify limitations on the use of personal information disclosed to third parties during an initiative, as well as other privacy protections.Footnote 34 This includes disclosures to FR software vendors of images to be matched, as well as disclosures to any other organizations (for example, other law enforcement bodies) of images, databases, or other personal information.
  7. Subject to specific legal requirements in each jurisdiction, ISAs should specify, at a minimum:
    • The lawful authority under which information is to be disclosed
    • The specific elements of personal information being disclosed
    • The specific purposes for the disclosure
    • Limits on use and onward transfer
    • Specific safeguarding measures
    • Data localization requirements, if any
    • Data breach procedures
    • Retention periods and data destruction requirements
    • Accountability measures, including compliance monitoring

Data security

  1. Personal information must be protected by appropriate security measures relative to the sensitivity of the information.
  2. Given the high sensitivity of facial data, police agencies are expected to implement highly protective security measures as part of FR initiatives. At a minimum, these should include the following measures, though additional safeguards may be required in some cases:
    • Use encryption and other digital safeguarding tools to secure data while in storage and while in transit between databases, servers and end-user devices
    • Ensure records and equipment, including memory disks, servers and end-user devices, are used and stored only in secure physical locations
    • Log all access to, and use of, FR software and databases
    • Review and update security measures regularly to address evolving security threats and vulnerabilities
    • Use ISAs to ensure any third parties involved in the initiative follow relevant best practices for data security
  3. Data security obligations may also require that all personal information collected or created by police during a FR initiative be stored in Canada. In some Canadian jurisdictions, police agencies are explicitly required to do so by law or policy instruments.Footnote 35 In others, they must first ensure that personal information released outside the jurisdiction will receive equivalent protection.Footnote 36
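The first measure above (encryption in storage and in transit) can be sketched with the third-party `cryptography` library's Fernet recipe for authenticated symmetric encryption. The library choice, record contents and field names are illustrative assumptions, not part of this guidance, and real key management would be far more involved.

```python
# Illustrative only: encrypting a faceprint record at rest using the
# third-party "cryptography" package's Fernet recipe (AES-based
# authenticated encryption). Key management details are omitted; in a real
# deployment the key would live in a hardware security module or managed
# key store, never alongside the data it protects.

from cryptography.fernet import Fernet

key = Fernet.generate_key()  # stored separately from the encrypted data
f = Fernet(key)

faceprint_record = b'{"subject": "probe-7731", "vector": "..."}'

token = f.encrypt(faceprint_record)  # ciphertext safe to store
restored = f.decrypt(token)          # requires the key; fails if tampered

print(restored == faceprint_record)  # True
```

Because Fernet tokens are authenticated, decryption also detects tampering, which complements the logging and review measures listed above.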


Retention

  1. Police agencies should not retain personal information for longer than is necessary to fulfill the purposes of an initiative.
  2. Given the high sensitivity of facial data, it is especially important that police agencies promptly and securely destroy any personal information that does not need to be kept.
  3. In FR initiatives, some personal information may be needed for longer than other personal information. For instance, retention periods may vary for:
    • Original media from which facial data was collected (for example, a digital image or video)
    • Faceprints created by FR software when analyzing an image
    • Intelligence inferred from the outputs of FR analysis
  4. Appropriate retention periods for different components of FR data may also vary depending on the context of FR use. For example, it may sometimes be necessary to retain images or faceprints of individuals who are identified as persons of interest to the police, but images and faceprints collected from the general population should be promptly destroyed unless there is a specific and lawful purpose for retaining them, or another legal requirement to do so. Similarly, it may sometimes be necessary to retain faceprints or video surveillance data for the duration of an active police investigation, but faceprint or video surveillance data that is not relevant to a police investigation should be destroyed.
  5. In order to ensure personal information is not retained for longer than it is needed in FR initiatives, police agencies should:
    • Identify applicable retention periods for personal information at the outset of its collection
    • Apply different retention periods to different forms of personal information, as appropriate
    • Conduct reviews of data holdings at regular intervals to identify personal information that may have been retained unnecessarily
    • Implement protocols to ensure personal information is destroyed securely and promptly upon expiry of the applicable retention period
    • Use ISAs to ensure any third parties involved in the initiative destroy personal information promptly and securely at the end of the applicable retention period
    • Implement wind-down protocols for destroying personal information if the initiative is discontinued
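Differentiated retention periods like those described above can be enforced programmatically. The data classes, periods and record layout in this sketch are illustrative assumptions only; actual periods would be set by each agency's retention schedule.

```python
# Illustrative sketch of enforcing differentiated retention periods: each
# class of FR data carries its own period, and expired records are
# identified for secure destruction. Periods and fields are invented.

from datetime import date, timedelta

# Hypothetical retention periods per data class (cf. the examples above).
RETENTION = {
    "source_image": timedelta(days=30),
    "faceprint": timedelta(days=90),
    "inferred_intelligence": timedelta(days=365),
}

records = [
    {"id": 1, "kind": "source_image", "collected": date(2023, 1, 1)},
    {"id": 2, "kind": "faceprint", "collected": date(2023, 1, 1)},
]

def expired(records, today):
    """Return the records whose retention period has lapsed."""
    return [r for r in records
            if today - r["collected"] > RETENTION[r["kind"]]]

# On 2023-03-01 (59 days later) the source image's 30-day period has
# lapsed, while the faceprint's 90-day period has not.
for r in expired(records, date(2023, 3, 1)):
    print("destroy record", r["id"])
```

A review of this kind would run at the regular intervals recommended above, with destruction itself carried out through a secure deletion protocol.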

Openness, transparency and individual access

  1. Wherever possible, individuals and the public must be informed of the purpose of the collection of their personal information, including how the information may be used or disclosed.
  2. In general, when FR technology is used, individuals should be informed at the time of collection that their image may be collected and stored in a face database, and of the purpose(s) for which their image is collected. Such transparency measures are important, in part, because they help facilitate individuals’ right to request access to their recorded personal information.
  3. Police agencies should implement policies and procedures to accommodate access requests whenever possible, including any time faceprint data is collected from the general public.
  4. In the context of police initiatives, however, it may not always be possible to provide individuals with access to complete information about the collection of their personal information, for example when individuals are the subject of an ongoing investigation.
  5. Given the sensitivity of facial data, and the privacy risks implicated by FR use, it is therefore especially important for police agencies to implement transparency measures at the program level for FR initiatives. Doing so can help to inform the public about FR initiatives, and can increase public confidence that such initiatives are being implemented responsibly.
  6. Police agencies should implement the following transparency measures in FR initiatives:
    • Disclose the initiative on the police agency’s public website, including an explanation of the initiative and its objectives, as well as a link to the PIA summary
    • Update public disclosure of the initiative regularly as the initiative moves from planning and development to implementation
    • Publish regular reports on program activity
      • Reports should include:
        • statistics on the number of searches performed in a given period
        • the purpose for which those searches were performed
        • statistics concerning the effectiveness of the initiative, in light of its objectives
        • the results of any accuracy or bias testing performed by the police agency, with justification for any variations across groups
    • Make data on FR system use, including search data, available for oversight purposes


Accountability

  1. Police agencies are responsible for personal information under their control and should be able to demonstrate their compliance with legal requirements.
  2. An accountable police agency should have a privacy management program in place, with clear structures, policies, systems and procedures to distribute privacy responsibilities, coordinate privacy work, manage privacy risks and ensure compliance with privacy laws.
  3. To help ensure accountability for FR initiatives, police agencies should implement the following accountability measures. These should be considered a baseline from which to approach program accountability, rather than an exhaustive checklist:
    • Have policies and procedures in place for handling personal information that is collected, used, created, disclosed and retained over the course of an initiative
      • Where appropriate, policies and procedures specific to FR initiatives should be integrated into the police agency’s overall privacy management program
    • Establish a clear reporting structure for the initiative that designates an individual responsible for overseeing compliance with privacy obligations
    • Establish a specialized training program for individuals authorized to use FR software
      • Completion of the training program should be required in order to gain authorization to access and use FR software and related databases
    • Log all uses of FR software, including all searches and the credentials of individuals who performed each search
      • The logging procedure should be automated, and beyond the control of individuals who access and use the system to perform FR searches.
      • Logging FR use is a key means of facilitating oversight functions by public bodies; as such, logs should be made available to oversight bodies on request
    • Log all disclosures of personal information, with a record of the authority under which the information was disclosed (including reference to the governing ISA), to whom the information was disclosed, the means of disclosure, any conditions attached to the disclosure, and the identity of the program operator who authorized the disclosure
    • Undertake periodic reviews of program activity, including assessment of compliance with privacy requirements and effectiveness of the initiative in meeting program objectives
    • Establish a clear protocol for redressing non-compliance with the policies and procedures set out in the initiative
    • Update the PIA if major changes are made to the initiative that could impact the collection, use, or retention of personal information
  4. As well, police agencies should provide appropriate training to all individuals with access to FR software and related databases, including training on policies and procedures for handling FR data. Doing so can help to ensure program operators adhere to the initiative’s policies and procedures for collecting, storing, using and disclosing personal information.
  5. As part of this training, police agencies should require program operators to understand and analyze the privacy risks associated with the initiative, including the limitations associated with FR technology. These include:
    • The potential for bias in FR matching procedures on the basis of race, gender and other relevant demographic characteristics.
    • The potential for errors due to low-quality probe images or outdated images in the face database
    • The importance of human review of FR matches for the prevention of automation bias
  6. Police agencies should update training as needed to ensure program operators always have sufficient knowledge, competencies and experience to fulfill their duties in compliance with legal requirements.
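The automated, operator-independent logging recommended above can be sketched as a hash-chained (tamper-evident) log. The field names and chaining scheme are one possible design, assumed for illustration, not a requirement of this guidance.

```python
# Simplified sketch of automated FR use-logging. Field names are invented.
# Chaining each entry to a hash of the previous one is one way to make the
# log tamper-evident, so operators cannot silently alter past entries.

import hashlib
import json
from datetime import datetime, timezone

log = []

def record_search(operator_id, purpose, probe_ref):
    """Append a FR search event; runs outside the operator's control."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "time": datetime.now(timezone.utc).isoformat(),
        "operator": operator_id,
        "purpose": purpose,
        "probe": probe_ref,
        "prev": prev_hash,
    }
    # Each entry commits to its predecessor, so the whole chain can be
    # re-verified by an oversight body on request.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

record_search("officer-042", "missing person investigation", "probe-7731")
record_search("officer-108", "suspect identification", "probe-7732")

print(len(log), log[1]["prev"] == log[0]["hash"])
```

In a deployment, entries would be written to append-only storage rather than an in-memory list, keeping the log beyond the reach of the individuals whose searches it records.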


Conclusion

  1. Given the significant risks posed by FR, we expect police agencies to assess the risks associated with any contemplated use of FR and to mitigate potential harms by building appropriate privacy protections into the design of proposed initiatives. If police agencies proceed with FR use, they must follow through on privacy protection over the lifespan of the initiative.
  2. Above all, police agencies must ensure that any use of FR complies with the law. While specific legal requirements vary by jurisdiction, the recommendations in this guidance document can help to ensure proposed uses of FR meet legal requirements, minimize privacy risks, and respect Canadians’ fundamental right to privacy.

Summary of Recommendations

Below is an abbreviated summary of key recommendations made in this guidance document. The summary is compiled for reference purposes only; please consult the guidance document for full recommendations.

When proposing, developing, and implementing initiatives involving the use of FR technology, we recommend police agencies:Footnote 37

  • Ensure lawful authority exists for each collection, use, retention, and disclosure of personal information.
    • Lawful authority must exist for all steps of FR use, including training a FR algorithm, creating a face database, and collecting probe images.
    • Ensure any third parties involved in the collection or use of personal information operate with lawful authority.
  • Design for privacy
    • Integrate privacy protections into proposed initiatives before using FR technology.
    • Conduct a PIA to help ensure FR programs meet legal requirements and to identify and mitigate privacy risks.
    • Monitor and re-assess privacy risks and the effectiveness of privacy protections on an ongoing basis.
  • Ensure personal information is accurate and up to date.
    • Test data and systems as appropriate to identify and reduce inaccuracy and bias.
    • Keep a “human in the loop” to review FR matches.
  • Minimize collection of personal information to that which is directly relevant and necessary based on the objectives of the initiative.
  • Ensure personal information is used only for the purpose for which it was collected, or a purpose consistent with that purpose.
    • Implement administrative, technical, and physical controls for managing access to, and use of, FR data and software.
    • Use ISAs to limit use of personal information disclosed to third parties.
  • Protect personal information using security measures that are appropriate given the sensitivity of the information and use ISAs to ensure third parties are required to do the same.
  • Retain personal information for no longer than is necessary to fulfill the purposes of the initiative (or as otherwise required by law).
    • Appropriate retention periods for different components of FR data may vary depending on their use.
  • Implement openness and transparency measures as appropriate to allow individuals and the public to be informed about the initiative.
    • Implement policies and procedures to accommodate access requests whenever possible.
  • Implement effective accountability measures.
    • Implement a privacy management program with clear structures, policies, systems and procedures to distribute privacy responsibilities, coordinate privacy work, manage privacy risks and ensure compliance with privacy laws.
    • Log all uses of FR, including all disclosures of personal information outside the organization.
    • Ensure all personnel with access to FR systems are appropriately trained.