Police use of Facial Recognition Technology in Canada and the way forward
Special report to Parliament on the OPC’s investigation into the RCMP’s use of Clearview AI and draft joint guidance for law enforcement agencies considering the use of facial recognition technology
June 10, 2021
Office of the Privacy Commissioner of Canada
30 Victoria Street
Gatineau, Quebec K1A 1H3
© Her Majesty the Queen in Right of Canada, for the Office of the Privacy
Commissioner of Canada, 2021
Cat. No. IP54-110/2021E-PDF
Letter to the Speaker of the Senate
The Honourable George J. Furey, Senator
Senate of Canada
Ottawa, Ontario K1A 0A4
Dear Mr. Speaker:
I have the honour to submit to Parliament the Special Report of the Office of the Privacy Commissioner of Canada entitled Police Use of Facial Recognition Technology in Canada and the Way Forward. This tabling is done pursuant to section 39(1) of the Privacy Act.
Original signed by
Letter to the Speaker of the House of Commons
The Honourable Anthony Rota, M.P.
House of Commons
Ottawa, Ontario K1A 0A6
Dear Mr. Speaker:
I have the honour to submit to Parliament the Special Report of the Office of the Privacy Commissioner of Canada entitled Police Use of Facial Recognition Technology in Canada and the Way Forward. This tabling is done pursuant to section 39(1) of the Privacy Act.
Original signed by
Facial Recognition Technology (FRT) has emerged as a powerful tool of significant interest to both law enforcement and commercial entities. Used responsibly and in the right circumstances, it has the potential to offer great benefits to society. For instance, it can support national security objectives, assist police in solving crime or help authorities find missing persons.
The technology scales easily, costs relatively little to use, and can be deployed as an add-on to existing surveillance infrastructure, which may explain its growing appeal in Canada and abroad, particularly among police agencies.
At the same time, facial recognition can be a highly invasive surveillance technology fraught with many risks.
Studies have shown that it can provide racially biased results and, given the chilling effect it can have on certain activities, it has the potential to erode privacy and undermine freedoms and human rights such as free expression and peaceful assembly. Repositories of FRT data are also high value targets for malicious actors and must be safeguarded accordingly.
FRT involves the collection and processing of very sensitive personal information. Biometric facial data is unique to each individual, unlikely to vary significantly over time, and difficult to change in its underlying features. Coupled with large data sources such as the Internet, government databanks or closed circuit television, it can be a powerful intelligence and tracking tool.
The data involved in FRT speaks to the very core of individual identity and as both commercial and government use of the technology expands, it raises important questions about the kind of society we want to live in. The deployment of FRT writ large is worthy of closer examination as to whether our laws adequately protect Canadians from potential misuses of the technology. The focus of this report, however, is on the application of privacy laws and best practices to the use of FRT by police.
OPC actions and next steps
This Special Report to Parliament includes the findings of our investigation of the RCMP’s use of Clearview AI, a technology company that has offered FRT services to law enforcement and some private organizations.
Clearview AI itself was the subject of a previous investigation by the OPC, the results of which were published in February 2021.
Also included in this Special Report is draft privacy guidance on facial recognition for police agencies. A joint initiative with our counterparts in each province and territory in Canada, the guidance seeks to clarify police agencies’ privacy obligations relating to the use of FRT, with a view to ensuring that any use of this technology complies with privacy laws, minimizes privacy risks and respects privacy rights.
We will be consulting with police forces and other stakeholders on the guidance in the weeks and months ahead. It will be important to have a public discussion on how this technology that is potentially useful, but also comes with significant risks, should be used.
Along with an earlier investigation into Cadillac Fairview – a commercial real estate company that embedded cameras inside digital information kiosks at shopping malls to estimate the gender and age of shoppers, without their knowledge or consent – we hope our work contributes to the important conversations taking place regarding the regulation of potentially disruptive technologies such as FRT.
We welcome the fact that Parliamentarians are currently seized with the issue of FRT. Just last month, I was invited before the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI) to discuss privacy concerns related to the technology. Committee members expressed a keen interest in our investigations into the use of FRT in the context of law enforcement.
We felt it would be a missed opportunity and a disservice to Canadians not to share details of our investigation into the RCMP’s use of Clearview AI’s facial recognition technology in a timely fashion, especially as the government looks to modernize Canada’s privacy regime.
Overview of investigation into RCMP’s use of Clearview AI
Our investigation of the RCMP’s use of FRT, the full details of which appear later in this special report, is linked to a separate investigation of Clearview AI.
In that investigation, we found the company’s technology allowed law enforcement and commercial organizations to match photographs of people against the company’s databank of more than three billion images scraped from internet websites without users’ consent.
The result was that billions of people essentially found themselves in a “24/7” police line-up. We concluded this represented mass surveillance and was a clear violation of the Personal Information Protection and Electronic Documents Act (PIPEDA), Canada’s federal private sector privacy law.
Now, in our investigation of the RCMP, we found that Canada’s national police force contravened the Privacy Act, which applies to federal government institutions, when it collected personal information from Clearview AI. In essence, a government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.
We were also concerned that the RCMP at first erroneously told our office it was not using Clearview AI. When it later acknowledged its use, it said publicly it had only used the company’s technology in a limited way, primarily for identifying, locating and rescuing children who have been, or are, victims of online sexual abuse.
However, our investigation found the RCMP did not satisfactorily account for the vast majority of the searches it made.
This highlights what our investigation revealed in more detail: that the RCMP has serious and systemic gaps in its policies and systems to track, identify, assess and control novel collections of personal information. Such system checks are critical to ensuring that the RCMP complies with the law when it uses new technology such as FRT, and new sources, such as private databases.
After we launched our investigation, the RCMP issued internal guidance to staff to restrict the use of Clearview AI and initiated a pilot “National Technology Onboarding Program” intended to systematically examine the compliance of new investigative techniques with the Privacy Act and the Canadian Charter of Rights and Freedoms.
The RCMP is no longer using Clearview AI as the company ceased to offer its services in Canada in July 2020 in the wake of our then ongoing investigation. However, we remain concerned that the RCMP did not agree with our conclusion that it contravened the Privacy Act. The RCMP argued that section 4 of the Privacy Act does not expressly impose a duty to confirm the legal basis for the collection of personal information by its private sector partners, and maintained that requiring it to ensure a third party’s legal compliance with PIPEDA would create an unreasonable obligation.
Nonetheless, the RCMP agreed to implement our recommendations to improve its policies, systems and training. This includes conducting thorough privacy assessments of third party data collection practices to ensure any personal information is collected and used in accordance with Canadian privacy legislation.
Activities of federal institutions must be limited to those that fall within their legal authority and respect the general rule of law. We encourage Parliament to amend the Privacy Act to clarify that the RCMP has an obligation to ensure that third party agents it collects personal information from have acted lawfully.
Further, the common law clearly sets limits on the RCMP’s collection powers as a police body. In our view, the use of FRT by the RCMP to search through massive repositories of images of Canadians who are not under any suspicion of crime presents a significant violation of privacy and clearly warrants careful consideration against these constraints.
To be effective, such an assessment must be based on a nuanced understanding of the privacy issues at play.
Draft guidance for police agencies
We and fellow privacy regulators have drafted guidance for police agencies to clarify the circumstances and conditions under which FRT use might be appropriate.
We have not yet arrived at final positions on the conditions of use of FRT and we look forward to consulting stakeholders on our recommendations before finalizing them. The draft guidance emphasizes that police agencies must have lawful authority for the proposed use of the technology, and the importance of applying privacy protective standards that are proportionate to the potential harms involved.
The privacy principles of necessity and proportionality ensure that privacy-invasive practices are carried out for a sufficiently important objective, and that they are narrowly tailored so as not to intrude on privacy rights more than is necessary.
In other words, police should not use FRT just because it is thought to be “useful” for law enforcement in general. Police should have a specific reason to use the technology and it should be based on evidence. It is not enough to rely on general public safety objectives to justify the use of such an intrusive technology. The pressing and substantial nature of the specific objective should be demonstrable through evidence.
In some cases, potential harms may be so extreme that no amount of protection will adequately reduce privacy risks. In other cases, it may be possible to appropriately manage the risks associated with FRT through careful planning and the diligent application of privacy protections.
Accuracy, data minimization, accountability and transparency are other critical principles police agencies need to consider before using FRT.
Accuracy has been raised as a global concern given the propensity of FRT to misidentify individuals from certain gender and racial groups. As such, decisions made about individuals should not rely solely on FRT. That means police officers should review matches before any decision to detain, investigate or charge an individual is taken.
Data minimization measures can help reduce the risk of over-broad data collection and the severity of a breach should one occur.
Accountability ensures police agencies know what is being collected, how it is being collected, by whom, for what purposes and the way in which it’s being safeguarded. It ultimately ensures organizations can demonstrate their compliance with legal requirements when asked to do so.
Meanwhile, transparency measures can help ensure those who may be impacted by FRT initiatives are well informed about its use.
Besides privacy laws, the Charter also protects aspects of the right to privacy, such as the right to be secure against unreasonable search and seizure by the state. Certain intrusions on this right, however, can be justified in specific circumstances.
For instance, the Criminal Code provides for warrants that permit intrusion on people’s privacy when a judge is satisfied there are reasonable grounds to believe an offence has been or will be committed, and that evidence of the offence will be obtained through the use of a particular technique or device. Seeking warrants and court authorizations can assist with ensuring that a proposed FRT use meets the proportionality standard.
It also bears mention that private sector actors do not enjoy the same collection authority as police. Commercial FRT vendors often compile their own databases, typically of images taken from the Internet. As noted in our guidance, if police use third-party vendors to supply facial recognition services, they should ensure that these suppliers have the lawful authority to collect and use the personal information contained in their databases, and that they don’t use any data supplied to them by police for other purposes.
Law reform and other considerations
FRT raises other important considerations as the government looks to modernize Canada’s privacy and data protection regime.
Canada’s privacy laws were designed to be technology neutral, which is positive, given the pace of technological change compared to that of legislative modernization.
However, the risks of FRT are such that, due to the inalterable nature of the information involved, specific rules may be warranted. This is already the case for other forms of biometrics collected by law enforcement such as fingerprints and DNA profiles.
To date in Canada, Quebec is the only jurisdiction to enact a law that specifically addresses biometrics, which encompasses FRT. Quebec’s Act to establish a legal framework for information technology requires organizations to notify the Commission d’accès à l’information before implementing a biometrics database. The regulator may then prohibit such a database from coming into service, order changes to the project, or order the destruction of the database. Furthermore, any secondary information revealed by biometric characteristics about an individual cannot be used as a basis for a decision concerning that person.
There are jurisdictions across the U.S. that have gone so far as to ban the use of FRT by police and other public bodies. The European Data Protection Supervisor has also called for an outright ban on the use of FRT in public spaces, calling it a “deep and non-democratic intrusion into individuals’ private lives.”
While the European Commission has stopped short of a ban, it has moved to rein in the technology. In a new regulation proposed in April to address artificial intelligence, the European Commission is seeking to restrict the use of real-time facial recognition in public spaces by law enforcement to cases involving terrorism, serious criminality and targeted searches for crime victims and missing children.
Under the proposal, such use of FRT would be subject to judicial authorization. Among the key no-go zones is using the technology subliminally to manipulate an individual’s behaviour in a way that is likely to be harmful. The threshold in the law is “likely to cause physical or psychological harm”.
It would also designate all other remote biometric identification, including those used by commercial organizations, as “high-risk” and impose a number of obligations such as risk and quality assessments, logging and record-keeping for traceability, human oversight and demonstrable accountability.
We have seen how public-private partnerships and contracting relationships involving digital technologies, such as FRT, can create additional complexities and risks for privacy. Common privacy principles enshrined in both our public and private sector privacy laws would help address gaps in accountability where the sectors interact. It would also help address issues related to interoperability and the misalignment between our federal privacy laws and those of other jurisdictions in Canada and around the world.
To that end, Canada’s federal privacy laws should share a rights-based foundation. They should include provisions on automated decision-making, including a definition, a right to meaningful explanation and human intervention related to its use. They should also clarify that the concept of publicly available personal information does not apply to information where an individual has a reasonable expectation of privacy. This is particularly critical in the case of FRT, which relies on massive databases of images.
Canada’s laws should also strengthen accountability provisions by requiring a demonstration of privacy compliance upon request by the regulator. They should ensure privacy protections are built into any new product or service and that risks are considered and mitigation measures are in place before launch.
Privacy is nothing less than a prerequisite for freedom – the freedom to live and develop independently as persons away from the watchful eye of state bodies or commercial enterprises.
The prospect of police agencies integrating FRT into law enforcement initiatives raises the possibility of serious privacy harms unless appropriate privacy protections are in place.
Canadians must be free to participate voluntarily and actively in the regular, and increasingly digital, day-to-day activities of a modern society. They must be able to navigate public, semi-public, and private spaces without the risk of their activities being routinely identified, tracked and monitored.
While certain intrusions on this right can be justified in specific circumstances, individuals do not forego their right to privacy merely by participating in the world in ways that may reveal their face to others, or that may enable their image to be captured on camera. Privacy is vital to dignity, autonomy, personal growth and the free and open participation of individuals in democratic life. When surveillance increases, individuals can be deterred from exercising these rights and freedoms.
The process of establishing appropriate limits on FRT use remains incomplete. Unlike other forms of biometrics collected by law enforcement, facial recognition is not subject to a clear and comprehensive set of rules.
Its use is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.
The nature of the risks posed by FRT calls for collective reflection on the limits of acceptable use of the technology. The question of where acceptable FRT use begins and ends is in part, then, a question of the expectations we set now for the future protection of privacy in the face of ever-increasing technological capabilities to intrude on Canadians’ reasonable expectations of privacy.
We look forward to engaging with police, lawmakers and other stakeholders on important questions surrounding the use of this technology.
Only through respect for the law and the values we cherish will we be able to safely enjoy the benefits of new technologies, while preserving the freedoms and rights that we proudly count on as Canadians.
Report of findings: Investigation into the RCMP’s collection of personal information from Clearview AI (involving facial recognition technology)
Complaint under the Privacy Act
- Clearview AI (“Clearview”) is a US-based company that created and maintains a large database of images containing faces (along with associated hyperlinks to the location on the Internet where the image was found). Clearview account holders can search this database for matching faces using facial recognition technology (“FRT”). The RCMP confirmed that it purchased two licenses to use Clearview in October 2019, and that RCMP members had also used Clearview via a number of free trial accounts since that time. The Office of the Privacy Commissioner (“OPC”) received a complaint under the Privacy Act (the “Act”) expressing concerns about the use of Clearview by the RCMP.
- In a related matter, on February 21, 2020, the OPC launched a joint investigation with provincial privacy authorities in Quebec, Alberta and British Columbia (“BC”), into Clearview’s collection of facial images in its database and subsequent disclosure to its customers. In that investigation, we found that Clearview’s personal information collection practices contravened the Personal Information Protection and Electronic Documents Act (“PIPEDA”), as well as provincial privacy legislation in Quebec, Alberta, and BC.Footnote 1
- Section 4 of the Privacy Act specifies that “no personal information shall be collected by a government institution unless it relates directly to an operating program or activity of the institution.” In our view, the operating programs and activities of an institution must be limited to activities which fall within the institution’s legal authority to conduct and which respect the general rule of law.
- Following our evidentiary review and legal analyses, we find that since Clearview’s personal information collection practices were not compliant with its legal obligations, the RCMP’s subsequent collection of that information falls outside its legitimate operating programs and activities, thus representing a contravention of Section 4 of the Privacy Act.
- Clearview’s records demonstrate that the RCMP conducted hundreds of searches using Clearview FRT via at least 19 accounts across the country. In light of the significant breadth of these collections of personal information in contravention of the Act, to inform our recommendations for appropriate corrective action, we examined the adequacy of the RCMP’s controls to ensure it complies with Section 4 of the Act when it collects personal information in novel ways and from new sources.
- We found that the RCMP failed to properly assess the potential Privacy Act compliance risks that the use of Clearview’s massive database and facial recognition technology clearly presented. Further, it did not have systems in place to track, identify, assess, and control such novel collection of personal information. We therefore recommended that, within 12 months, the RCMP institute systemic measures and pertinent training to understand, track, identify, assess, and control the novel collection of personal information to ensure collection is limited as required by the Act. These recommendations are not limited to the matter at hand, but apply to any new technology involving the collection or use of personal information.
- While the RCMP disagreed with our findings that it contravened the Act, it nonetheless agreed to implement our recommendations and we therefore find the matter well-founded and conditionally resolved. Implementing the recommendations will require broad and concerted efforts across the RCMP. The RCMP has taken certain preliminary remedial steps already, such as creating a National Technology Onboarding Program unit. However, much work remains to be done to ensure adoption of changes in decision-making culture across the RCMP – supported by well-embedded processes, tools and training. We strongly encourage the RCMP to dedicate the sustained resources and senior-level championing necessary for successful implementation of its commitment to the recommendations.
- The RCMP is the Canadian national police service and a partner agency of Public Safety Canada. It provides all federal policing services in Canada and policing services under contract to the three territories, eight of the ten provinces (i.e., all except Ontario and Quebec), more than 150 municipalities, more than 600 Indigenous communities and three international airports. The RCMP has duties to enforce the law, prevent crime, and protect life. To meet these duties it conducts investigations, which entail the collection of information and the identification of victims, offenders, or crime scenes.Footnote 2
- The Complainant, Charlie Angus, Member of Parliament for Timmins-James Bay, expressed a range of concerns about the use of Clearview by the RCMP, which had not, at the time of his complaint in late January 2020, confirmed that it had used Clearview. Specifically, the complainant wrote:Footnote 3
In Canada, there have been important Indigenous-led protests against mega-resource projects, and there is a history of distrust between those participating in social protest and the RCMP. Could technology such as Clearview AI be used to identify protesters and create profiles of civic dissent? Given the power of these technologies, we have to be vigilant. At this time, Canadian law enforcement agencies and Clearview AI remain quite tight-lipped about the extent of their collaboration. It is imperative that we understand more about how this technology may be used on Canadians, and for what ends.
Any use of such technology must be carried out under clear judicial oversight. However, this technology is being tested and implemented in a legislative and judicial vacuum. To this end, I ask you to launch an inquiry as to whether the RCMP or other police forces are using services such as Clearview AI. I am asking you to provide recommendations on the permissibility, limits and scope of use of facial recognition by law enforcement agencies.
- Following receipt of the complaint and in light of media reports about law enforcement use of Clearview, we first asked the RCMP, on January 29, 2020, if it was using Clearview FRT. At that time, the RCMP inaccurately informed the OPC that the RCMP had not used Clearview FRT.Footnote 4 It also committed to conduct Privacy Impact Assessments (“PIAs”) prior to deploying facial recognition technology. However, after Clearview’s client list, which included the RCMP, was reported stolen in February 2020, the RCMP then disclosed publicly, and to the OPC, that it had in fact been using personal information collected from Clearview in investigations. In response to the OPC’s recommendation that the RCMP discontinue the use of Clearview FRT during the course of the investigation, the RCMP further stated its intention to continue to use Clearview FRT within its National Child Exploitation Crime Centre (“NCECC”) and in other exigent circumstances.Footnote 5
- Moreover, despite its commitment above to the OPC to conduct PIAs prior to deploying facial recognition technology, in June 2020, eight months after commencing the collection of personal information from Clearview, the RCMP had only completed a PIA checklist – a Treasury Board Secretariat (“TBS”) tool used to determine if a PIA is required under TBS directives.Footnote 6
- The RCMP had begun using Clearview FRT in October 2019. In response to our related joint investigation of Clearview’s practices, Clearview announced, on July 3, 2020 that it had ceased commercial activities related to its facial recognition tool and discontinued contracts in Canada, including those with the RCMP.
- In the course of using paid and trial accounts, the RCMP uploaded images of individuals to Clearview, which then displayed a number of matching images using Clearview FRT. Along with each matched image, Clearview provided an associated hyperlink to the webpage from which the image had been collected.
- According to the RCMP, Clearview accounts were created in five RCMP Divisions: National Headquarters (“NHQ”), BC, Alberta, Manitoba, and New Brunswick, including two paid NCECC licenses and 17 unpaid trial licenses.
- The RCMP initially indicated that it was using Clearview FRT primarily for identifying, locating and rescuing children who had been, or are, victims of online sexual abuse, and that it was “aware of limited use of Clearview AI on a trial basis by a few units in the RCMP to determine its utility to enhance criminal investigations.”Footnote 7
- In response to our question about how many search queries it made to Clearview, the RCMP told us, in May 2020, that the total number of uses by the RCMP was 78. It said 19 were for victim identification (by NCECC), 14 were to attempt to locate an identified suspect evading law enforcement, and 45 were for testing using RCMP members’ own images, images of other consenting individuals, or inanimate objects. It qualified these numbers by saying that queries were not tracked unless the application was used to support an investigation; any of its responses requiring estimates relied on the individuals who used the application to estimate “to the best of their knowledge”.
- We later learned from Clearview records that it had recorded 521 searches from RCMP accounts, including 33 searches from accounts labeled “Royal Canadian Mounted Police (Victim ID)” and 488 searches from accounts labeled “Royal Canadian Mounted Police.” When we asked the RCMP about the significant gap between the figures it provided and those from Clearview, it told us that some of the ‘uses’ it originally reported to the OPC included multiple searches of the same individual. It explained that, for instance, Victim Identification specialists used images of children at different ages or with different lighting, angles or focus. It also stated that prior to July 6, when it lost access to Clearview services, it was able to track individual searches in Clearview, but that it could no longer do so.Footnote 8 We note, however, that the RCMP’s initial response (see paragraph 16) was provided to the OPC prior to July 6th, 2020, when the RCMP still had full access to Clearview, and that these additional searches had not been factored into its response at that time.
- In our view, while it is conceivable that the 521 searches recorded by Clearview could be linked to the 78 “uses” the RCMP reported, this does not constitute an accounting for those searches. Relying, therefore, on the search figures available to us (from Clearview records), we note that only approximately 6% appear to be linked to NCECC victim identification, and approximately 85% are not accounted for at all by the RCMP. In this context, the purposes for which the involved RCMP staff conducted these searches remain unknown.
- The images in question constitute “personal information” as defined in Section 3 of the Act, in that images of faces constitute information about an “identifiable individual.” In fact, facial images are considered highly sensitive biometric information given that they are a unique and permanent characteristic of our body, largely stable over time, and a key to our identity. Additional personal information is collected via association with the hyperlinks included with images.
- The methodological process for using Clearview FRT begins when the RCMP uploads an image to Clearview. The image is analyzed by Clearview’s software to mathematically map facial features. The result is then used to search Clearview’s database for similar faces, with the analytical breakdown of facial features being how faces are “recognized”. Clearview displays likely matches, in the form of images, to the RCMP. This constitutes a collection of information by the RCMP from Clearview. Each picture includes a hyperlink to the internet address from which it was scraped. This allows the RCMP to collect additional contextual information from the internet if such information is still accessible. The hyperlink itself can contain personal information in some cases, depending on the web address.
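The matching process described above is typical of facial recognition systems generally: a face image is reduced to a numeric “embedding” vector, and database entries are ranked by their similarity to the probe’s embedding. The following sketch illustrates that general pattern only; Clearview’s actual software is proprietary, and the vectors, threshold, and URLs below are hypothetical.

```python
# Illustrative sketch of embedding-based face matching, NOT Clearview's
# actual implementation. Each scraped image is assumed to be stored as an
# embedding vector keyed by its source hyperlink, mirroring the report's
# description of matches returned alongside hyperlinks.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(probe, database, threshold=0.9):
    """Return (url, score) pairs for entries at least `threshold`-similar
    to the probe embedding, highest score first."""
    scored = [(url, cosine_similarity(probe, emb))
              for url, emb in database.items()]
    return sorted((m for m in scored if m[1] >= threshold),
                  key=lambda m: m[1], reverse=True)

# Hypothetical database and probe (values chosen for illustration only).
db = {
    "https://example.org/photo1.jpg": [0.9, 0.1, 0.4],
    "https://example.org/photo2.jpg": [0.1, 0.9, 0.2],
}
probe_embedding = [0.88, 0.12, 0.41]
results = search(probe_embedding, db)  # likely matches, with source links
```

In a real system the embeddings would be produced by a trained neural network rather than supplied by hand; the key point for the report’s analysis is that each returned match carries the hyperlink from which the image was collected, enabling further collection of contextual personal information.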
Issue: The collection from Clearview was not directly related to an operating program or activity
- Section 4 of the Act requires federal institutions to restrict their collection of personal information to information directly related to an operating program or activity. In order to determine if a collection meets this test, as a first step, the scope or nature of the “operating program or activity” must be clearly defined.
- In our view, the scope or nature of an operating program or activity under Section 4 of the Act must be limited to lawful activities of the institution and exclude those that do not respect the general rule of law. At the conclusion of the analysis below, our investigation found that the collection of personal information from Clearview was not within the legitimate scope of the RCMP’s operating programs and activities as set out in Section 4 of the Act.
- As a law enforcement agency carrying out police duties, the RCMP has broad authority to collect personal information, often without the knowledge or consent of the individual involved. These police duties range from investigating serious criminality to keeping the peace. The RCMP indicated that it “has the authority under the common law, as well as under statutory law (Section 18 of the RCMP ActFootnote 9 and 14(1)(a) of the RCMP RegulationsFootnote 10) to collect, use and disclose relevant information for criminal investigative purposes, as well as to preserve the peace and protect life.” It takes the view that this would include the collection of personal information from Clearview in the course of a criminal investigation.
- With respect to the RCMP’s position that the collection of personal information from Clearview is consistent with its common law powers, the RCMP explained:
Under Common Law, police powers are recognized as deriving from the nature and scope of police duties, including, “the preservation of the peace, the prevention of crime, and the protection of life and property.”Footnote 11
- Section 18 of the RCMP Act and paragraph 14(1)(a) of the Regulations read as follows:
RCMP Act – Duties
18. It is the duty of members who are peace officers, subject to the orders of the Commissioner,
(a) to perform all duties that are assigned to peace officers in relation to the preservation of the peace, the prevention of crime and of offences against the laws of Canada and the laws in force in any province in which they may be employed, and the apprehension of criminals and offenders and others who may be lawfully taken into custody […].
RCMP Regulations - Duties
14. (1) In addition to the duties set out in the Act, it is the duty of members who are peace officers to:
(a) enforce all Acts of Parliament and regulations and render assistance to departments of the Government of Canada as the Minister directs […].
- There is no debate that in a broad sense, the investigation of crimes falls under the RCMP’s common law powers and the authorities found in Section 18 of the RCMP Act and paragraph 14(1)(a) of the Regulations and generally constitutes a “program or activity” of the RCMP. However, even in the context of investigations, there are legal limits on the authority of the RCMP. It cannot carry out activities that are illegal or contrary to the general rule of law. To be compliant with Section 4 of the Act, a “program or activity” must be carried out in a way that falls within the institution’s legal authority and which respects the general rule of law.
- In relation to this case, we are of the view that Section 4 of the Act cannot be read to permit the collection of personal information from a third party agent that collected, used, or disclosed the information in contravention of a law that third party is subject to.Footnote 12 It necessarily follows that the RCMP’s collection of personal information through contracts with a private company that itself collected the personal information unlawfully would be a breach of Section 4 of the Act. To find otherwise would be to permit government institutions to advance their mandates while rewarding organizations whose personal information collection practices are unlawful, including non-compliance with Canadian privacy laws.
- The RCMP failed to take any active steps to verify the legality of Clearview’s collection of the information. Asked about any relevant legal constraints on its programs or activities regarding the collections from Clearview, and about its compliance with privacy laws, the RCMP represented in writing that it “relied upon the assertions from Clearview AI that their images were all from publicly available information.”
- However, under PIPEDA and provincial privacy laws, private sector organizations must obtain consent from individuals for the collection of their personal information unless certain specific conditions are met (as described in the laws and specific regulations defining “publicly available” information). We could find no evidence that Clearview obtained such consent, despite collecting information from sources that do not qualify as ‘publicly available’ under PIPEDA regulations and requirements in applicable laws in Quebec, Alberta, and BC.Footnote 13 In our related joint-investigation of Clearview, Clearview confirmed that it did not seek the consent of individuals for the collection of their images, as is reflected in our conclusion that Clearview contravened PIPEDA and provincial privacy laws in Quebec, Alberta and BC.
- We are concerned by the willingness of the RCMP to abdicate its responsibility to respect the privacy rights of Canadians in favour of accepting the general assertions of a private company without any attempt at validation. In response to a preliminary version of this report of findings, the RCMP argued that Section 4 of the Act does not expressly impose a duty to conclusively confirm the collection authority of a non-governmental third party. It therefore asserted that it was within its authority to collect information from Clearview under Section 4 of the Act. It further argued that requiring the RCMP to ensure a third party's legal compliance would create an unreasonable obligation on the RCMP, as it has legal expertise neither in PIPEDA nor in the scope of a third party's legal obligations. It acknowledged that “apparent unlawfulness” may affect the RCMP’s ability to collect personal information, but contended that this was not the case in this instance.
- We are disappointed that the RCMP, as a state actor with coercive powers, would choose to rely on commercial players to fulfill its mandate to Canadians without accepting the responsibility of selecting partners that comply with the law. To be clear, the RCMP is obligated to inform itself of the lawfulness of the collection practices of partners from whom it collects personal information.
- Based on the foregoing, it is our view that the RCMP’s collection of personal information from Clearview, that Clearview had collected unlawfully, fell outside the scope of any “operating program or activity” of the RCMP and was therefore in contravention of Section 4 of the Act.
Issue: Appropriate corrective action is required by RCMP to develop controls to prevent future similar contraventions
- As noted above (paragraph 10), the RCMP did not conduct an assessment for compliance with Section 4 of the Act before beginning to collect the information of Canadians from Clearview. In addition, it permitted the creation and use by RCMP members of 16 accounts for which it did not provide a reasonable accounting of purpose or use (paragraph 18).
- To our knowledge, the RCMP no longer collects personal information from Clearview since Clearview withdrew its services from Canada in July 2020. However, given the widespread nature of the RCMP’s collection of personal information from Clearview before that time, which included uses it could not account for, we examined the adequacy of the controls that the RCMP had, and now has, in place to ensure compliance with Section 4 of the Act. This analysis informs our remedial recommendations in this matter.
- For the reasons described in detail below, we are concerned that without systemic changes and improved training, similar potential contraventions of the Act will continue to occur in the future with facial recognition and other technologies or strategies involving the collection of personal information.
- We would expectFootnote 14 that an institutional program collecting personal information that could present a high risk to individuals’ privacy would have in place robust structures, informed by appropriate expertise, to ensure compliance with Section 4 of the Act with respect to the personal information it collects; particularly when it considers any novel collection of personal information.Footnote 15 Novel collection includes: (i) collecting new types of personal information; (ii) collecting personal information for new purposes; (iii) collecting personal information in new ways (e.g. through biometric or other new technology); and (iv) collecting personal information from new sources.Footnote 16
- Our expectations are supported by the TBS’s Policy on Privacy Protection and Directive on Privacy Impact Assessment, to which the RCMP is also subject. The Directive specifies that:
- “The PIA is the component of risk management that focuses on ensuring compliance with the Privacy Act requirements and assessing the privacy implications of new or substantially modified programs and activities involving personal information.” The definition of substantial modification expressly includes “any change or amendment to the privacy practices related to activities that use automated or technological means to identify, create, analyze, compare, extract, cull, match or define personal information.”
- Institutions must establish a PIA development and approval process that, among other things, “is commensurate with the level of risk related to the privacy invasiveness of the institution’s programs or activities, and ensures the PIA is completed by the senior official or executive holding responsibility within the institution for new or substantially modified programs or activities.”
- For all personal information collected for use in administrative decisions about an individual, [such as law enforcement decisions], PIAs are also to be completed, among other situations, when contracting out activities to the private sector results in substantial modifications to the program or activities.
- Core PIAs must be provided to the Office of the Privacy Commissioner, and, “under the TBS Policy on Privacy Protection, heads of government institutions are required to notify the Privacy Commissioner of any planned initiatives (legislation, regulations, policies, programs) that could relate to the Privacy Act or to any of its provisions or that could have an impact on the privacy of Canadians. This notification is to take place at a sufficiently early stage to permit the Commissioner to review and discuss the issues involved” [emphasis added].
- All institutions are required by TBS Policy to conduct PIAs for new or substantially modified programs and activities involving personal information.Footnote 17 All institutions should also ensure that their decision makers understand and act on their obligations to prevent contraventions of the Act. However, we recognize that not all institutional programs present the same level of risk. Where the potential for privacy invasive collections is higher, we expect the related measures to ensure these risks are properly assessed to be more rigorous.Footnote 18 At a minimum, therefore, we would expect an institution with programs collecting personal information that could present a high risk to individuals’ privacy to have in place all of the following:
- Knowledge of obligations: Training programs to ensure all individuals empowered to make decisions about the collection of personal information understand the limitations on collection under Section 4 of the Act.
- Awareness of novel collections: Systems and procedures to track potential and actual novel collection of personal information.
- Processes to identify potential compliance issues: Procedures, including checkpoints within processes where novel collection may become known, to alert decision-makers that an assessment to ensure compliance with Section 4 of the Act may be warranted.
- Processes to complete timely assessments where warranted: Systems, procedures, and training on roles and responsibilities to ensure that, where a full assessment for compliance is warranted, such assessments are completed in a timely way, before collection begins.
- Effective controls on collection: Effective controls, including dynamic monitoring, to limit the collection of personal information by staff at an institution to what it has validated as permissible under the Act.
A) Knowledge of obligations
- Knowledge by decision-makers of the relevant limits on collection of personal information, specific to a particular operating program or activity, is critical to compliance with the Act. This should be supported by training programs and access to expertise, including legal services and privacy subject matter experts, to ensure all individuals empowered to make decisions about the collection of personal information have a meaningful understanding of the limitations on collection under Section 4 of the Act.
- The RCMP defended the actions of its decision-makers (those who collected personal information from Clearview). Specifically, it stated, in response to our questions about why it considered the collections lawful, that “The RCMP relied upon the assertions from Clearview AI that their images were all from publicly available information. It was reasonable to rely upon their assertions. There was no requirement for the RCMP to investigate further into the constitution of their database.”Footnote 19
- As noted above (paragraph 32), our office found that Clearview collected personal information unlawfully, and therefore the collection by the RCMP of this information from Clearview is a contravention of the Act.Footnote 20 However, this is not the only potential issue with respect to compliance with the Act that the RCMP had an obligation to consider.
- Under the Charter,Footnote 21 a collection of personal information by law enforcement constitutes a search or seizure where an individual has a reasonable expectation of privacy with respect to the information at issue. The Supreme Court of Canada has also found that individuals may still possess a reasonable expectation of privacy even in public places.Footnote 22 The Courts may consider the invasiveness of a technology when determining whether an individual had a reasonable expectation of privacy,Footnote 23 and they have recognized informational privacy in relation to anonymity in Internet activity.Footnote 24 While we are not making any conclusions as to the RCMP’s compliance with the Charter in using Clearview technology, in our view, it should have been clear to the RCMP that both the collection from a privately-collected database, and the collection of information via facial recognition technology, warranted assessment from the RCMP for compliance with the Charter and common law principles.
- Government institutions require lawful authority to collect personal information. The RCMP represented that its collection of personal information from Clearview was authorized under the RCMP Act and under common law. However, the common law clearly sets limits on the RCMP’s collection powers as a police body. Among other constraints, there are the limitations set out by the courts in WaterfieldFootnote 25 and affirmed by the Supreme Court of Canada in StenningFootnote 26 and, more recently, in Fleming.Footnote 27 Specifically, to determine whether a police action (such as a search) is authorized under common law, the following two factors must be considered:
- whether such conduct falls within the general scope of any law enforcement duty imposed by statute or recognized at common law; and
- whether such conduct constitutes a justifiable exercise of police powers associated with that duty.
- The second stage of the test assesses whether the police action is reasonably necessary for the fulfillment of the duty, which involves consideration of three factors: (1) the importance of the performance of the duty to the public good, (2) the necessity of the interference with individual liberty for the performance of the duty, and (3) the extent of the interference with an individual’s liberty and privacy.Footnote 28
- With respect to the first element of the test, as noted above (paragraph 18), the RCMP did not provide an accounting for 85% of the searches it conducted through its Clearview accounts. As a consequence, it appears the RCMP is not able to demonstrate that an evaluation against the first element of the Waterfield test was carried out. The RCMP noted that prior to its directive on the use of Clearview, issued after the OPC commenced its investigation, usage of Clearview outside an investigation would not have been tracked as there was no established requirement to do so. This represents a significant failure in accountability by the RCMP.
- With respect to the second element of the test, we note that the use of facial recognition technology, with its power to disrupt anonymity in public spaces, can constitute a meaningful interference with liberty and privacy. Further, the RCMP’s use of a service such as Clearview, that is based on the systematic extraction and processing of billions of images of individuals innocent of any crime, is a major and substantial intrusion by the state into the private lives of Canadians.
- Before using such a service, a police body must, at a minimum, examine whether such a service is reasonably necessary to the investigation and consider the proportionality of the intrusion against the specific public interest being pursued.Footnote 29
- In this context, the fact that multiple decision makers within the RCMP did not feel it necessary to consider what limits on the use of such technology might be required under common law suggests a lack of meaningful understanding of the RCMP’s relevant privacy obligations.
- We recommended, and the RCMP has agreed, to engage in dialogue with the OPC and other privacy regulators on the privacy issues surrounding the use of facial recognition technology.
- We also recommended, and the RCMP has agreed, to institute a training program commencing no later than 12 months from the receipt of this report to ensure that all decision-makers are trained on the limitations on collection of personal information under the Act, including:
- When contracting for personal information collection services, not using service providers that are collecting personal information in violation of Canadian privacy laws.
- Nuanced and meaningful understanding of the limitations associated with the collection of personal information, particularly when using new privacy-invasive technology, including mass surveillance tools.
- As noted above (paragraph 30), the RCMP disagreed with our finding that it is obligated to inform itself of the lawfulness of the collection practices of partners from whom it collects personal information. We appreciate that, despite the RCMP’s disagreement with the finding, it acknowledges that deficiencies exist in its current practices. To that end, the RCMP has agreed to implement the recommendations above, including specifically that it will conduct full assessments of third parties’ compliance with the Canadian privacy laws to which those third parties are subject.
B) Awareness of novel collections
- Awareness as to what novel personal information collection techniques are taking place (or being considered/tested) is a critical foundation to turning a theoretical understanding of the obligations under Section 4 of the Act into action. This is necessary to ensure collection is appropriately limited, and to be able to provide accurate assurances to external stakeholders in support of the public trust.
- The facts of this case clearly demonstrate significant gaps in the RCMP’s systems for ensuring its own awareness of the novel personal information collection practices it is undertaking. As noted above (paragraph 10), the RCMP inaccurately informed the OPC on January 29, 2020 that the RCMP had not used Clearview in an investigative context, despite the fact that it had purchased licenses for Clearview in October 2019, and had since made extensive use of Clearview’s services. According to the RCMP, this erroneous declaration occurred because none of the internal experts in the RCMP’s Technical Investigative Services who were consulted by members of the RCMP’s Access to Information and Privacy Unit were aware of its use, despite the fact that Clearview was used broadly across five different RCMP Divisions.
- Only after Clearview’s client list, which included the RCMP, was reported stolen in February 2020 did the RCMP correct the erroneous statement to the OPC.
- Further, as already noted, the RCMP only provided a reasonable accounting for approximately 15% of the more than 500 searches that, according to Clearview’s records, the RCMP made. The purposes for the remaining searches are unknown.Footnote 30
- The RCMP did not have any system in place, either locally or nationally, to track the consideration of, or actual use of, new investigative technologies or other novel personal information collection.
- We recommended, and the RCMP has agreed, that no later than 12 months from the receipt of this report, it will institute systems and procedures to ensure that all novel collections of personal information across the RCMP are reliably tracked internally in such a way that they can meaningfully inform, and be meaningfully informed by, the RCMP’s decision-making on such collections.
- With respect to the above commitment, as a preliminary step, the RCMP launched a National Technology Onboarding Program unit in March 2021. The program is intended to create a framework to implement a centralized system to enable the RCMP to identify, assess and track new and emerging investigative techniques that involve the collection of novel types of information for investigational purposes.
C) Processes to identify potential compliance issues
- The identification of actual or potential collections that warrant an assessment for compliance with the Act before they are undertaken is crucial. We would expect these checks to be commensurate to the potential risks to individuals’ privacy, based on the volume, sensitivity and complexity of the relevant personal information collection activities. We would expect the checks to be embedded in appropriate processes depending on the activities of the institution. In this way, decision-makers empowered to make potentially high-risk novel personal information collection decisions can demonstrate how they have accounted for compliance with the Act.
- In addition to being embedded into processes for the development and approval of new programs, we would expect such checks to be embedded in other processes where new types of personal information, new purposes for collection, new ways of collecting information, or new sources of personal information could arise. For instance, as a few illustrative examples:
- processes for entering into information sharing arrangements;
- procurement processes;
- processes governing pilots and the use of trial services; and
- processes for vetting new technology.
- As a key example of these gaps, the RCMP indicated that references to considering the need for PIAs appear only in its IM/IT manuals. PIAs (or alternative measures to assess compliance with the Act) are necessary outside of IM/IT processes. Any branch and level of the RCMP that can commence the novel collection of personal information must have procedures in place to initiate an assessment of compliance with the Act where warranted.
- The RCMP indicated that RCMP members are empowered to use their discretion to try new personal information collection techniques with appropriate approvals at the local level. The decision of whether or not to conduct a PIA is left to supervisory staff to determine. In this case, the RCMP opened a total of 19 paid and trial accounts to collect information from a materially new private sector source, including two paid contracts, all without triggering an assessment for compliance with Section 4 of the Act. In addition, the RCMP’s collection via a new privacy-invasive technology (facial recognition), which is clearly a substantial modification to the collection of personal information that could affect compliance, did not trigger an internal assessment for compliance with the Act before use.
- The RCMP’s representations indicated that it was only on February 6, 2020, more than four months after contracting with Clearview, that it started to consider conducting an assessment for compliance with the Act for the collection of information from Clearview. Its records further indicate that it only began filling in a PIA checklist (a TBS tool used to determine if a PIA is warrantedFootnote 31) in March 2020. This preliminary issue identification step was not completed until June 2020. As demonstrated in this case, adopting an ad hoc, minimally supervised and ex post approach raises the likelihood of privacy contraventions and results in otherwise preventable damage.
- The OPC is of the view that with a proper vetting process, the contravention at issue could have been prevented. This could have been done if the RCMP had identified the potential compliance issues related to the use of Clearview AI in a timely way in order to begin an appropriate assessment for compliance with the Act, informed by appropriate consultation with subject matter experts. Had the RCMP notified the OPC, as required by the TBS Policy on Privacy Protection (see paragraph 37), then early discussion of the privacy implications could have taken place. It is a serious concern that the RCMP failed to identify the need for a compliance assessment (in the form of a PIA or otherwise) prior to collecting information via any of its 19 accounts.
- We recommended, and the RCMP has agreed, that no later than 12 months from the receipt of this report, it will institute a system of checks where potentially high risk novel collection may become known, to alert decision-makers that a compliance assessment may be needed. Decision-makers can then demonstrate that they have considered compliance with the requirements of Section 4 of the Act before they begin collecting personal information, and can initiate full assessments where needed.
D) Processes to complete timely assessments
- Where a potential compliance issue is identified through the checks described above (i.e., where compliance with the Act may be at issue), assessments for compliance with the Act should be commenced and completed before collection starts. This reduces the risk of unlawful collection, and prevents associated damage to individuals who could otherwise be affected by such collection of their personal information. These assessments should be informed by appropriate privacy and legal subject matter expertise, and where warranted, the OPC.
- While, as described above, the RCMP’s internal processes failed to proactively identify the potential privacy concerns in this instance, it should nonetheless have been apparent to the RCMP that a full assessment of the collections from Clearview was warranted after the issue was raised prominently in the media in January 2020. Indeed, the RCMP expressly committed to the OPC in January (when it indicated it was not using Clearview) that it would conduct a PIA before actively deploying such technology.
- Despite both the clear need and this statement to the OPC, the RCMP provided no evidence that it had commenced an actual Privacy Act compliance assessment (beyond the issue identification checklist referenced above), before Clearview AI withdrew its facial recognition services from Canada in July 2020.
- The RCMP indicated that it does not have training material for staff on conducting PIAs, despite the significant amount of sensitive personal information it collects as a law enforcement agency.
- Of further concern is that none of the RCMP’s decision-makers conducted such an assessment before beginning collection. This is despite the RCMP’s previous assertions to our office that it recognized the potential privacy impact of Clearview’s services and was committed to doing a PIA before its use. It is our view that this represents a critical failure by the RCMP to ensure it met its obligations under the Act.
- We recommended, and the RCMP has agreed, to institute a training program for all decision-makers (i.e., anyone empowered to make decisions to collect novel personal information) on their roles and responsibilities in identifying, assessing, and avoiding collection in contravention of the Act, commencing no later than 12 months from the receipt of this report.
- We also recommended, and the RCMP has agreed, that no later than 12 months from the receipt of this report, it shall demonstrate that it has dedicated the resources and put in place the processes to ensure that assessments for compliance with the Act are carried out every time they are warranted and are applied consistently by all parts of the RCMP. These resources and processes should ensure assessments are completed in a timely way, informed by appropriate subject matter expertise commensurate to the issues identified, before personal information is collected for any law enforcement purposes.
- As a preliminary step towards satisfying these commitments, as noted above, the RCMP launched a National Technology Onboarding Program unit in March 2021.
E) Effective controls on collection
- When the RCMP first created paid Clearview accounts in October 2019, it did not have any specific policies in place limiting the purposes for which this information could be collected, despite the clear privacy implications described above. After the expression of public concern and the announcement of our investigation, the RCMP did issue internal direction to members, on March 5, 2020, imposing limitations on collection using Clearview. Specifically, the directive provided information and instruction to RCMP members including:
Given the speed at which technology is evolving, the RCMP continues to explore the broader use of emerging technologies to determine how they could potentially benefit police operations. While leveraging new technology can enhance our ability to conduct investigations more efficiently and effectively, we need to balance this against an individual’s right to privacy.
[…] The RCMP and other law enforcement agencies are continuously trying to identify, test and potentially acquire new and innovative technologies to further criminal investigations within the scope of their authorities. Discovery, successful testing/piloting and adoption of a new technology requires the development of operational policies to ensure governance and accountability on their use, as well as their associated data acquisition, use and storage. These policies are developed in consideration of the Charter, the Privacy Act and other relevant legislation, regulations and policies.
Our review of the continued use of this technology and particularly Clearview AI is ongoing. In the interim, given the sensitivities surrounding facial recognition technology, we will only be using it in very limited and specific circumstances.
Going forward, Divisions are asked to carefully scrutinize the use of facial recognition technologies, including Clearview AI, and only use it in exigent circumstances for victim identification in child sexual exploitation investigations or in circumstances where threat to life or grievous bodily harm may be imminent. Divisions should ensure that the Criminal Operations Officers (CROPS) approve any requests to use facial recognition technology. In the case of NHQ, any use is to be approved by the Director General of the requesting Program. You are further asked to advise [the Assistant Commissioner of Technical Operations, SPS] of any use of facial recognition technology. This will provide national oversight and ensure that the organization is informed on an ongoing basis of the limited use of this technology.
- During the course of our investigation, the RCMP also centralized the management of the Clearview accounts and created an audit function to monitor RCMP members’ use of Clearview.
- We appreciate these positive indicators that the RCMP recognized, after concerns were identified externally, that the situations in which it may lawfully collect information from the internet via facial recognition technology are constrained by the common law (and consequently, by Section 4 of the Act).
- However (as noted in paragraph 18 above), only 6% of the RCMP’s searches using Clearview appeared to be related to victim identification, and the RCMP could account for only an additional approximately 9% of searches. It could not account for the vast majority (85%) of the searches it conducted according to Clearview records. We therefore cannot say with any confidence that the searches were limited to the purposes identified above, or even that they were for professional purposes; indeed, the absence of records documenting a professional purpose suggests the opposite.
- Further, the RCMP did not indicate that any of the members who created and collected information through the Clearview trial accounts contravened any internal policies or procedures in doing so. This lack of control over a significant novel collection of personal information is a serious concern. We acknowledge that, in order to operate efficiently and effectively, the RCMP requires its staff to take initiative and innovate. However, measures must be in place to ensure that such innovation does not cross legal lines.
- We recommended, and the RCMP has agreed, to institute clear controls, in effect no later than 12 months from receipt of this report, including: (i) policies to clarify who can make decisions to undertake novel collections of personal information, and what steps staff must take to determine whether a collection they are considering is permissible; and (ii) systems to monitor for unauthorized collections, including collections for inappropriate purposes.
- We also recommended, and the RCMP has agreed, to establish a clear methodology and chain of responsibility for members to suggest novel collection techniques to trained decision makers, commencing no later than 12 months from the receipt of this report.
- Given that the collection of personal information from Clearview by the RCMP was not consistent with Section 4 of the Act, and that the RCMP is no longer collecting information from Clearview, we did not examine in detail the RCMP’s ancillary procedures relating to its use.
- However, with a view to informing future practices by the RCMP, we provide certain observations relating to other sections of the Act that would be applicable to any future similar activities by the RCMP.
Protection against unintended use or disclosure
- Section 8 of the Act requires that an institution not disclose personal information except for the purpose for which it was collected or a consistent use. This would apply, for instance, to images collected by the RCMP and uploaded to Clearview for facial recognition matching purposes.
- To help institutions respect these limitations, including when contracting for services (as was the case here), the TBS Directive on Privacy Practices requires that when personal information is disclosed to a private sector entity, the contract must outline measures and provisions to address privacy issues. These include provisions to limit inappropriate handling, use or disclosure by the contracted entity, to protect personal information from unintended disclosure to third parties, and to ensure that government security standards are respected.
- Such measures are demonstrably important. We note in this regard that Clearview suffered a data breach in February 2020, though we have no indications that this breach affected images uploaded to Clearview by the RCMP.
- According to its representations, the RCMP took certain measures to protect personal information disclosed to Clearview from subsequent inappropriate use or disclosure:
- For the small subset of Clearview searches that the RCMP could account for, it had procedures in place requiring that photos be cropped to just the face being searched before uploading, so as to avoid disclosing irrelevant information and limit disclosure risk.
- It indicated that Clearview confirmed to the RCMP that data transmissions are encrypted, and that uploaded images are not added to Clearview’s database.
- It requested that Clearview reduce the retention period for uploaded images to 45 days, and subsequently retain only a low-resolution thumbnail for 6 months.
- Subsection 6(2) of the Act specifies that “A government institution shall take all reasonable steps to ensure that personal information that is used for an administrative purpose by the institution is as accurate, up-to-date and complete as possible.”
- The RCMP submitted that, with respect to the accuracy of the results obtained by Clearview and used by the RCMP, it directed its members via the NCECC Standard Operating Procedure to treat all information as leads, not confirmed identity matches. The decision whether to pursue further steps was based on the members’ assessment.
- With respect to the use of facial recognition technology in general, we believe that it is important to recognize that concerns exist with respect to the accuracy and algorithmic bias in facial recognition technologies. This includes the potential for ‘false positive’ identifications that could affect law enforcement decisions taken about individuals, such as whether to make further enquiries about them. Such actions can in turn have a significant impact on individuals’ privacy.
- For example, there is extensive scientific literature detailing the inaccuracy of anthropometric dataFootnote 32 for the purposes of racial or ancestral determination in the field of forensic anthropology at the level of skeletal analysis, particularly the skull and face, and the use of external soft tissue markers used for forensic facial approximation (reconstruction) appears to be even less accurate. According to Ubelaker et al., “Techniques of facial approximation are improving with enhanced information regarding the relationship of facial hard and soft tissues and more sophisticated computer technology. Despite these advancements, facial approximation does not represent a method of positive scientific identification.”Footnote 33 We believe that this is important to recognize because facial recognition technology is based on anthropometric data that makes certain assumptions about common facial features of various ethnic/racial groups.
- A recent study by the US National Institute of Standards and Technology (“NIST”) as part of its “Face Recognition Vendor Test”Footnote 34 series focused on demographic differentials for contemporary face recognition algorithms (NISTIR 8280).Footnote 35 In summary, the NIST study made the following findings:
- For one-to-one matching, the team saw higher rates of false positives for Asian and African American faces relative to images of Caucasians.
- Among U.S.-developed algorithms, there were similar high rates of false positives in one-to-one matching for Asians, African Americans and native groups.
- However, a notable exception was for some algorithms developed in Asian countries.
- For one-to-many matching, the team saw higher rates of false positives for African American females.
- However, not all algorithms give this high rate of false positives across demographics in one-to-many matching.Footnote 36
- To be compliant with Section 6 of the Act, to the extent that the RCMP considers future use of specific facial recognition technologies, it will be important for it to carefully assess measures that may be needed to address such accuracy-related concerns, particularly the potential for false positives. We look forward to further engaging with the RCMP on this matter in planned dialogue with the OPC and other privacy regulators in the context of our development of guidance on the use of facial recognition technology by law enforcement.
- We find that the RCMP’s collection of personal information from Clearview was in contravention of Section 4 of the Act. The basis for this finding is that Clearview’s collection of personal information of Canadians was in contravention of the law. It therefore follows that the RCMP contravened the Act when it subsequently collected personal information that Clearview had unlawfully gathered.
- There were serious and systemic failings by the RCMP to ensure compliance with the Act before it collected information from Clearview and, more broadly, before undertaking novel collections of personal information. These include widespread failures to know what it was collecting, to control how collection occurred, to identify potential compliance issues, and to assess and prevent contraventions of the Act.
- Prior to the OPC providing the recommendations above, in response to questions we raised, the RCMP had already begun to explore ways to improve its review of collection practices for compliance with its legal obligations, and in March 2021 it launched a National Technology Onboarding Program unit. Further, the RCMP committed to implementing the recommendations made by our office in the report above.
- We remain concerned that the RCMP disagreed with our findings that it contravened the Act, and takes the position that it is not obliged to ensure that third party agents it collects personal information from have acted lawfully with respect to the collection and use of that personal information. However, we are encouraged by the preliminary steps the RCMP has already taken above, and its commitment to implement all of our recommendations. We therefore find the matter well-founded and conditionally resolved.
- Implementing the recommendations will require broad and concerted efforts across the RCMP. Much work remains to be done to ensure adoption of changes in decision-making culture across the RCMP – supported by well-embedded processes, tools and training. We strongly encourage the RCMP to dedicate the sustained resources and senior-level championing necessary for successful implementation of its commitment to the recommendations. By fully implementing our recommendations, the RCMP will be able to more effectively explore and responsibly use new technologies to advance its critical mandate.