Letter to the Standing Committee on Access to Information, Privacy and Ethics on their Study of the Use and Impact of Facial Recognition Technology

The Privacy Commissioner of Canada, Daniel Therrien, has sent the following letter to the Standing Committee on Access to Information, Privacy and Ethics to provide information requested during his appearance before the Committee on May 2, 2022.


BY EMAIL

May 13, 2022

Mr. Pat Kelly, M.P.
Chair
Standing Committee on Access to Information, Privacy and Ethics
House of Commons
Sixth Floor, 131 Queen Street
Ottawa, Ontario, K1A 0A6

Dear Chair:

Thank you for the opportunity to appear before your committee on May 2, 2022, in relation to the committee’s study, Use and Impact of Facial Recognition Technology. As requested by the committee during my appearance, I am writing to provide additional information concerning the following:

Recommended legal framework for police use of facial recognition technology

During the appearance, I undertook to provide the committee with a copy of our Recommended legal framework for police agencies’ use of facial recognition (Footnote 1), which was issued jointly by Federal, Provincial and Territorial Privacy Commissioners on May 2, 2022. Our recommended framework sets out our views on the changes needed to ensure appropriate regulation of police use of facial recognition technology (FRT) in Canada. A future framework should, we believe, establish clearly and explicitly the circumstances in which police use of FRT is acceptable, and when it is not. It should include privacy protections that are specific to FRT use, and it should ensure appropriate oversight when the technology is deployed. While the framework was developed specifically for the policing context, many of its elements could be applied beyond that context.

Best practices for FRT regulation

The committee requested that I provide examples of best practices for regulating FRT from jurisdictions where regulatory frameworks have been enacted or proposed. Several international jurisdictions have enacted or proposed regulatory frameworks, either for FRT specifically or for biometrics more broadly in ways that would also apply to FRT, which could inform Canada’s approach. In particular, I would draw your attention to the following notable measures:

  • The European Union’s proposed Artificial Intelligence Act (Footnote 2), if adopted, would prohibit the public and private sectors from using harmful AI applications that, among other things, manipulate individuals or exploit vulnerabilities linked to certain personal characteristics. Non-prohibited applications that are high-risk (including the use of biometrics for identification and categorization) would be subject to specific legal requirements such as risk management measures and systems; logging and record-keeping; general human oversight; accurate and representative data for AI training; ex-ante conformity assessments; and demonstrable accountability.
  • Under the European Union’s General Data Protection Regulation (Footnote 3), biometrics (including facial images, as defined under Art. 4) are considered a special category of data whose processing is prohibited unless the controller can rely on a legal ground under Art. 6 (e.g. explicit consent) and a ground for processing under Art. 9(2) (e.g. processing that is necessary for reasons of substantial public interest).
  • Federal laws proposed in the US contain provisions that would affect FRT use by private and public bodies. The Fourth Amendment Is Not for Sale Act (Footnote 4), introduced in April 2021, seeks to stop data brokers from selling personal information to law enforcement agencies without court oversight. The Bill would also ban public agencies from using illegally obtained data. The Algorithmic Accountability Act (Footnote 5), introduced in February 2022, would impose new transparency and accountability requirements for automated decision systems. The Bill would require private organizations to conduct assessments of algorithmic bias, effectiveness and other factors.
  • US state laws enacted in Washington and Utah regulate government use of FRT by specifying the conditions under which it is permitted. Utah’s law on Governmental Use of Facial Recognition Technology (Footnote 6), for example, requires government entities to notify individuals whenever they are capturing images that could be used in conjunction with FRT and to provide notice 30 days prior to the proposed use. Under Illinois’ Biometric Information Privacy Act (Footnote 7), companies are prohibited from selling or otherwise profiting from consumers’ biometric information (which includes face geometry, per Sec. 10).

Further, as noted by Commissioner Poitras during our appearance, Quebec’s Act to establish a legal framework for information technology (Footnote 8) gives its privacy regulator, the Commission d'accès à l'information du Québec (CAI), broad powers over private and public entities’ creation and use of databases of biometric characteristics, including the power to prohibit a database from coming into service and to order changes to, or the destruction of, a database that violates the CAI’s orders or infringes on privacy.

RCMP compliance with OPC privacy recommendations

In June 2021, the OPC released our Report of findings: Investigation into the RCMP’s collection of personal information from Clearview AI (involving facial recognition technology) (Footnote 9). The report included several recommendations to improve the RCMP’s compliance with the Privacy Act, and the RCMP agreed to implement these recommendations within one year.

During my appearance, I undertook to report to the committee on the RCMP’s timeline to implement these recommendations. Since June 2021, the OPC has been engaged in regular conversations with the RCMP on this matter, and the RCMP has provided two progress reports to the OPC, in October 2021 and February 2022, with a final report due at the end of June 2022. The RCMP appears to be on track to implement our recommendations by June 30, 2022.

Government use of IntelCenter database

I was asked by the committee to provide any information the OPC may have with respect to government use of IntelCenter’s FRT database and services. I can confirm that the OPC is not aware of any use or testing of IntelCenter in an operational capacity by the RCMP or any other government institutions.

Artificial intelligence

The committee expressed an interest in receiving further information from the OPC on the work we have done relating to artificial intelligence, social media, and political campaigning. The following OPC publications may be of interest in that regard:

  • A Regulatory Framework for AI: Recommendations for PIPEDA Reform (Footnote 10), which sets out our views on what an appropriate law for AI would include.
  • Joint investigation of Facebook, Inc. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia: Report of findings (Footnote 11), in which we found that Facebook had committed serious contraventions of Canadian privacy laws and had failed to take responsibility for protecting the personal information of Canadians. I would note that because Facebook refused to implement our recommendations, we are seeking a binding order from the Federal Court to require Facebook to correct its privacy practices and comply with PIPEDA. Facebook has since responded by filing an application for judicial review challenging the investigation.
  • Joint investigation of AggregateIQ Data Services Ltd. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia (Footnote 12), in which we found that AggregateIQ failed to meet its obligations under Canadian privacy laws when it used and disclosed the personal information of millions of voters in British Columbia, the United States, and the United Kingdom.

OPC review of Clearview AI contract

I note that during your May 9, 2022 meeting, the question was raised as to whether, in the course of the OPC’s investigatory work, we had reviewed contractual agreements between the RCMP and Clearview AI. I can confirm that, during the course of our investigation, the RCMP provided the OPC with copies of Clearview AI’s general Terms of Service, Code of Conduct, and Privacy Policy.

As explained at paragraph 79 of our Report of Findings, the RCMP submitted that its licenses, whether paid or trial, were subject only to these terms.

Thank you again for the opportunity to appear before the committee on this important issue. I hope the additional information herein will be useful as you conclude your study.

Sincerely,

(Original signed by)

Daniel Therrien
Commissioner

c.c.: Nancy Vohl
Clerk of the Committee
