Appearance before the Standing Committee on Industry and Technology (INDU) on the Study of Bill C-27
October 19, 2023
Opening statement by Philippe Dufresne
Privacy Commissioner of Canada
(Check against delivery)
Good afternoon, Mr. Chair, Members of the Committee,
I am pleased to be back to assist the Committee in its study of Bill C-27, the Digital Charter Implementation Act, which would enact the Consumer Privacy Protection Act (CPPA), the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (AIDA). I am accompanied by Michael Maguire, Director of PIPEDA Investigations, and Lara Ives, Executive Director, Policy, Research & Parliamentary Affairs.
When I appeared before the Committee three weeks ago, I delivered opening remarks on Bill C-27 and presented my 15 key recommendations to improve and strengthen the Bill.
Today, I want to briefly highlight and respond to the Minister of Innovation, Science and Industry’s letter to the Committee of October 3, 2023, and to answer any questions that you might have.
I welcome the Minister’s stated position on the amendments being developed with respect to the proposed CPPA, where he appears prepared to agree with four of my Office’s 15 key recommendations: explicitly recognizing privacy as a fundamental right; strengthening the protection of children’s privacy; providing more flexibility for my Office to use compliance agreements, including through the use of financial penalties; and allowing greater cooperation between regulators.
I also note and commend his statement of openness to further amendments following the study by this committee.
I would like to take this opportunity to highlight other ways in which the Bill should be strengthened and improved in order to better protect the fundamental privacy rights of Canadians, which are addressed in our remaining recommendations to the Committee. I will briefly highlight five of our recommendations that stand out in particular in light of the Minister’s letter, and I would be happy to speak to all of our recommendations in the discussion that follows.
First, Privacy Impact Assessments (or PIAs) should be legally required for high-risk activities, including AI. This is critically important in the case of AI systems that could be making decisions with major impacts on Canadians, including whether they get a job offer, qualify for a loan, pay a higher insurance premium, or are suspected of suspicious or unlawful behaviour. While AIDA would require those responsible for AI systems to assess and mitigate the risks of harm of high-impact AI systems, the definition of harm in the Bill does not include privacy. This means that there would be proactive risk assessments for non-privacy harms, but not for privacy harms. This is a significant gap given that in a recent OECD report on generative AI, threats to privacy were among the top three risks recognized by G7 members. In my view, responsible AI must start with strong privacy protections.
Second, the Bill does not allow for fines for violations of the appropriate purposes provisions, which require organizations to collect, use, and disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances. This omission would leave the federal private sector privacy law an outlier when compared with the European Union (EU) and Québec regimes, both of which allow the imposition of fines for such important privacy violations. If the goal is, as the Minister has indicated, to have a privacy law that includes tangible and effective tools to encourage compliance and to respond to major violations of the law in appropriate circumstances, surely this shortcoming needs to be addressed for such a critical provision.
Third, there remains the proposed addition of a new tribunal, which would become a fourth layer of review in the complaints process. As indicated in our submission to the Committee, this would make the process longer and more expensive than the common models used internationally and in the provinces. That is why we have recommended two options to resolve this problem. The first option would be to have decisions of the proposed tribunal reviewed directly by the Federal Court of Appeal, and the second option would be to provide my Office with the authority to issue fines and to have our decisions reviewable by the Federal Court without the need to create a new tribunal, which is the model that we most commonly see in other comparable jurisdictions.
Fourth, the Bill as drafted continues to allow the government to make exceptions to the law by way of regulations without the need to demonstrate that those exceptions are necessary. This needs to be corrected as it provides too much uncertainty for industry and for Canadians and it could significantly reduce privacy protections without Parliamentary oversight.
Fifth, and finally, the Bill would limit the requirement for organizations to explain, upon request, the predictions, recommendations or decisions that are being made about Canadians using AI, to situations that have a “significant impact” on an individual. At this crucial time in the development of AI, and given the privacy risks that have been recognized by the G7 and around the world, I would recommend more transparency in this area rather than less.
With that, I would be happy to answer any questions that you might have.