Keynote remarks at the 26th Annual Vancouver International Privacy & Security Summit (VIPSS)

March 8, 2024

Address by Philippe Dufresne
Privacy Commissioner of Canada

(Check against delivery)


Introduction

Good morning. Thank you for the kind introduction and for the invitation to be here today. At the outset, I want to wish you all a very happy International Women’s Day!

I am honoured to be among so many impressive privacy and cybersecurity champions to discuss one of the biggest challenges and opportunities facing our world – that is, the proliferation of artificial intelligence and generative AI, and its impact on privacy and security.

I also want to acknowledge the presence of my fellow provincial and territorial Privacy Commissioners, including Michael McEvoy from British Columbia, Diane McLeod from Alberta, Diane Aldridge on behalf of the Saskatchewan Commissioner, Andrew Fox from the Northwest Territories, and Patricia Kosseim from Ontario.

It is my second time at the VIPSS conference, and the topics and discussions here are so timely and important to the privacy and security sectors, to Canadians, and to the world. They have been insightful, thoughtful and inspiring, and have given me an even greater appreciation for the importance of the work that we do and for the space to have these conversations.

I always enjoy being here in Vancouver, which has the distinction of being a top location for fantasy and sci-fi film and television productions.

From the X-Men to the X-Files, Vancouver has long played host to cutting edge films and television shows, so perhaps it is fitting to be here exploring how to address current realities that not so long ago would have been considered science fiction.

AI has shifted the landscape in all aspects of our lives. As Dr. Kate Crawford said during her fireside chat yesterday, AI systems will impact all of us, even if we are not using them directly.

The entertainment industry is no exception.

Shortly after the screenwriters strike last year, the headline of a local entertainment and lifestyle news site asked: “Will artificial intelligence drive the film industry out of Vancouver?”

It is a concern that made its way to Parliament a few weeks ago, when groups representing those in the television, film, and music industries called on the federal government to do more to ensure that their jobs are not replaced by AI.

Technology brings great potential and opportunities, but it also brings challenges, risks, implications and costs. We all have a key role to play, and we need to work together to meet those challenges and seize those opportunities. That is why the conversations that we have had this week on privacy and security are so important.

Just last week, the Supreme Court of Canada issued its split decision in the case of R. v. Bykovets, where the Court majority held that IP addresses give rise to a reasonable expectation of privacy, and that police needed a warrant to obtain them from private sector third parties. The majority decision reiterated that “[p]ersonal privacy is vital to individual dignity, autonomy, and personal growth [and that] its protection is a basic prerequisite to the flourishing of a free and healthy society.”

The majority added that “by concentrating this mass of information with private third parties and granting them the tools to aggregate and dissect that data, the internet has essentially altered the topography of privacy under the Charter [and has] added a third party to the constitutional ecosystem, making the horizontal relationship between the individual and the state tripartite.”

In other words, privacy is essential to our democracy, and technology is changing our world in significant ways. This applies in other contexts, too, because as was noted by Brenda McPhail during yesterday’s panel discussion with Marty Abrams, privacy is instrumental to the exercise and enjoyment of many other fundamental rights and freedoms, such as the right to equality.

The Court’s decision, dealing with public-private partnerships, touched on an issue that was at the heart of my recently released report of findings in a matter involving the RCMP’s use of third-party search tools that can access restricted sites on the internet, including the dark web.

In the report tabled in Parliament last month, I recognized that the police need tools to combat crime, but reiterated the principle that when the police use private sector contractors to obtain information about Canadians, they must take the necessary steps to verify that the contractor is complying with its own privacy obligations, and they must be more transparent about their use of these tools. This is part of managing the tripartite relationship between individuals and the public and private sectors.

When I spoke here last year, I shared the three pillars of my vision for privacy: that privacy is a fundamental right; that it supports the public interest and Canada’s innovation and competitiveness; and that it accelerates the trust that Canadians have in their institutions and in their participation as digital citizens. This vision continues to inspire me and serve as a compass, and it has formed the basis of the OPC’s strategic plan, which I launched in late January.

I want to talk about this strategic plan that will guide the work of my Office over the next three years, focusing on three priority areas:

  • Protecting and promoting privacy with maximum impact by using business intelligence to identify trends that need attention, producing focused guidance and outreach, leveraging strategic partnerships, and preparing for the implementation of potential new privacy legislation;
  • Addressing and advocating for privacy in this time of technological change with a focus on artificial intelligence and generative AI, the proliferation of which brings both potential benefits, and increased risks to privacy; and
  • Championing children’s privacy rights to ensure that their unique privacy needs are met, and that they can exercise their rights.

All of these priorities include the themes of engagement, partnerships, collaboration and continued learning.

Protecting and promoting privacy with maximum impact

Protecting and promoting privacy with maximum impact is the bedrock for fulfilling our current mandate and preparing for potential changes to federal privacy laws.

This priority commits us to strengthen our governance and capacity, foster internal communications and collaboration, and nurture partnerships and networks, including with many of you.

As part of our efforts to cultivate and leverage strategic partnerships, we have created a new directorate of International, Provincial and Territorial Relations to bolster our engagements with other regulators and privacy organizations and to begin the work towards Canada’s G7 presidency in 2025, when I will host my G7 counterparts in Canada for our annual G7 DPA roundtable.

International cooperation is increasingly important as data knows no borders. Common rules will help to avoid a patchwork of regulations that make compliance difficult. It will also give citizens peace of mind that their personal information will enjoy similar protections when they – or their data – cross borders.

Collaboration also ensures that we can provide focused guidance and outreach that is informed by data, business intelligence, and stakeholder input, which is so important.

For instance, my Office recently held an open consultation on draft guidance for businesses and public institutions on the use of biometric technologies. We are now reviewing the feedback that we received from a broad range of stakeholders to ensure that the final version is as clear, complete, and useful as possible.

This priority also includes a commitment to provide input on law reform and to prepare the OPC for the eventual implementation of new legislation.

I submitted 15 key recommendations to Parliament to strengthen Bill C-27 and was pleased to see that a number of them were endorsed by the government, at least in part. These include one aimed at explicitly recognizing privacy as a fundamental right, another at strengthening children’s privacy, and a third at expanding the scope of compliance agreements.

Bill C-27 will now proceed to clause-by-clause analysis by the Committee next month. I remain optimistic that the end result will be a stronger law that supports innovation and the interests of organizations while protecting Canadians’ fundamental privacy rights.

Bill C-27 would also provide additional mechanisms for my Office to assist organizations in meeting their obligations under the law, including by providing guidance and approving codes of practice and certification programs, which are an excellent means of bringing the Act’s privacy principles to a more concrete level, adding certainty for both organizations and consumers.

Should Parliament adopt the Bill, I look forward to consulting and collaborating with many of you to actualize these changes.

There is also Bill S-10 before the Senate.

But privacy law currently provides some important protections in this space.

Protecting privacy with maximum impact also means using our existing laws to deal with new and increasing challenges. This is certainly true in the case of generative AI, where my Canadian and international colleagues have stated very clearly that while new and modern laws on AI may be necessary, our current privacy laws apply to AI, and we will enforce them.

This is also true in the case of online harms. As you know, the government has recently tabled Bill C-63, which would provide new mechanisms and remedies to deal with the non-consensual sharing of intimate images and other types of online harms, including harms to children and other vulnerable groups.

Last week, I issued and made public our report of findings in a complaint against Aylo (formerly known as MindGeek), where I reiterated that the non-consensual sharing of intimate images was also a serious privacy violation, and that organizations have an obligation under privacy law to prevent and remedy this. The organization had sought an order from the court preventing me from issuing this decision.

Specifically, I found that Aylo had obligations under the current private sector privacy law to obtain meaningful and direct consent from all of the individuals whose images are collected and posted on the website. I determined that Aylo had failed to meet these obligations because it relied exclusively on uploaders to provide the consent of others. I also found that the takedown process was excessively onerous for the complainant and other victims of image-based abuse.

I made a number of recommendations, including that Aylo adopt measures to obtain direct, express and meaningful consent from each individual who appears in uploaded content on its sites; that Aylo immediately stop sharing user-created intimate content until it has adopted such measures; and that Aylo delete all content for which valid consent had not been obtained directly.

To date, the organization has refused to implement these recommendations, and I hope that it will reconsider. I expect that this issue will continue to be debated in Parliament in the coming weeks, and I will be considering all available next steps to ensure that our recommendations are implemented.

Addressing and advocating for privacy in a time of technological change, especially in the world of artificial intelligence (AI) and generative AI

Turning now to my second priority: AI, and particularly generative AI, has been the subject of heightened interest both inside and outside of the privacy community, especially since ChatGPT took the world by storm in late 2022.

Just last month, OpenAI unveiled its latest project, Sora – a tool that makes instant videos from written prompts – opening up a whole new set of opportunities and risks.

Addressing both those opportunities and those risks is the purpose of my second priority.

By fostering a culture of privacy, encouraging the use of privacy-by-design principles, and establishing privacy standards, we can promote innovation while also leveraging innovation to protect the fundamental right to privacy.

To advance this priority, we are focusing on enhancing our internal capacity, forging strategic partnerships, fostering technological knowledge, and establishing concrete privacy standards for existing and emerging technologies.

As part of these efforts, the OPC hosted its first international symposium on privacy and AI in December. The event brought experts from academia, industry, civil society, and government, as well as fellow data privacy authorities from around the world, to Ottawa to discuss the opportunities and risks involved in generative AI and how all sectors can best work together to address them.

Together with my provincial and territorial counterparts, we used the event to launch our joint principles for the responsible and trustworthy use of generative AI.

Alex MacLennan said earlier this morning that digital tools can help protect privacy. I agree. At our Symposium last year I said, and I repeat now, that just as we use data to fuel innovation, we also need to use innovation to protect data.

That document lays out our expectations on how key privacy principles apply when developing, providing, or using generative AI models, tools, products, and services, and it provides examples of best practices, such as “privacy by design” and labelling content that is created by generative AI.

We also called on developers to take into consideration the unique impact that these tools could have on vulnerable groups and children.

At the international level, I issued a joint statement on AI with my fellow G7 data protection and privacy authorities last June – the first global statement affirming that current privacy laws apply to AI – and last October, we adopted a resolution on responsible generative AI with other members of the Global Privacy Assembly.

That statement and resolution called on developers and providers of generative AI to embed privacy in the design, conception, operation, and management of new products and services.

We also reminded organizations that existing privacy laws apply to generative AI products and services, even as governments around the world seek to develop laws and policies specific to AI.

I was pleased to see that our G7 statement on AI was referred to in ISED’s voluntary code of conduct on the responsible development and management of advanced generative AI systems.

This was also a key message last spring, when I launched a joint investigation, along with my provincial counterparts in BC, Alberta, and Quebec, into OpenAI, the company behind ChatGPT.

While I strongly believe that Canada’s privacy laws need to be modernized, our current laws apply to the use of AI technologies today, and I am committed to their application in this space.

Championing children’s privacy rights

Which brings me to my third priority: championing children’s privacy rights.

This priority recognizes the unique sensitivities around young people’s privacy and the need to ensure that their rights are protected so that they can benefit from technology without compromising their privacy and well-being.

To advance this priority, I have recommended that Bill C-27 recognize the best interests of the child. We are also seeking to increase our knowledge regarding key children’s privacy risks, issues, and gaps, as well as to better understand how and where children consume content.

We will expand our partnerships to amplify the uptake of our resources, guidance, and advice. We will also apply a children’s privacy lens to our enforcement activities and leverage our findings to inform and incentivize organizations to develop products and services with better privacy protections for children.

Last October, my provincial and territorial counterparts and I adopted a resolution that called on governments and organizations to adopt practices that promote the best interests of young people, to ensure not only the safeguarding of young people’s data, but also to empower them with the knowledge and agency to navigate digital platforms and manage their data safely, and with autonomy.

My ongoing joint investigation with my Alberta, British Columbia and Quebec colleagues into TikTok is another initiative under this priority. Our investigation focuses on TikTok’s privacy practices as they relate to younger users, including whether the company obtains valid and meaningful consent for the collection, use, and disclosure of their personal information.

Most recently, it was my great pleasure to attend a meeting of the Young Canadians’ Parliament in January, where I discussed privacy with a group of highly engaged youth. I was heartened by the insightful questions that I received, and the genuine concern that these young people have for their privacy.

For instance, one young person expressed their concern about AI using personal information to impersonate people, while another indicated that they feel that they have no control over the use of technology in school and worry about the overcollection of personal information.

I also heard concern over the lack of privacy controls and encryption on certain messaging apps, and the need for more privacy education in the curriculum.

I look forward to continuing these important conversations.

Breaches and OPC tools

I want to take a moment to talk about breaches because it seems that not a day goes by anymore without a media report about a breach in one sector or another.

So far this fiscal year, 573 breaches have been reported to my Office by private-sector organizations, affecting nearly 19 million Canadian accounts. Almost half (47%) were identified as cyber incidents.

Beyond the human costs – like reputational harm, financial loss, and identity fraud – breaches carry a steep financial price: IBM estimates the current average cost of a data breach for businesses at around US$4.45 million.

Again, this is an area where privacy law and privacy principles apply and can help to guide us. The theme of the conference is “Guarding Privacy, Fortifying Security: Navigating the AI Frontier”. I absolutely believe that guarding privacy helps to fortify security. In fact, one of the 10 PIPEDA fair information principles is safeguards.

Businesses that are subject to PIPEDA are required to report all privacy breaches where it is reasonable to believe that the breach poses a real risk of significant harm. My Office is currently creating web tools to make breach reporting easier, and we aim to launch them soon.

Organizations are also required to notify affected individuals when a breach poses a real risk of significant harm. Those that knowingly fail to meet their breach reporting or notification obligations can face a financial penalty.

My Office has created guidance to help organizations assess the real risk of significant harm. We have also developed an online tool to guide risk assessments through a series of questions to help determine whether it is reasonable to believe that a privacy breach creates a risk of significant harm. It does not replace human judgment, but it does provide data to inform that judgment.

We have piloted the tool with 20 organizations that have provided very positive feedback and comments that will inform a future public-facing version.

Of course, the public sector is not immune from breaches. In November, my Office opened Privacy Act investigations into Public Services and Procurement Canada and the Treasury Board Secretariat, as well as PIPEDA investigations into the two companies that were contracted to provide relocation services for members of the Canadian Armed Forces, the RCMP, and other federal departments.

That incident, which was the result of a cyberattack, impacted sensitive personal and financial data, and the information compromised dates as far back as 1999.

Just last month, the OPC opened another investigation after Global Affairs Canada’s internal network suffered a cyberattack that exposed the personal information of employees and other users.

Also last month, I issued a report of findings into cyberbreaches involving credential stuffing at the Canada Revenue Agency and Employment and Social Development Canada that resulted from weak cybersecurity safeguards.

These attacks compromised the sensitive financial, banking, and employment data of tens of thousands of Canadians, leading to numerous cases of fraud and identity theft – including many fraudulent applications for COVID-19 Emergency Response Benefits.

The case highlighted important lessons for other organizations, such as the need to improve communications and decision-making frameworks to facilitate a rapid response to attacks, and the importance of developing comprehensive incident-response processes to effectively prevent, detect, contain, and mitigate breaches, including by conducting regular security assessments.

This is why I recommend that organizations conduct Privacy Impact Assessments (or PIAs). This risk management process can help organizations demonstrate that they are accountable for the personal information under their control, ensure that they are complying with the law, and limit their risk of privacy breaches.

The Treasury Board Secretariat requires that federal institutions conduct PIAs prior to establishing any new or substantially modified program or activity involving personal information.

Though PIAs are not currently a legal requirement under PIPEDA, I nonetheless recommend them and have urged the government to amend Bill C-27 to require organizations to conduct PIAs for high-risk initiatives, such as AI.

My Office is also here to help organizations to achieve their important objectives in a privacy-protective manner. Our Business and Government Advisory directorates are great resources that offer contextualized and relevant support and guidance to both the public and private sectors.

Conclusion

Protecting privacy is one of the paramount challenges of our time.

As Canada’s Privacy Commissioner, I am certainly committed to doing my part, through strong advocacy, education, promotion, and enforcement, but none of us can do it alone. Collaboration is essential. As Dr. Crawford noted yesterday, it is a collective challenge that needs a collective response. Looking around the room today, I feel encouraged knowing that we have incredible, smart, principled and thoughtful privacy champions like yourselves working together to advance this important discussion towards just such a solution.

Thank you. I wish you all a wonderful rest of the conference and would be happy to take your questions.
