Keynote remarks at the International Association of Privacy Professionals (IAPP) Canada Privacy Symposium 2023

May 25, 2023
Toronto, Ontario

Address by Philippe Dufresne
Privacy Commissioner of Canada

(Check against delivery)


Good morning,

It is a pleasure to be here among fellow privacy champions at this, my first IAPP Canada Symposium since being appointed Privacy Commissioner of Canada.

As the professionals on the front lines who make sure that your organizations take seriously both privacy and the protection of the personal data in their care, you are important partners. I value the work that you do and look forward to working with you as we continue our efforts to bolster privacy protections for Canadians.

As you know, I am approaching my one-year anniversary as Privacy Commissioner. In the first year of my mandate, I made it my mission to meet and engage with key stakeholders from across Canada in both the public and private sectors, and to actively listen to what they have to say.

Those discussions have been one of the highlights of the last year, and as a result, I have had the privilege of meeting many of you already. I thoroughly enjoyed those exchanges, which I found principled, pragmatic, insightful and, frankly, inspirational.

I am certain that many of you have heard me discuss the three pillars of my vision for privacy:

That privacy is a fundamental right; that privacy supports the public interest and Canada’s innovation and competitiveness; and that privacy accelerates the trust that Canadians have in their institutions and in their participation as digital citizens.

I have expanded on these three pillars in previous speeches but let me summarize them as briefly as I can:

  1. Treating privacy as a fundamental right means treating it as a priority, and it means that in clear cases of conflict with private and public interests, privacy should prevail. Treating privacy as a fundamental right can be done within consumer protection legislation under the trade and commerce power.
  2. Privacy supporting the public interest and Canada’s innovation means that it is not a zero-sum game between privacy rights and public and private interests. We can have both, and Canadians deserve nothing less.
  3. Privacy as an accelerator of trust means that we all gain by protecting privacy and being seen to be doing so. It generates trust and engagement with our public institutions which is good for the public interest, and it sustains trust and loyalty from clients which is good for innovation and economic success.

These three pillars reflect the reality that Canadians want to be active and informed digital citizens, able to fully participate in society and the economy without having to choose between this participation and their fundamental privacy rights.

They frame how I look at privacy issues, and how I will consider and address the ongoing and fast-moving challenges of our time. They have guided our submission on Bill C-27, the proposed Digital Charter Implementation Act, and will help shape the OPC’s strategic priorities in the year ahead, which will include: (1) keeping up with and staying ahead of technological advancements and their impact on privacy – particularly with respect to artificial intelligence and generative AI, (2) protecting children’s privacy and (3) subject to Parliament’s directions, preparing the OPC for potential law reform which would include new responsibilities and powers in our mandate to promote and protect the fundamental privacy rights of Canadians.

Keeping up with and staying ahead of technological advancements and their impact on privacy

Staying ahead of technological advancements is one of my key focus areas.

Technology is developing at an increasingly rapid pace. This offers encouraging and exciting possibilities for meeting many of the collective challenges that we face and for improving the lives of Canadians. We need only think of health care and climate change, but also government delivery of services, whether in ordinary times or in a national emergency. At the same time, it raises potential risks with respect to privacy, human rights, transparency and accountability that our institutions must address and must be seen to address.

As technology plays an increasingly central role in our world, our lives, and our economy, ensuring that we can benefit from these advances, innovations, and conveniences while protecting privacy will be critical to our success as a free and democratic society.

An obvious example of this is the world of artificial intelligence (AI) and generative AI, which has become a key focus not only for Data Privacy Authorities, but for governments and industry at the highest levels.

Just last week, OpenAI chief executive Sam Altman appeared before the US Congress and called for a coordinated global regulatory response to the technology given that “we don’t yet know exactly what the emerging capabilities will be.” He went on to add that regulation “doesn’t slow down innovation, […] it helps protect us from the most serious downside cases”.

I agree. AI chatbots like ChatGPT, Bing and Google’s Bard offer awe-inspiring possibilities. But the risks of these privacy-impactful tools – which some have likened to a Pandora’s box – must be addressed appropriately.

As you may know, my Office launched an investigation into the company behind ChatGPT in April following the receipt of a complaint.

I can announce today that this will now be a joint Commissioner-initiated investigation with our provincial counterparts in British Columbia, Alberta, and Québec.

We will be investigating whether OpenAI’s practices comply with Canadian privacy law in relation to consent, openness and transparency, access, accuracy, and accountability, as well as whether the organization is collecting, using and disclosing personal information for an appropriate purpose. We are still in the early stages of the investigation but are encouraged by the cooperation of OpenAI and its counsel to date.

This joint investigation reflects the strong collaboration between privacy authorities in Canada in dealing with key issues that impact Canadians.

AI technology and its impact on privacy is indeed a global issue, and we will continue to engage with both our domestic and international partners on these matters.

At the G7 meeting last Friday, leaders acknowledged the need for governance of generative AI and immersive technologies, saying that they plan to discuss and report results by the end of the year.

I also want to commend the IAPP for having launched its AI Governance Centre earlier this month, which will provide important support, resources, and training on how to respond to the complex challenges in this field.

We know that privacy matters to Canadians and that they are concerned about the impact of technology on their privacy. Our latest, and soon to be released, survey of Canadians found that 93 per cent have some level of concern about protecting their personal privacy, and that half do not feel that they have enough information to understand the privacy implications of new technologies.

Meanwhile, only 4 in 10 Canadians feel that businesses generally respect their privacy. Social media companies, big tech, retailers, and the telecommunications industry are among the sectors that Canadians are most concerned about, according to our poll. They were also the subject of more than a quarter of the complaints that we received last year.

These figures tell us that Canadians want and need to trust that their privacy rights are being protected so that they can feel confident about participating freely in the digital economy, which in turn is good for businesses and the economy. They also show that my Office has an important role to play because we know that organizations themselves are having to adapt to the scale and pace of technological change, and we can help them to operate and innovate in a privacy-protective manner which will generate trust with their clients and customers.

Children’s right to privacy

When it comes to children, we know that they are going online at younger ages each year for school and to connect with their friends.

We are already seeing the first generation of children born into a world where their digital life is a daily reality.

As a parent, it is a reality that I have lived, and as Privacy Commissioner, it is an issue that I take very seriously.

We want children to be able to benefit from technology and to be active online, but we want them to do so safely and free from fear that they may be targeted, manipulated, or harmed as a result.

The Brookings Institution recently published a policy brief noting the adverse effects that social media companies can have on the mental health of minors, who often deal with online bullying and sexual harassment on these platforms. Last week, online safety campaigners in the UK were calling for stronger legislation to shield young people from damaging digital content after Kate Winslet won the leading actress award at the BAFTAs for her portrayal of a mother whose teenage daughter suffers from mental health problems as a result of viewing damaging online content.

Young people are also less able to understand and appreciate the long-term implications of consenting to their data collection, which is why they need even greater privacy safeguards.

My Office recently announced that with my privacy protection colleagues in British Columbia, Alberta, and Québec, we have launched a joint investigation into TikTok.

We will be investigating whether the organization’s practices comply with Canadian privacy law, with a particular focus on their privacy practices as they relate to younger users and whether they are transparent and obtain valid and meaningful consent for the collection, use and disclosure of their personal information.

Children’s privacy is a topic of global interest, and we are seeing regulatory innovations, like the introduction of a Children’s Code by the UK Information Commissioner’s Office, which sets standards for online services to follow to ensure age-appropriate design. We are also seeing legislative proposals emerging across the United States and around the world – some of which are being lauded as a step forward, while others raise important questions about how to implement appropriate safeguards while still protecting privacy.

For example, online age verification laws and associated technologies, such as facial recognition to determine age, seek to protect children from inappropriate or harmful content, but can potentially raise privacy issues in the process.

In my submission on Bill C-27, the proposed Digital Charter Implementation Act, the second of my 15 key recommendations relates specifically to children’s privacy and the best interests of the child.

While we support the Consumer Privacy Protection Act’s (CPPA) efforts to address minors, including clarification that their information is sensitive, the measures should in my view go further to deal with uses that could be harmful, such as using information to nudge children to turn off privacy controls, or for behavioural or targeted advertising.

In the absence of specific “no-go” zones related to minors’ data, we are recommending that the preamble to the Bill recognize that the processing of personal data should respect children’s privacy and the best interests of the child. We believe that this would encourage organizations to build privacy for children into products and services from the start and by design. By including this language in the preamble, it would also ensure that these values would apply to both the CPPA and the Artificial Intelligence and Data Act (AIDA).

This leads me to another one of my key priorities – law reform.

Legislative Reform

Bill C-27 was referred to the House of Commons’ Standing Committee on Industry and Technology for study last month, and my written submissions and recommendations to the Committee are now publicly available on our website. I am looking forward to appearing before the Committee to discuss these recommendations with Parliament.

Bill C-27 is a step in the right direction, but it can and must go further to protect the fundamental privacy rights of Canadians while supporting the public interest and innovation.

I am also encouraged by the remarks of Justice Minister David Lametti who, following the tabling of Bill C-27, said that public sector privacy reform is not far behind.

Let us not forget that the Privacy Act, which marks its 40th anniversary in July, has seen few updates since The Police topped the music charts with their hit song, Every Breath You Take, which, ironically, is a song about surveillance.

Or that when PIPEDA became law in the year 2000, the world was still coming to terms with the fact that we had survived “Y2K”, which at the time was the single greatest legal and technological challenge that we faced. Suffice it to say that a lot has happened since then and in hindsight, those look like much simpler times.

It will be important that the legislative regimes are harmonized to ensure that both public and private sector privacy laws are grounded in the same principles – especially given the increasing prevalence of public-private partnerships.

With respect to Bill C-27, it is in many ways an improvement over both PIPEDA and the former Bill C-11, by establishing stronger privacy protections for individuals and creating incentives for organizations to comply while allowing greater flexibility to innovate.

The introduction of the AIDA would also make Canada one of the first countries to regulate AI, which is important given the technology’s potential implications for other fundamental rights.

The Bill, however, can and must go further. As mentioned, our submission sets out 15 key recommendations that align with the three elements of my vision for privacy. Given our limited time today, I will expand on just a few of them.

We recommend strengthening the preamble and purpose clause to explicitly recognize privacy as a fundamental right, so that this important principle informs the interpretation of all aspects of the legislation. We also recommend that an organization’s purposes for collecting, using or disclosing information be specific and explicit, and that penalties be available in cases where the personal information of Canadians is collected, used or disclosed for inappropriate purposes.

We recommend that organizations be required to implement privacy by design and that Privacy Impact Assessments be prepared in high-risk cases. We also call for the definition of “de-identified information” to be modified to include the risk of re-identification, and the government’s authority to issue certain regulations to be more narrowly defined.

I can tell you that my Office is already looking ahead and preparing for law reform so that should Parliament adopt the Bill, we will be ready to take on the new responsibilities it lays out and to provide support to Canadians and businesses as they navigate the new legislative framework.

Importance of protecting privacy

Privacy is a part of everything that we do. Children’s rights, competition, broadcasting, cybersecurity, democratic rights, international trade, national security, equality rights, public health, ethical corporate practices, and the rule of law – all of these have privacy implications and impacts.

The right to protect our personal information is also foundational to our individual dignity.

Some of my Office’s recent investigative findings illustrate why the right to decide whether, when and how to share information about ourselves is essential – even more so in today’s increasingly digital world.

Our Tim Hortons investigation last year described how the company’s app was tracking user location even when the app was not in use, and without the users’ knowledge or consent.

Earlier this year, we released the results of our investigation into Home Depot’s sharing of personal information with Facebook when their customers opted for an electronic receipt at checkout instead of a printed one.

Our investigation confirmed that this practice was not consistent with privacy law, and we are working with industry to ensure that these privacy principles are understood and adopted by other organizations.

This month, following the Federal Court’s decision dismissing our application in the Facebook – Cambridge Analytica matter, I announced that we would be appealing the decision as it raised important questions with respect to the interpretation and application of privacy law.

We are also looking forward to the Federal Court of Appeal’s decision in our Google reference with respect to the application of PIPEDA to search engines.

On the public sector side, I have appeared before parliamentary committees to make recommendations on the use of RCMP investigative tools and on ensuring that privacy principles are incorporated when the CRTC exercises its new powers under the Bill C-11 amendments to the Broadcasting Act.

More recently, I appeared before a Senate Committee to recommend that political parties be governed by privacy law and that an independent third-party decision-maker be given jurisdiction to oversee compliance.

I am also pleased to share that I expect to table a Special Report to Parliament next week following investigations of the federal government’s privacy practices in relation to pandemic measures, so stay tuned for that.

Lastly, I am heading to Tokyo, Japan in the next few weeks where I look forward to continuing the work with my G7 DPA colleagues on Data Free Flow with Trust.

Conclusion

In closing, I want to leave you with a quote from one of Canada’s early Privacy Commissioners, the late John Grace, taken from an Annual Report to Parliament over 30 years ago. He said, “Privacy protectors cannot be staled by custom or allowed to be complacent. The challenges to privacy are new, urgent, various and ingenious, brought about by technology that never sleeps and is rarely denied.”

These words ring even more true in today’s digital world, and that is why I believe that the protection of privacy is one of the key challenges of our time. But we can and will meet the challenges of these pivotal times. We can have privacy and the public interest. We can have privacy and innovation.

I look forward to continuing to find ways that we can work together as industry leaders, regulators, consumers, and citizens so that people in Canada, including children, can benefit from the many advantages and conveniences that technology affords without having to look over their shoulders while they do.

Thank you.
