Keynote Remarks at the 29th Annual Flagship Conference on Regulatory Compliance for Financial Institutions

November 22, 2023

Toronto, Ontario

Address by Philippe Dufresne
Privacy Commissioner of Canada

(Check against delivery)

Good morning,

I am pleased to be here today in Toronto among so many privacy and compliance leaders from within the financial sector.

According to the Bank of Canada, 30 million financial transactions worth more than $210 billion take place across this country every day. As Canada’s financial hub, many of these transactions happen right here in Toronto.

Indeed, the City of Toronto itself boasts that it is the second-largest financial services centre in North America with one of the highest concentrations of financial services company headquarters in the Americas.

It is described as a “global destination for financial services” with a “reputation for safety, soundness and stability.”

And while Canadians are known for their humility, this reputation is something that we as a country and a people should be very proud of and should let the world know about.

Behind each of these transactions – many of them global given our increasingly borderless world and the nature of modern data flows – are individual Canadians who are saving for retirement and their children’s university educations, financing new homes or just trying to make enough to feed their families. Without confidence in our financial institutions, none of this would be possible.

Individuals’ financial data is personal information. It is highly sensitive and it must be carefully protected. The financial sector knows this and, by and large, has very robust privacy management programs in place.

As an example, just last week we saw media reports about payment processing giant Moneris and how it thwarted a ransomware attack. Our work with this sector suggests that financial institutions take this duty seriously.

Not surprisingly, however, the financial sector is an attractive target for bad actors and many of our interactions are in the context of breach reports and investigations, both of which I will discuss more later.

Your sector manages an enormous amount of highly sensitive personal information, so it is not surprising that the financial industry generates the highest proportion of privacy-related complaints to my Office of any sector, last year accounting for 26% of all accepted complaints.

Of the 101 complaints involving the financial sector that we closed last year, 70% of them were resolved to the satisfaction of stakeholders through mediation. Just 6 of the cases that we investigated were well-founded. The organizations involved were generally cooperative and willing to implement our recommendations.

All this to say that I recognize the leadership role financial institutions play in taking privacy and the protection of personal information seriously. Indeed, I see that “Strengthening the culture of compliance” was the topic of one of yesterday’s sessions.

This also reflects the importance of organizations adopting a culture of privacy, through privacy by design and privacy by default.

We can build this culture by limiting the collection, use, retention and disclosure of personal information to what is demonstrably necessary and proportional; by adequately training employees on protecting privacy; and by having monitoring mechanisms in place to ensure that policies are working.

Since my appointment as Privacy Commissioner in June 2022, I have set my vision for privacy as one that reflects the reality that Canadians want to be active and informed digital citizens, able to fully participate in society and the economy, without having to choose between this participation and their fundamental privacy rights.

My vision has three pillars, which are:

  1. Privacy is a fundamental right;
  2. Privacy supports the public interest and Canada’s innovation and competitiveness; and
  3. Privacy accelerates the trust that Canadians have in their institutions and in their participation as digital citizens.

We know that privacy matters to Canadians. In our last survey of Canadians, 93% of them expressed some level of concern about the protection of their privacy rights. They want and need to trust that their privacy rights are being protected so that they can feel confident about participating freely in the digital economy.

We also know that organizations in both the public and private sectors are having to adapt to the scale and pace of technological advancements that we are seeing and are working hard to operate and innovate in a manner that protects the privacy of Canadians, their customers and their clients.

So today, I would like to talk about the importance of protecting privacy and the work that my Office is doing domestically and internationally in that regard. As well, I will discuss how we help guide and support organizations as they comply with applicable privacy laws now and in the future, and why it is not only necessary, but also a smart investment to make.

Law reform

This fall, I had the opportunity to present my views on Bill C-27, the Digital Charter Implementation Act, before the House of Commons Standing Committee on Industry and Technology, or INDU. The Bill includes the Consumer Privacy Protection Act, or CPPA, which would essentially replace PIPEDA.

It addresses concerns that were previously raised by my Office and others. For example, it requires that information used to obtain consent be in understandable language; it provides my Office with order-making powers; and it includes an expanded list of contraventions to which administrative monetary penalties may apply in cases of violations.

Overall, the Bill is a step in the right direction but as I have said, it can and must go further to protect fundamental privacy rights.

Last month, in a letter to INDU, the Minister of Innovation, Science and Industry proposed certain amendments to the Bill that would address some of the 15 key recommendations that were put forward by my Office to strengthen the legislation.

These include explicitly recognizing privacy as a fundamental right; strengthening privacy protections for young people; and providing more flexibility for my Office to use compliance agreements to correct privacy behaviours, including through the use of financial penalties.

The Minister also stated that he would propose amendments allowing for greater cooperation between regulators.

The Bill currently expands my ability to collaborate with domestic organizations to a limited few: provincial and territorial information and privacy commissioners, the CRTC and the Competition Bureau.

I believe that it could be expanded even further to include other regulators, including perhaps those from the financial sector.

Flexibility to work with other regulators would be valuable where the conduct in question falls within the scope of multiple jurisdictions and would also be consistent with our current ability to cooperate with international partners.

This collaboration would help reduce costs and duplicative efforts for regulators and organizations alike, and help avoid conflicting or inconsistent outcomes, which can make compliance difficult for organizations, and increase risks for consumers.

My Office has provided the government with several recommendations that may interest you.

We have called on the government to strengthen the framework for de-identified information which, I understand, runs counter to what the Canadian Bankers Association, for instance, is proposing.

In our view, technical and administrative measures used to de-identify data must be proportionate to the sensitivity of the data and the risk that it might be reidentified, and as you know, financial data is generally considered highly sensitive.

We also recommended that an organization’s purposes for collecting, using or disclosing information be specific and explicit.

Generative AI and Privacy

Of course, Bill C-27 also introduces the Artificial Intelligence and Data Act, or AIDA. The adoption of AIDA could make Canada one of the first countries to regulate AI, which is important given the technology’s potential risks. Although AIDA does not specifically address privacy risks, the Consumer Privacy Protection Act would apply to the processing of personal information within AI systems, and I have recommended ways to improve this.

Among them is a recommendation that organizations be required to conduct Privacy Impact Assessments to ensure that privacy risks are identified and mitigated for high-risk activities.

Given the concerns around how AI systems reach decisions, as well as issues of fairness, accuracy, bias and discrimination, organizations should also be required to explain, on request, all predictions, recommendations, decisions and profiling made using automated decision systems.

Such decisions can have a profound impact on individuals’ lives and Canadians should have the right to request an explanation if they find themselves on the receiving end of an automated decision.

We have seen this in the financial sector, for example, with respect to the granting of loans. When the stakes are life changing, human intervention is imperative.

No AI technology has captured the public interest over the past year quite so much as generative AI.

In September, ISED launched a voluntary code of conduct on the responsible development and management of advanced generative AI systems, setting out risk identification and mitigation measures that should be applied in advance of binding regulation.

More than a dozen companies and organizations have signed on to the voluntary code, including BlackBerry, OpenText, Telus and the Council of Canadian Innovators, which represents more than 100 start-up companies across Canada, including a number of FINTECHs.

The Code states that it does not change existing legal obligations that organizations may have, for example under PIPEDA.

In a recent OECD report on generative AI, threats to privacy were among the top three risks to achieving national and regional goals identified by G7 members.

Given all the privacy implications, it will be important to ensure that there is a formal mechanism for my Office to be consulted in the drafting of AI regulations and that we are fully integrated into Canada’s AI regulatory framework.

Earlier this year, my Office also launched a joint investigation with several provinces into OpenAI, the company behind ChatGPT, to determine whether the organization’s practices comply with Canadian privacy law.

That investigation is ongoing, along with our efforts to promote the use of privacy-enhancing technologies, and collaborate with national and international partners to advance privacy best practices with regard to generative AI.

This past June, in a joint statement issued in Tokyo with G7 data protection and privacy authorities, we called on developers and providers of generative AI to embed privacy in the design, conception, operation, and management of their new products and services.

We also urged companies to consider globally recognized privacy principles such as data minimization, data quality, purpose specification, use limitation, security safeguards, transparency, rights for data subjects including the right to be informed about the collection and the use of their personal data, and accountability.

This was followed by a G7 Leaders’ Statement in October, which included many of these principles within a set of Guiding Principles and a Code of Conduct for organizations developing advanced AI systems, such as generative AI foundation models. Our G7 data protection authorities’ joint statement was also referenced in the voluntary AI code of conduct that was recently unveiled by Minister Champagne.

Continuing our international collaboration, my Office will be hosting a meeting of the Global Privacy Assembly’s International Working Group on Data Protection Technologies alongside my colleague from the German DPA in Ottawa two weeks from now.

Open Banking

We also recognize that C-27 will allow for data mobility, though it leaves much of the details to the regulations.

Data mobility will certainly facilitate open banking. I know that last week dozens of fintech leaders wrote to the Finance Minister calling for the immediate adoption of open banking, which was subsequently addressed in yesterday’s Fall Economic Statement. My Office supports open banking, and we look forward to reviewing the particulars of the initiative announced yesterday when the budget bill is tabled.

On the subject of open banking in Canada, to date we have recommended it be built upon a foundation that includes respect for privacy and other fundamental rights at its core.

For instance, individuals should have the right to control their information, including to meaningfully consent to how their information is used and to whom it is disclosed.

Providing Canadians with this control will help to foster trust and assure them that they are not the “product” in open banking.

To ensure consistent ground rules for open banking, we recommend the development of technical and privacy standards, and have indicated to Parliament that we would be pleased to provide privacy expertise to support the development of Canadian standards.

To ensure responsible collection, use and disclosure of sensitive financial information, we further recommend that financial institutions and FinTechs be required to analyze and document privacy risks and plans to mitigate them.

Lastly, I note that public trust in our federal and financial institutions is essential to the success of an open banking system.

A strong regulator is essential to building that trust. Any changes in financial policy and legislation will require concurrent updating of Canada’s privacy laws.

Open banking is also a good example of an area where the ability to cooperate and share information with other regulators, such as the Competition Bureau, would be helpful.

Anti-money laundering

I also wanted to touch upon efforts to strengthen Canada’s anti-money laundering regime which, I see, is the subject of a couple of panels during this conference. Again, yesterday’s Fall Economic Statement also included plans to amend legislation in this area which I will be taking a closer look at in due course.

My Office has a mandate to conduct a review of FINTRAC every two years. Last month, I submitted my views to the Department of Finance as part of its consultation aimed at informing its upcoming statutory review of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act.

I was pleased that the consultation paper identified privacy protection as a foundational aspect of the regime.

The Consultation Paper proposes a “keep open regime” whereby financial institutions could be ordered, or provided with the authority, to keep a personal financial account open at the request of law enforcement.

We recommend that the government carefully consider the potential for adverse consequences for individual account holders.

Should Canada adopt a “keep open” regime, it is critical that privacy safeguards, such as use limitations, time limits, record keeping and reporting requirements, be incorporated.

The consultation paper also discusses options, such as a safe-harbour provision, to enhance information sharing within the private sector to identify and disrupt money laundering and terrorist financing activities.

We have mentioned to the government that PIPEDA already contains a legal framework for the collection, use and disclosure of personal information without knowledge or consent in instances related to fraud or to investigations.

We also encourage the government to explore implementing guardrails and safeguards such as those in other jurisdictions that have adopted a safe-harbour provision. We further encourage the government to consult our Office and conduct PIAs.

With regard to suspicious transaction reports, my Office has raised concerns in previous audits about over-reporting, for instance by financial institutions that would rather over-report than face potential sanctions for under-reporting.

FINTRAC recently reported that out of 24.7 million records received in a fiscal year, only 2,015 actionable disclosures took place, yet all records were retained, in some instances for up to 15 years.

That raises privacy risks and questions about proportionality. Once information is analyzed and leads to the conclusion that someone is not a threat, it should no longer be retained.

Efforts should also be made to improve communication between FINTRAC and reporting entities to reduce the volume of unfounded STRs.

Finally, I understand that banks and financial services spend billions annually to fulfill their anti-money laundering requirements and that they are always looking at new ways to detect such criminal activity.

The consultation report floats the idea of regulatory sandboxes where financial institutions can experiment with new technologies.

Our position has been to stress that privacy and data protection laws remain wholly at play given that these are fundamental rights, and that experimentation needs to be tightly circumscribed.

We invite FINTRAC to leverage our expertise as this initiative takes shape. We also remind reporting entities in the financial sector that are considering the use of AI that they must ensure individuals can exercise their rights to access their personal information, rectify inaccurate personal information, and refuse to be subject to solely automated decisions with significant effects.

Breaches and the RROSH tool

On the subject of leveraging our expertise, my Office has many resources available on our website to help organizations such as yours comply with federal privacy law, and our Business Advisory Directorate may be a resource for specific compliance questions.

On that note, I would like to shift my remarks to compliance and discuss breaches and some key lessons learned from recent investigations.

Breaches of personal information are a perennial problem. Of the nearly 700 breaches reported to my Office last year, more than a quarter of them – 27% – involved the financial sector.

The telecommunications, professional services, sales and retail and insurance sectors round out the top 5 sectors with the largest number of breach reports.

Last year, nearly 760,000 Canadian accounts were affected by breaches that were reported to our Office by the financial sector and 55 of the breaches were bona fide cyber-attacks initiated through malware, compromised credentials or phishing schemes.

The financial sector also suffered 56 breaches related to unauthorized disclosure and a dozen each for loss and theft of personal information.

Besides the human cost – reputational harm, financial loss and identity theft – IBM estimates the current average financial cost of a data breach to be around $4.45 million US.

We know that personal data is extremely valuable, and we know that financial data is an especially attractive target for bad actors.

Organizations must make security a priority by enhancing protections for employee credentials, applying security patches as they become available, requiring two-factor or multi-factor authentication, properly training employees, having a robust privacy management program and investing in cybersecurity to prevent unauthorized access.

That being said, we also know that companies can do everything right and still suffer a privacy breach. Breaches are not synonymous with non-compliance and breach reporting does not necessarily result in an investigation by my Office.

Businesses subject to PIPEDA – which generally include financial institutions and insurance companies – are required to report all privacy breaches where it is reasonable to believe that the breach poses a real risk of significant harm.

Organizations are also required to notify affected individuals. Organizations that knowingly fail to meet their breach reporting or notification obligations can face a financial penalty.

My Office has created guidance to help organizations assess real risk of significant harm which is a broad concept that includes: bodily harm, humiliation, damage to reputation or relationships, loss of employment, financial loss, identity theft, negative effects on credit record and damage or loss of property.

In determining whether there is a real risk of significant harm, it is necessary to consider the degree of sensitivity of the personal information involved and the probability that the information has been, is being, or will be misused.

We have also developed a desktop app to guide risk assessments by asking a series of questions to help determine whether it is reasonable to believe that a privacy breach creates a risk of significant harm. It does not replace human judgment, but it does provide data to inform that judgment.

The first phase of the tool was launched internally in March 2022. In a recent pilot project, 20 organizations, including 5 from the financial sector, were given access to try the tool and provide feedback. The feedback has been very positive and will inform both the in-house version and a future public-facing version of the tool.

Lessons learned from recent cases

Lastly, before I conclude and take your questions, I would like to touch on a few lessons learned from recent cases.

I noticed a session later today on identifying and reducing third-party risks. Our recent investigation into Home Depot has some important takeaways in this area.

Home Depot

Our investigation found that Home Depot was collecting customer email addresses at store checkouts for the stated purpose of providing them with an electronic copy of their receipt.

However, unbeknownst to most customers, the retailer was also sharing those email addresses in a coded format, along with high-level details about each customer’s in-store purchases, with Facebook via a tool called “Offline Conversions.” Facebook used this information to measure the effectiveness of the ads it delivered for Home Depot, but also for its own business purposes, such as profiling and targeting.

As businesses increasingly look to deliver services electronically, they must carefully consider any consequential uses of personal information, which may require additional consent.

In this case, customers would not have reasonably expected that their personal information would be shared with a third-party social media platform.

Consumers need clear information at key transaction points so that they can make informed decisions about how their personal information is being used and provide meaningful consent. Consent fatigue, an excuse that was raised, is not a valid reason for failing to obtain meaningful consent.

BMO Breach

This one goes back a few years but offers some good lessons learned.

The Bank of Montreal was the victim of a series of sophisticated cyber-attacks by malicious actors who identified and exploited a security vulnerability in the company’s online banking software to exfiltrate personal information of more than 110,000 customers.

Beyond certain technical safeguarding gaps, we identified two general lessons that I would like to share.

First, the online banking application had been developed in-house. The vulnerability was unintentionally built in and was not caught due to gaps in the bank’s development and testing cycle.

A thorough development cycle and testing protocols are critical to finding weaknesses in software before it goes live.

Second, the bank did not discover the breach until it received a ransom demand months after the attack had begun.

While it had detected suspicious activity months earlier and moved rapidly to mitigate financial harms by securing financial accounts, it did not conduct a thorough review of the attack at that time, which would have revealed the ongoing data exfiltration. For financial institutions in particular, the valuable data that you hold, and not just money, can be a prime target for malicious actors.


Conclusion

Privacy matters to Canadians more today than ever before.

The more that individuals trust that their information is being protected, the more confident that they will feel participating freely in the digital economy.

Resources spent on protecting and promoting privacy are investments in trust that will pay dividends as we work towards a culture of privacy.

I look forward to continuing to find ways that we can work together so that Canadians can benefit from the many conveniences that technology affords without having to sacrifice their own personal privacy.

Canada can be an innovation hub and a model of good government that demonstrates the value and benefits of protecting the personal information of Canadians.

I wish you all a wonderful rest of the conference and would be pleased to take a few questions.