Projecting our values into laws

Laying the foundation for responsible innovation

2020-2021 Annual Report to Parliament on the Privacy Act and the Personal Information Protection and Electronic Documents Act

Office of the Privacy Commissioner of Canada
30 Victoria Street
Gatineau, Quebec K1A 1H3

© Her Majesty the Queen in Right of Canada for the Office of the Privacy Commissioner of Canada, 2021
ISSN 1913-3367


Letter to the Speaker of the Senate

December 9, 2021

The Honourable George J. Furey, Senator
The Speaker
Senate of Canada
Ottawa, Ontario
K1A 0A4

 

Dear Mr. Speaker:

I have the honour to submit to Parliament the Annual Report of the Office of the Privacy Commissioner of Canada for the period from April 1, 2020, to March 31, 2021. This report is tabled pursuant to section 38 of the Privacy Act and section 25 of the Personal Information Protection and Electronic Documents Act. It also includes the Report of the Office of the Privacy Commissioner of Canada entitled Review of the Financial Transactions and Reports Analysis Centre of Canada, which is tabled pursuant to subsection 72(2) of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act.

Sincerely,

Original signed by

Daniel Therrien
Commissioner


Letter to the Speaker of the House of Commons

December 9, 2021

The Honourable Anthony Rota, M.P.
The Speaker
House of Commons
Ottawa, Ontario
K1A 0A6

 

Dear Mr. Speaker:

I have the honour to submit to Parliament the Annual Report of the Office of the Privacy Commissioner of Canada for the period from April 1, 2020, to March 31, 2021. This report is tabled pursuant to section 38 of the Privacy Act and section 25 of the Personal Information Protection and Electronic Documents Act. It also includes the Report of the Office of the Privacy Commissioner of Canada entitled Review of the Financial Transactions and Reports Analysis Centre of Canada, which is tabled pursuant to subsection 72(2) of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act.

Sincerely,

Original signed by

Daniel Therrien
Commissioner


Commissioner’s message

Daniel Therrien

Annual Reports present an opportunity to take stock of the past year, and to highlight both progress made and challenges that persist within the broader context of what we have been trying to achieve.

Over the course of my mandate, it has become increasingly clear that we need a stronger privacy framework to protect the rights of Canadians in an increasingly digital world. This would allow Canadians to safely participate in the digital economy and confidently embrace new technologies.

I wish I could begin this Annual Report to Parliament by stating that this has been achieved, with contemporary privacy laws fit for the digital age firmly in place to adequately protect Canadians. Unfortunately, we are not there yet, although I can report that on some issues we did make important progress.

There is no person or industry that was not affected by the pandemic in some way. At the Office of the Privacy Commissioner of Canada (OPC), we adapted smoothly to work-from-home and maintained all our services. Several of the privacy issues we tackled involved initiatives related to COVID-19.

Even as we remained in the midst of the pandemic, 2020 marked a milestone for privacy law in Canada.

Following calls to action over the course of many years by my office, industry stakeholders and civil society, the government finally tabled Bill C-11 to overhaul Canada’s federal private sector privacy law. It also put forward a comprehensive public consultation document aimed at modernizing our 40-year-old public sector law. While the proposed reforms did not become law before the election was called in August, I welcome what is hopefully only a brief pause as an opportunity for reflection. On the private sector side in particular, significant amendments were needed to ensure the rights and values of Canadians are adequately protected. It is my hope that in the coming months we will see the introduction of a revised private sector law as well as legislative proposals to update the public sector act.

I was deeply concerned that Bill C-11, which died on the Order Paper when the election was called, would be a step backwards. With the opening of a new Parliament, I am hopeful that the government will make some of the changes we have proposed, and my office looks forward to working with it on legislative reform.

Although the future of privacy protection in Canada remains unsettled and the issues we are facing are complex, I believe the road ahead is quite clear.

As a society we must project our values into the laws that regulate the digital space. Our citizens expect nothing less from their public institutions. It is on this condition that confidence in the digital economy, damaged by numerous scandals, will return.

Evolution of privacy in recent years

To know where we are going, it is useful to remember where we have been and to look at trends we are seeing, for example, in terms of state surveillance, surveillance capitalism and public-private partnerships.

There’s no doubt the privacy landscape has shifted since I began my mandate as Commissioner in 2014. Top of mind back then were the Edward Snowden revelations about how the national security practices of governments could intrude into the lives of ordinary citizens. “Privacy is dead” was also a familiar refrain at that time.

Today, the privacy conversation is dominated by the growing power of tech giants like Facebook and Google, which seem to know more about us than we know about ourselves. Terms like surveillance capitalism and the surveillance economy have become part of the dialogue.

After 9/11, Canada and its allies enacted many laws and initiatives in the name of national security. Some of these laws went too far. The Arar inquiry, Snowden affair and reports involving metadata collection by the Communications Security Establishment (CSE) and the Canadian Security Intelligence Service (CSIS) reminded us that clear safeguards are needed to protect rights and prevent abuse.

Fortunately, after healthy democratic discussions, we have seen improvements. New oversight bodies were introduced, as were amendments to national security legislation to ensure more appropriate thresholds are in place before personal information can be collected or shared between security agencies or law enforcement bodies.

But while we have seen state surveillance modulated to some extent, the threat of surveillance capitalism has taken centre stage. Personal data has emerged as a dominant and valuable asset and no one has leveraged it better than the tech giants behind our web searches and social media accounts.

The risks of surveillance capitalism were on full display in the Facebook/Cambridge Analytica scandal, which is now the subject of proceedings in Federal Court because my office did not have the power to order Facebook to comply with our findings and recommendations, nor to issue financial penalties to dissuade this kind of corporate behaviour.

The newest frontier of surveillance capitalism is artificial intelligence (AI). AI has immense promise in addressing some of today’s most pressing issues, but must be implemented in ways that respect privacy, equality and other human rights. Our office’s investigation of Clearview AI’s use of facial recognition technology was an example of a commercial AI deployment that fell significantly short of what privacy laws require.

Digital technologies like AI that rely on the collection and analysis of personal data are at the heart of the fourth industrial revolution and are key to our socio-economic development. However, they pose major risks to rights and values.

To draw value from data, the law should accommodate new, unforeseen, but responsible uses of information for the public good. But, due to the frequently demonstrated violations of human rights, this additional flexibility should come within a rights-based framework.

Another trend we are seeing is an increase in public-private partnerships and the use of corporate expertise to assist the functioning of the state. Our investigations of Clearview AI and the Royal Canadian Mounted Police (RCMP) this past year are prime examples. Clearview AI violated the private sector privacy law by creating a databank of billions of images scraped from the internet without consent in order to drive its commercial facial recognition software. Our Special Report to Parliament tabled last June highlighted that a federal government institution, such as the RCMP, cannot collect personal information from a third-party agent like Clearview AI if that third-party agent collected the information unlawfully.

Privacy issues arising from public-private partnerships were also evident in a number of government-led pandemic initiatives involving digital technologies this past year. These issues underscored the need for more consistency across the public and private sector laws.

Progress on privacy priorities

While the privacy landscape has certainly evolved in the past few years, the strategic priorities we identified back in 2015, following significant public consultations, have remained relevant and useful in terms of guiding and focusing our efforts.

The 4 priorities we identified were:

  • The economics of personal information;
  • Government surveillance;
  • Reputation and privacy; and
  • The body as information.

Equally important were the 5 strategies we said we would employ to advance them: exploring innovative and technological ways of protecting privacy; strengthening accountability and promoting good privacy governance; protecting Canadians’ privacy in a borderless world; enhancing our public education role; and enhancing privacy protection for vulnerable groups.

The vision behind the priorities was an admittedly ambitious goal in the age of big data: to increase the control Canadians have over their personal information.

We wanted to better understand privacy in each of these areas, and in turn, to inform organizations and the public of the issues at stake, to influence behaviour and to use our regulatory powers most effectively.

Several years on, I remain convinced that our identified priorities were the right ones for these times when massive amounts of personal information are being collected, and powerful algorithms are being used to detect patterns for purposes that range from marketing to national security.

My office can be proud of the important work completed under each of our priority banners. We have made some progress – more so on some priorities than others.

Ultimately, our efforts were limited by the tools at our disposal. In each of the 4 areas of strategic focus, we saw again and again that the real solution to protecting rights is through the modernization of federal privacy laws.

Below, we offer some examples of what we have advanced in each area.

Strategic privacy priority: The economics of personal information

Our goal under the economics of personal information was to enhance the privacy protection and trust of individuals so that they may confidently participate in an innovative digital economy. The centrepiece of our work in this area was our Guidelines for obtaining meaningful consent.

We have also produced guidance on “no go zones” for the collection, use and disclosure of personal information. In it, we outline a number of practices that would be considered “inappropriate” by a reasonable person, including personal information practices that involve profiling or categorization that leads to unfair, unethical or discriminatory treatment, contrary to human rights law.

We have also conducted investigations that address this strategic priority. While these cases have helped to raise awareness of privacy issues, they have also demonstrated the severe limitations of existing laws.

Case in point: our 2019 investigation into the Facebook/Cambridge Analytica scandal. The investigation found Facebook, despite its detailed privacy policies, had failed to obtain meaningful consent and failed to take responsibility for protecting the personal information of Canadians. Despite its public acknowledgement of a “major breach of trust,” Facebook disputed the findings and refused to implement recommendations to address deficiencies.

This case demonstrates the weakness of the current law in forcing companies to be accountable and makes plain that Canadians cannot rely exclusively on companies to manage their information responsibly.

Our investigation of technology company Clearview AI also revealed some of the shortcomings in our law.

We, along with 3 provincial counterparts, found that Clearview AI’s scraping of billions of images of people from across the Internet represented mass surveillance and was a clear violation of the privacy rights of Canadians.

The investigation found that Clearview AI had collected highly sensitive biometric information without the knowledge or consent of individuals. Furthermore, Clearview AI collected, used and disclosed Canadians’ personal information for inappropriate purposes, which cannot be rendered appropriate via consent.

Despite our findings, the company continued to claim its purposes were appropriate, citing the requirement under federal privacy law that its business needs be balanced against privacy rights. We have urged Parliamentarians to ensure a new federal law stipulates that, where there is a conflict between commercial objectives and privacy protection, Canadians’ privacy rights should prevail.

Our goal was to increase consumer confidence; however, it is clear that we continue to see a crisis of trust. Polling tells us that the vast majority of Canadians are very concerned about their inability to protect their privacy, and trust levels remain low.

Canadians want to enjoy the benefits of digital technologies, but they want to do it safely. To make real progress, we need to improve our privacy laws. It is the role of government, and Parliament, to give Canadians the assurance that legislation will protect their rights.

This past year, with significant research and submissions to government and Parliament on both the public and private sector laws, we strongly advocated for better legislation to protect Canadians.

Strategic privacy priority: Government surveillance

Our goal for this priority was to contribute to the adoption and implementation of laws and other measures that demonstrably protect both national security and privacy.

As mentioned earlier, government surveillance occupied much of our energy in the early part of my mandate. We have provided advice to government aimed at achieving national security and public safety with measures that appropriately protect privacy rights.

We also called for greater oversight and were pleased to see Bill C-22 introduce the National Security and Intelligence Committee of Parliamentarians, and Bill C-59 create a new expert national security oversight body, the National Security and Intelligence Review Agency (NSIRA). Since its inception, we have regularly engaged with NSIRA, for example, on a collaborative review under the Security of Canada Information Disclosure Act (SCIDA), which is anticipated to conclude with a report to be tabled later this year.

We have advised the RCMP on its use of drones and body-worn cameras, and we provided advice to the Canada Border Services Agency (CBSA) and the public on electronic device searches at border crossings.

This past year, and as detailed in the following pages, we saw a welcome rise in requests for consultations on national security and public safety issues. We continue to advise the government on these matters, for example in relation to the Passenger Protect Program.

Also, when it comes to government surveillance, we made a series of recommendations to Statistics Canada following our investigation into its plans to collect highly detailed financial information about individuals.

The initiatives involved the collection of credit histories and the proposed mass collection of line-by-line financial transaction information from banks without the knowledge or consent of affected individuals.

We found the program’s design raised significant privacy concerns and illustrated the interdependence of privacy, trust, and social acceptance. Importantly, it highlighted the inadequacy of existing legislation insofar as it lacks the necessity and proportionality standard that exists in other privacy laws around the world.

Necessity and proportionality means organizations should only pursue privacy-invasive activities and programs where it is demonstrated that they are necessary to achieve a pressing and substantial purpose and where the intrusion is proportional to the benefits to be gained.

During the investigation, Statistics Canada officials spoke about their objectives, but did not demonstrate the necessity of collecting so much highly sensitive information about millions of Canadians.

While we welcomed the agency’s willingness to redesign the initiatives to respect the principles of necessity and proportionality, this is not currently a legal requirement. We have also called for amendments to the Privacy Act to expressly include a necessity and proportionality requirement for the collection of personal information.

Strategic privacy priority: Reputation and privacy

Our intention with this priority was to help create an online environment where individuals may use the Internet to explore their interests and develop as persons without fear that their digital trace will lead to unfair treatment.

A key ongoing initiative related to this priority is our Draft Position on Online Reputation, which we developed after a consultation and call for essays from various stakeholders. The Draft Position set out the OPC’s preliminary view on existing protections in our private-sector privacy law, which include the right to ask search engines to de-index web pages that contain inaccurate, incomplete or outdated information, and the right to have information removed at the source. The Draft Position also emphasized the importance of education to help develop responsible, informed online citizens.

In 2018, we filed a Reference with the Federal Court seeking clarity on whether Google’s search engine service is subject to federal privacy law when it indexes web pages and presents results in response to a search for a person’s name. We asked the Court to consider the issue in the context of a complaint involving an individual who alleged Google was contravening the Personal Information Protection and Electronic Documents Act (PIPEDA) by prominently displaying links to online news articles about him when his name was searched.

The Federal Court issued its decision on the merits of the Reference questions in July 2021. We welcomed the Court’s decision, which aligned with our position that Google’s search engine service is collecting, using, and disclosing personal information in the course of commercial activities, and is not exempt from PIPEDA under the journalistic exemption.

Since it is ultimately up to elected officials to confirm the right balance between privacy and freedom of expression in our democratic society, our preference would be for Parliament to clarify the law in this regard. For the time being, in the absence of such clarification in the legislation, we will continue our investigations.

Strategic privacy priority: The body as information

With this priority, we wanted to promote respect for the privacy and integrity of the human body as the vessel of our most intimate personal information.

We have issued draft guidance for police on the use of facial recognition technology as well as guidance for individuals on smart devices and wearable technologies. In cooperation with our international counterparts as part of the Global Privacy Enforcement Network, we also conducted a sweep of Internet-connected health devices, such as fitness trackers and sleep monitors, and found many such devices fall short on privacy.

We have provided advice to individuals and developed guidance for businesses on direct-to-consumer genetic testing. We also intervened before the Supreme Court of Canada to defend the position that individuals should not be compelled to disclose their genetic test results to an employer, an insurance company or any other business, and we welcomed the Court’s decision to uphold the constitutionality of the Genetic Non-Discrimination Act.

Without question, the global pandemic has also shone a light on this priority. Among other activities, over the past year we advised government on and monitored the implementation of various contact-tracing initiatives, including the COVID Alert app. Along with our provincial and territorial counterparts, we offered privacy advice on vaccine passports.

As mentioned, we also investigated and prepared guidance on the use of facial recognition technology. Our investigations into Clearview AI and the RCMP’s use of the company’s facial recognition technology demonstrate that significant gaps remain in appropriately protecting this highly sensitive information. At present, the use of facial recognition technology is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances. The path forward is uncertain but as part of our ongoing consultation on guidance for police use of facial recognition, we have been engaging with police, civil society and other stakeholders on important questions surrounding the use of this technology, including whether the current legal framework needs to change.

Next steps

The strategic priorities that have helped to guide our work remain extremely relevant today. We have made some headway, but our ultimate objective of restoring Canadians’ trust in government and the digital economy remains elusive. Indeed, that goal will remain out of reach until the government enacts new federal laws that appropriately protect privacy rights in Canada.

I remain fully committed to ensuring we achieve this urgent goal.

2020-21, a year of collaboration

As we compiled the content for this Annual Report, I was struck by the degree of collaboration reflected in its pages and I think this is an area where we made important progress.

Collaboration is critical to increasing privacy awareness and compliance. Over the past year, we were increasingly proactive in reaching out to, and bridging relationships with, federal institutions, our provincial, territorial and international privacy counterparts, the business community, civil society and individuals.

In the last year alone, we provided advice to government institutions through advisory consultations more than 130 times, a significant increase over past years.

We also provided advice to businesses, for example, through 13 advisory engagements. Our consultations with businesses have demonstrated that privacy is not an impediment to public health, other government objectives or business interests. The key is good design, which ensures that all of these interests can be achieved concurrently.

On the investigative side, we collaborated with our provincial colleagues on an unprecedented number of joint investigations, including our first-ever joint investigation with all provinces with substantially similar legislation. We also jointly produced guidance on several important topics, including vaccine passports and facial recognition.

We also saw a high degree of collaboration on the international front. We continue to lead in the area of international enforcement collaboration, including in our co-chair roles for the Global Privacy Assembly’s International Enforcement Cooperation Working Group and its Digital Citizen and Consumer Working Group, the latter focused on fostering greater cross-regulatory cooperation across the privacy, competition and consumer protection spheres. More detail on our international work and its impact is found later in this report.

Meanwhile, in conjunction with Innovation, Science and Economic Development Canada, the Canadian Radio-television and Telecommunications Commission and the Competition Bureau, we issued advice to the mobile app industry in relation to their obligations under anti-spam legislation. As mentioned earlier, we also had an opportunity to work collaboratively on issues of mutual interest with the NSIRA.

As I noted earlier, protecting privacy in a borderless world was one of the approaches we recognized as being essential to advancing privacy priorities for Canadians in a global economy. Ideally, maximizing collaborations would be facilitated by interoperable laws, domestically and internationally. This is an essential way forward and a trend that must not only continue, but grow.

What’s next

At the time of writing this report, my mandate as Privacy Commissioner had been extended for a year. This will be an important transition period, both for law reform and for the OPC as an institution. I appreciate having the opportunity to continue advising Parliament and working with the talented and deeply devoted OPC staff to prepare for a new era in privacy protection.

The plan for Privacy Act (the public sector privacy law) reform looks promising, and I hope to see a bill tabled soon in Parliament. Meanwhile, I am hopeful the government will seriously consider needed improvements to Bill C-11, so that we can see an updated private sector privacy law that more effectively achieves responsible innovation and the protection of rights.

Once new laws are in place, we anticipate becoming a substantially different Office of the Privacy Commissioner – notably one with greater enforcement powers and an enhanced role in developing guidance, approving codes of practice and working with public- and private-sector institutions towards greater respect for privacy rights.

While it will likely take some time, perhaps a few years, before the OPC exercises new duties, I think it is important that we start preparing as soon as possible, so that we are fully effective when new laws come into force. This planning exercise will be conducted with a view to being even more transparent and fair towards regulated entities and other stakeholders. We want to deepen our engagement activities. The role of a regulator like the OPC is first and foremost to help organizations comply with the law; and when enforcement is required, to exercise this authority quickly but fairly.

This is an important moment for the future of privacy protection in Canada and these are certainly exciting times for the OPC. I am proud of the work we have done to get us to this pivotal point, but there is much more that needs to be done. I look forward to ongoing collaboration with our many partners in the year to come, to continuing to advocate for Canadians’ privacy rights and responsible innovation, and to helping lay the foundation for what comes next.

Legislative reform: for effective privacy protection, responsible innovation and strengthened consumer trust

The end of 2020 marked a pivotal moment for privacy law in Canada.

In rapid succession, the federal government unveiled both Bill C-11, which sought to overhaul the federal private sector law, and a comprehensive public consultation laying out a plan for modernizing Canada’s nearly 40-year-old public sector law.

These developments followed years of persistent calls for action from our office, as well as stakeholders from across industry and civil society. It has been clear for many years that Canada’s 2 federal privacy laws are not suited to the task of protecting privacy rights in a digital world. We need to better protect Canadians at a time when their confidence in the digital economy is needed to fuel a post-pandemic economic recovery.

The government’s plans for a way forward hit the mark on some fronts, but not others. We were quite pleased with the Privacy Act public sector reform proposals contained in a Department of Justice discussion paper. The same could not be said of Bill C-11, which died on the Order Paper when a federal election was called in August 2021. As imperfect as that bill was, we believe it is possible to bring substantial improvements to the legislation within its existing structure and without having to start over from scratch.

This chapter outlines some of our office’s main concerns and recommendations related to Bill C-11, as well as our substantially more positive submission to the Department of Justice consultation on reform of the Privacy Act.

It also provides an overview of our consultation on regulating artificial intelligence (AI), a key question related to both federal laws, as AI technology holds immense promise but can also have serious consequences for privacy. Further, it discusses a clear gap in the legislative framework for Canadians’ privacy protection, in light of a complaint against 3 federal political parties.

Finally, we provide a summary of our submission to a government review of the Access to Information Act.

Bill C-11 needs significant amendments

In November 2020, the Government of Canada tabled Bill C-11, an important and concrete step toward modernizing our federal private sector legislation. The bill, which died on the Order Paper when a federal election was called in August, would have enacted the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act.

After several months of internal analysis by the OPC, in May 2021, Commissioner Therrien shared his submission and research on the legislative proposal with the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

He noted to Parliament that the bill was, in a number of ways, misaligned with and less protective than the laws of other jurisdictions, and he called it a step backward overall. Bill C-11 did not include the privacy protective measures that exist in other countries with similar economies, or even in the privacy laws of some provinces in Canada.

Of particular concern, the bill, if adopted, would have given consumers less control and organizations more flexibility to monetize personal data, without increasing their accountability. Meanwhile, the proposed penalty scheme was unjustifiably narrow and protracted.

The Commissioner also noted that the OPC would be subject to several new constraints and limitations, when in fact the OPC needs more flexible tools to achieve its mandate in a global, complex and rapidly evolving environment.

He stressed that privacy is not an impediment to economic innovation or technological adaptation. On the contrary, data protection legislation that effectively safeguards privacy and ensures trust can contribute to economic growth. Modernized effective legal regimes achieve this by providing consumers with confidence that their rights are respected. Many countries with strong privacy laws are also leaders in innovation.

The submission set out 60 recommendations aimed at enhancing the bill’s privacy protections for Canadians while enabling responsible innovation for businesses. The recommendations were grouped under 3 over-arching themes:

  • Achieving a more appropriate weighting of privacy rights and commercial interests;
  • Establishing specific rights and obligations; and
  • Ensuring access to quick and effective remedies and the role of the OPC.

Jurisdictional Comparison: Privacy protections

Figure 1: Jurisdictional comparison: Privacy protections

| | European Union (GDPR) | United Kingdom | New Zealand | Australia | California | Alberta | British Columbia | Quebec | Canada (Bill C-11) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Coming into force/last major update | 2018 | 2018 | 2020 | 2018 | 2020 | 2014 | 2004 | 2021 | 2020 (introduced) |
| Defining privacy as a human right | Yes | Yes | Yes | Yes | No | No | No | Yes | No |
| Individual knowledge and understanding | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | No |
| Accountability: compliance with the law as objective standard | Yes | Yes | Yes | Yes | N/A | Yes | Yes | Yes | No |
| Audit: proactive to verify compliance*,** | Yes | Yes | No | Yes | Yes | Yes | No | Yes | No |
| Administrative monetary penalties: broad list of violations | Yes | Yes | N/A | Yes | Yes | N/A | N/A | Yes | No |
| Absence of appeal before privacy-specific tribunal | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | No |
| Broad discretion to decline/discontinue complaints*,** | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | No |
| Full discretion for public education and guidance | Yes | No | Yes | Yes | Yes | Yes | Yes | N/A | No |
| Codes approval: under DPA procedures | Yes | Yes | Yes | Yes | N/A | No | N/A | N/A | No |
| Trans-border: specific provisions | Yes | Yes | Yes | Yes | No | Yes | No | Yes | No |

* MP Nathaniel Erskine-Smith introduced Bill C-413 (42nd Parliament) to provide the OPC with these authorities.
** Proposed by Justice Canada in Respect, Accountability, Adaptability: A discussion paper on the modernization of the Privacy Act (November 2020).

Key issues to consider when designing a modern law

Digital technologies that rely on the collection and analysis of personal information are at the heart of the fourth industrial revolution and are key to our socio-economic development. They can serve the public interest, if properly designed.

But they can and have also disrupted rights and values that have been centuries in the making: fundamental rights such as privacy, equality and democracy.

In our view, if Canadian laws are to be fit for purpose in these modern times, they must address the following:

Issue 1: Defining permissible uses

The challenge is to define permissible uses of data so as to both enable responsible innovation and protect the rights and values of citizens.

Our private sector law, PIPEDA, is currently based on the consent model. Consent has its place in data protection when it is truly meaningful and when the business relationship between organizations and consumers is relatively straightforward, but it cannot be the only means of protecting privacy. In fact, consent can be used to legitimize uses that are objectively completely unreasonable and contrary to our rights and values. And a refusal to provide consent can sometimes be a disservice to the public interest.

Bill C-11 rightly introduced certain exceptions to consent, giving businesses greater flexibility in the processing of personal information. Unfortunately, some of the exceptions were too broad or ill-defined to foster responsible innovation.

As suggested in the recommendations we issued on artificial intelligence last fall, the law could authorize the use of data for legitimate business interests, but within a rights-based framework.

Such a provision would give considerable flexibility to use data for new purposes unforeseen at the time of collection, but would be based on the particular and knowable purposes being pursued by the organization, and subject to regulatory oversight.

What we need is not Bill C-11’s model of self-regulation (where vaguely worded legal standards are left to be clarified by commercial organizations and the role of regulators would be severely limited) but true regulation, meaning objective and knowable standards adopted democratically, enforced by democratically appointed institutions, like the Office of the Privacy Commissioner.

We need sensible legislation that allows responsible innovation that serves the public interest and is likely to foster trust, but that prohibits using technology in ways that are incompatible with our rights and values.

Issue 2: The need for a rights-based framework

The greater flexibility to use personal information without consent for responsible innovation and socially beneficial purposes should occur within a legal framework that would entrench privacy as a human right and as an essential element for the exercise of other fundamental rights.

Only a rights-based law can provide adequate protection, not from theoretical risks to privacy, but from the kind of actual harms we’ve seen time and time again in Canada and abroad.

Unfortunately, Bill C-11 assumed that privacy and commercial interests are competing interests and that a balance must be struck between the two. In fact, it arguably gave more weight to commercial interests than the current law by adding new commercial factors to be considered in the balance, without any reference to the lessons of the past 20 years on technology’s disruption of rights.

In our view, it would be normal and fair for commercial activities to be permitted within a rights framework, rather than placing rights and commercial interests on the same footing. Generally, it is possible to concurrently achieve both commercial objectives and privacy protection. However, where there is conflict, rights should prevail.

In adopting a rights-based approach, we would send a powerful message as to who we are and what we aspire to be as a country. This is the approach adopted in Quebec and the approach suggested in recent proposals for law reform by the Government of Ontario.

That being said, some have argued that a rights-based framework is not possible in a federal law in Canada due to our Constitution.

We agree that the principal basis for a federal private sector privacy law is Parliament’s jurisdiction over trade and commerce. However, if the law is in pith and substance about regulating trade and commerce, then it can include privacy protections, including privacy as a human right. Indeed, recent jurisprudence from the Supreme Court of Canada shows that a preamble would strengthen the constitutional footing of the legislation by identifying the purpose and background to the legislation.

Recognizing privacy as a human right is also fully compatible with our existing technology-neutral, principles-based data protection framework and it need not result in a law that is overly prescriptive.

The prescriptive nature of a law is often related to the level of detail associated with the definition of specific privacy principles. A rights-based framework operates at the same level of generality as a principles-based law. Neither is strictly prescriptive. They are both equally flexible and adaptable to regulate a rapidly changing environment such as the world of technology and the digital economy.

The bottom line is that commercial activities should be permitted within a rights framework, rather than placing rights and commercial interests on the same footing.

Our 2021 investigation into Clearview AI is a good example of what can go wrong when this important principle is ignored.

Clearview AI: Striking an appropriate balance between privacy rights and commercial interests

In February 2021, our office, along with 3 provincial counterparts, released the results of an investigation into technology company Clearview AI, which sells a facial recognition tool that allows law enforcement to match photographs of unknown people against a massive databank of 3 billion images, scraped from the Internet.

In Canada, this data was primarily used for policing purposes without the knowledge or consent of those involved. The result was that billions of people essentially found themselves in a police line-up. We concluded this represented mass surveillance and was a clear violation of PIPEDA.

Clearview AI put forward a series of arguments based on PIPEDA’s approach that privacy rights and commercial interests must be balanced against one another. It claimed that individuals who placed or permitted their images to be placed on the Internet lacked a reasonable expectation of privacy in their images, that the information was publicly available and that the company’s appropriate business interests and freedom of expression should prevail.

Although we rejected these arguments, some legal commentators have suggested our findings may be inconsistent with PIPEDA’s purpose clause by not giving sufficient weight to commercial interests. If Bill C-11 had been passed as tabled, Clearview AI and these commentators could still have made such arguments.

We argued that Bill C-11 should be amended to make clear that, where there is a conflict between commercial objectives and privacy protection, the latter should prevail.


Issue 3: How to define corporate accountability

Accountability is one of the primary counter-balances to increased ability for organizations to use information without consent. Given this, we believe it is critical that the accountability principle be clearly defined in the law and that the legislation provide protective measures such that the accountability of organizations is real and demonstrable.

The Consumer Privacy Protection Act as drafted failed to define accountability as an objective standard requiring policies and procedures that ensure compliance with the law. Instead, it defined accountability in descriptive terms, akin to self-regulation: the adoption of policies and procedures that an organization decides to put in place.

Importantly, it also failed to provide for demonstrable accountability, meaning accountability that is demonstrated to the regulator, an independent third party. In today’s world, where business models are opaque and information flows are increasingly complex, individuals are unlikely to file a complaint about a practice they are unaware of, even when it may harm them. This is why it is so important for the regulator to have the authority to proactively inspect the practices of organizations. Where consent is not practical and organizations are expected to fill the protective void through accountability, these organizations must be required to demonstrate true accountability upon request.

Bill C-11 did not provide our office with these essential tools that many of our counterparts in other jurisdictions have.

The privacy laws of Quebec and Alberta, and those of several foreign jurisdictions, including common law countries such as the United Kingdom, Australia and Ireland, each have some or all such provisions to ensure organizations are held accountable for the way they use their increased flexibility to collect, use and disclose the personal information of consumers.

Issue 4: The need for common, or at least similar, principles for the public and private sectors

A fundamental aspect of the environment within which our privacy laws must be defined is the increased role of public-private partnerships and contracting relationships.

We have seen how public-private partnerships and contracting relationships involving digital technologies can create additional complexities and risks for privacy.

The pandemic certainly underscores this. Videoconferencing services and online platforms are allowing us to socialize, work, go to school and even see a doctor remotely, but they also raise new privacy risks. Telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. Meanwhile, e-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.

Our investigations into Clearview AI and the RCMP discussed elsewhere in this report are further examples of the risks involved in public-private partnerships.

Common privacy principles enshrined in both our public and private sector privacy laws would help address gaps in accountability where the sectors interact.

Issue 5: The need for interoperable laws, internationally and domestically

An important impetus for the Consumer Privacy Protection Act was the desire to maintain Canada’s European Union adequacy status. It is vital that the data supporting trade be able to travel outside our borders without infringing upon the rights and values that we broadly share with our partners.

Interoperability between laws helps to facilitate and regulate these exchanges and it reassures citizens that their personal information is subject to similar protections when it leaves Canada. It also benefits organizations by reducing compliance costs and increasing competitiveness.

In August 2021, the OPC updated several guidance documents to reaffirm some of the types of personal information generally considered sensitive in the context of PIPEDA as a means of addressing European concerns about adequacy. The update sets out that certain types of information will generally be considered sensitive and require a higher degree of protection. This includes health and financial data, ethnic and racial origins, political opinions, genetic and biometric data, an individual’s sex life or sexual orientation, and religious/philosophical beliefs. Other jurisdictions have defined specific categories of personal information in their laws, including the European Union’s General Data Protection Regulation (GDPR). The updated guidance aims to better explain the concept of sensitive information under PIPEDA so it can be evaluated more accurately against the GDPR.

Interoperability is also an important domestic factor. Currently, with Bill C-11 now defunct and a number of proposals being put forward at the provincial level, there is the potential for a patchwork of privacy laws across Canada.

Quebec’s Bill 64, adopted into law in September 2021, and Ontario’s white paper on privacy law reform, issued in June, both set out a rights-based approach to privacy as well as an expansion of the scope of their laws. For instance, Quebec has expanded the scope of its law to include political parties, something our office has also advocated for. Both also include more efficient adjudication and financial penalty regimes.

A jurisdictional comparison chart submitted to Parliament along with the OPC’s proposed amendments to Bill C-11 underscored how the Consumer Privacy Protection Act was frequently misaligned with, and actually less protective than, the laws of other jurisdictions in Canada, Europe and elsewhere. Canada aspires to be a global leader in privacy and has a rich tradition of mediating differences on the world stage. Adopting a rights-based approach, while maintaining the principles-based and not overly prescriptive approach of our private sector privacy law, would position Canada as a leader in defining privacy laws that reflect various approaches and are interoperable.

Issue 6: The need for quick and effective remedies and the role of our office

Adopting adequate privacy legislation is not sufficient in itself. Laws must be enforced through quick and effective mechanisms. In many countries, this is done through granting the regulatory authority the power to issue orders and impose significant monetary penalties.

Such legislation does not seek to punish offenders or prevent them from innovating. It seeks to ensure greater compliance, an essential condition of trust and respect for rights.

It must be said that many businesses and organizations take their privacy obligations seriously. However, not all of them do. It is important that legislation not benefit the offenders.

Penalties must be proportional to the financial gains that businesses can make by disregarding privacy. Otherwise, organizations will not change their practices; minimal penalties would represent a cost of doing business they are willing to accept in order to generate profits. The proportional nature of penalties is also an advantage for smaller enterprises.

Unfortunately, the penalty provisions in Bill C-11 were largely hollow. First, the bill listed only a few violations as being subject to administrative penalties. The list did not include obligations related to the form or validity of consent, nor the numerous exceptions to consent, which are at the core of protecting personal information.

Nor did it include violations of the principle of accountability, which, as noted, is an important counterbalance to the increased flexibility given to organizations in the processing of data.

Moreover, Bill C-11 proposed an additional layer of decision-making in the form of the Personal Information and Data Protection Tribunal that would have been responsible for imposing monetary penalties and hearing appeals against decisions of the OPC.

Such a tribunal does not exist in this form anywhere else and, we believe, would create unnecessary delays for consumers. Worse, it would encourage companies to choose the route of appeal rather than find common ground with the OPC when we are about to issue an unfavourable decision. We believe that the addition of this tribunal would only delay access to justice for consumers. The courts are perfectly capable of reviewing the legality of OPC decisions.

This does not mean we do not welcome transparency and accountability. On the contrary, we would gladly consult stakeholders in developing rules of practice that would ensure fairness towards parties in proceedings that may lead to the imposition of orders and penalties.

Finally, Bill C-11 would have imposed on the OPC new responsibilities, including the obligation to review codes of practice and certification programs, and advise individual organizations on their privacy management programs upon request; the obligation to rule on complaints before consumers can exercise a private right of action; and the obligation to rule within strict time limits.

We welcome the opportunity to work with businesses to ensure their activities comply with the law. However, adding all of these non-discretionary responsibilities, without giving the OPC the authority to manage its workload, is problematic.

An effective regulator is one that prioritizes its activities based on risk. Like other regulators, we need the legal discretion to manage our caseload so that we can respond to the requests of organizations and complaints of consumers in the most effective and efficient way possible, while reserving a portion of our time for activities we initiate, based on our assessment of risks to Canadians. A future law must take that into consideration.

We are confident it is possible to bring substantial improvements to this bill within its existing structure and without having to start over from scratch. This would be the most efficient and expedient way forward and we are ready to work with the government to this end.

Artificial intelligence consultation

As part of our legislative reform policy analysis work, our office also launched a public consultation to examine AI as it relates to the private sector. We received 86 submissions and held 2 in-person consultations. We received feedback from industry, academia, civil society and the legal community. Based on that feedback, we issued recommendations for regulating the technology in November 2020, some of which also factored into our broader recommendations on Bill C-11. Our approach to AI in the private sector also informed our Privacy Act reform recommendations issued in March.

AI has become a reality of the digital age, supporting many of the services individuals use in their daily lives. It offers the potential to help address some of today’s most pressing issues affecting both individuals and society as a whole. AI can improve efficiency in the public sector and industry, and allow for new methods and solutions in fields such as public health, medicine and sustainable development. It also stands to increase efficiency, productivity, and competitiveness – factors that are critical to economic recovery post-pandemic and long-term prosperity.

However, uses of AI that involve personal information can have serious consequences for privacy. AI models have the capability to analyze, infer and predict aspects of individuals’ behaviours and interests.

AI systems can use such insights to make automated decisions about people, including whether they get a job offer, qualify for a loan, pay a higher insurance premium, or are suspected of unlawful behaviour. Such decisions have a real impact on lives, and raise concerns about how they are reached, as well as issues of fairness, accuracy, bias and discrimination.

We concluded that AI presents fundamental challenges to all of PIPEDA’s foundational privacy principles. More specifically, it highlights the shortcomings of the consent principle, both in protecting individuals’ privacy and in allowing the benefits of AI to be achieved.

Our key recommendations included:

  • Amending PIPEDA to allow personal information to be used for new purposes towards responsible AI innovation and for societal benefits;
  • Creating the right to meaningful explanation for automated decisions and the right to contest those decisions to ensure they are made fairly and accurately; and
  • Requiring organizations to design AI systems from their conception in a way that protects privacy and human rights.

With respect to AI, Bill C-11 included an approach that allowed for flexibility in the use of de-identified information for research and development purposes, while keeping such information within the confines of privacy law. Given the persistent risk of re-identification of de-identified information, we welcomed this approach.

However, significant improvements were needed to other aspects of Bill C-11 that concern AI in order to adequately protect people’s privacy rights. While Bill C-11 included transparency provisions related to AI, it only required that an explanation be provided of how personal information was obtained, instead of how it was used by an AI system to arrive at a decision. There were also no obligations in Bill C-11 for algorithmic traceability to support such explanations, or that would allow individuals to contest such decisions or request a human review of them. Clarity regarding the role of inferential data in categorizing and profiling individuals was also absent.

The risks of AI systems in undermining human dignity, self-determination and fairness further demonstrate why a rights-based approach to privacy law is needed. Without it, AI can have severe consequences not only for individuals but also for broader society, such as heightened inequality, discrimination and societal divisions.

Privacy and federal political parties

An important matter the federal privacy laws do not address is the application of privacy law to political parties. Bill C-11 made no progress in this regard.

Furthermore, in May 2021, our office closed the file on a complaint alleging that the New Democratic Party of Canada, the Liberal Party of Canada and the Conservative Party of Canada had violated PIPEDA in its current form.

The complainant alleged the parties were contravening PIPEDA by not properly informing Canadians on how they collected, used and/or disclosed their personal information to conduct political advertising, including “micro-targeted” advertising based on detailed profiles of individuals.

The complainant also claimed that activities undertaken by the parties were commercial activities and were therefore subject to the Act.

After reviewing the extensive representations and evidence submitted, our office concluded that the activities of the 3 federal political parties at issue in the complaint were not commercial in character within the meaning of the current Act.

We strongly believe that privacy laws should govern political parties, as is the case in some provinces and as we recommended in our Bill C-11 submission. However, we are required to apply the law as it stands today, and that is why we dismissed the complaint against the 3 federal parties.

Privacy Act reform

Turning now to the regulation and reform of public sector privacy, the government’s Privacy Act reform consultation document did a considerably better job overall of addressing many concerns similar to those we raised in the context of Bill C-11.

It proposed substantive changes that represented significant strides toward a law that is in step with modern data protection norms.

Unlike Bill C-11, it included a proposed purpose clause that specifies that a key objective of the law is “protecting individuals’ human dignity, personal autonomy, and self-determination,” thereby recognizing the broad scope of privacy as a human right.

The government also proposed measures aimed at providing meaningful oversight and quick and effective remedies, such as order-making powers for our office and expanded rights of recourse to Federal Court.

As noted in our submission to the government’s consultation, one of the more fundamental changes proposed is the inclusion of foundational and broadly recognized data protection principles – including a new “identifying purposes” principle, in addition to a “limiting collection” principle to restrict the types and amount of personal information federal public bodies may collect. As well, under the proposed collection threshold, a federal public body would be limited to collecting only the personal information “reasonably required” to achieve a purpose relating to its functions and activities, or where it is otherwise expressly authorized by another act of Parliament.

We noted that the shift towards digitization has made collection, use, disclosure and retention of information much easier for government and that imposing a stricter threshold for collection would limit potential over-collection by federal government institutions.

Our investigation into Statistics Canada’s collection of Canadians’ personal information from a credit bureau and planned collection from financial institutions underscored the importance of limiting the collection of personal information to what is necessary and proportional in the circumstances. So too did much of our pandemic-related work, including work involving the COVID Alert exposure notification app and ongoing discussions on vaccine passports.

In fact, the assessment framework we released in April 2020 to assist government institutions responding to the COVID-19 crisis echoed a number of the measures we have put forward in our legislative reform submissions. For instance, we recommended that the collection of personal information by federal institutions be governed by the widely accepted necessity and proportionality standard, which is the norm in many modern laws internationally and, in Canada, at the provincial level.

The Department of Justice has said its “reasonably required” collection standard, in practice, would be “essentially equivalent to leading international standards.”

If that’s the case, we welcome the proposal to raise the current threshold (relevance) and accept that a “reasonably required” standard may be workable if the aim is to add clarity to the law, while yielding results similar to the longstanding principles of necessity and proportionality.

To ensure alignment with modern standards, we recommended amendments to the factors to be taken into account as part of the “reasonably required” assessment: that specific, explicit and lawful purposes be identified; that the collection of personal information be limited to what is reasonably required in relation to those purposes; and that the loss of privacy or other fundamental rights and interests of the individual be proportionate to the public interests at play.

The consultation document also included proposals to address concerns regarding AI, including alignment with federal policy instruments to ensure that individuals are aware when they are interacting with these systems, that they understand what types and sources of personal information these systems use, and that they generally know how they function. While these are good objectives for increasing transparency, it is also important to provide individuals with actionable rights, given that automated decision-making based on personal information can determine whether an individual receives government services or benefits, or is eligible for various programs.

Consistent with our recommendations regarding Bill C-11, our submission to the Department of Justice made in March reflected our belief that individuals should be provided with a right to a meaningful explanation of automated decisions (including a standard for the level of specificity such explanations contain), and a right to contest such decisions. These are important in ensuring that the traditional data protection principles of accuracy and accountability can continue to adequately protect privacy in the context of AI, which fundamentally challenges such privacy principles. They are also particularly important in the public sector context to respect natural justice and procedural fairness.

We look forward to continued collaboration with the Department of Justice on this important initiative for Canadians to ensure strong protections for their privacy rights.

Review of the Access to Information Act

The Access to Information Act (ATIA) and the Privacy Act both play a central role in preserving our information rights as our society becomes increasingly digital. Both laws are essential to fostering a more open, transparent government and upholding the tenets of democracy.

Our federal access to information regime can ensure that Canadians have open, accessible and trustworthy information from government. Where personal information is at stake, Canada’s privacy laws limit the circumstances under which that information can be disclosed.

In February 2021, our office participated in the Treasury Board of Canada Secretariat’s statutory review of the ATIA. This review presented an opportunity to examine issues that lie at the intersection of both the ATIA and the Privacy Act.

Our submission expressed support for the modernization of Canada’s access laws to facilitate greater openness and transparency, but not at the expense of privacy.

We noted modern access and privacy laws are needed if we are to draw value from data while preserving our democratic values and the protection of our rights in a digital environment.

Given the high degree of intersection between the ATIA and the Privacy Act – particularly with respect to concepts such as de-identification, publicly available information, and the definition of personal information – we strongly recommended amendments to the laws be made concurrently. This is fundamental to both Acts continuing to be read as a seamless code of information rights. To that end, the OPC and the Office of the Information Commissioner of Canada (OIC) have implemented a Memorandum of Understanding to help identify and guide instances where the OIC may consult with the OPC on intersecting issues.

Further consideration should be given to how competing privacy and information interests should be balanced in the context of public interest disclosures.

We note that the Department of Justice has indicated that it will factor in the comments received during the review of the ATIA when it undertakes its review of the Privacy Act. We hope these reviews will present an opportunity for both ministers to address the interplay between the ATIA and the Privacy Act through concurrent amendments.

Other legislative initiatives in Canada

Over the last year, we have seen a number of provincial jurisdictions take steps in the right direction to enhance their privacy laws to better protect Canadians.

In Quebec, the provincial government, through the passage of Bill 64, has granted citizens clear, enforceable rights such as the right to erasure. Under provisions that will come into force in 2 years, the enforcement powers of the Commission d’accès à l’information (CAI) will also be significantly increased.

In September 2020, Commissioner Therrien appeared before the Committee on Institutions of the National Assembly of Quebec while the bill was being considered. He noted that a number of elements were consistent with the law reform proposals our office has put forward.

For example, Bill 64 includes provisions that address profiling and protect the right to reputation, which are consistent with our approach to rights-based legislation. It subjects political parties to the private sector law, and provides a more efficient adjudication and financial penalty regime.

Our office also participated in the public consultation conducted by the Special Committee to Review the Personal Information Protection Act of British Columbia.

Our submission included several points made in our Bill C-11 submission to the House of Commons Standing Committee on Access to Information, Privacy and Ethics and supported recommendations put forth by British Columbia Information and Privacy Commissioner Michael McEvoy. These included the fundamental benefits of a mandatory breach reporting regime; the need for modern enforcement mechanisms that include both order-making powers and the ability to issue fines; and the domestic and international coordination required to effectively protect privacy in today’s digital economy.

Finally, in June 2021, the Government of Ontario released a white paper seeking input from privacy stakeholders and Ontarians on elements of a modern privacy law for Ontario’s private sector. Among the highlights, the white paper proposes that, consistent with the recommendations of the OPC’s Bill C-11 submission, Ontario could establish a fundamental right to privacy as the underpinning principle for its new privacy law, ensuring that Ontarians are protected.

The proposal also includes more effective remedies than those found in Bill C-11 – specifically, penalties issued directly by the Information and Privacy Commissioner of Ontario (IPC) rather than through the tribunal model provided for in Bill C-11. In September 2021, the IPC released its response to the white paper, urging the Ontario government to move forward with its plans despite the uncertainty of law reform at the federal level.

Conclusion

In Canada, we are now in the midst of a fourth industrial revolution. Digital technologies are being adopted at a staggering pace, from our largest cities to our most remote northern communities. New technologies can plainly provide important economic and social benefits, but they also present huge challenges to legal and societal norms that protect fundamental Canadian values.

Privacy is a fundamental right, recognized as such by Canada as a signatory to the Universal Declaration of Human Rights which was proclaimed in 1948. It is nothing less than a prerequisite for the freedom to live and develop independently as persons away from the watchful eye of a surveillance state or commercial enterprise, while still participating voluntarily and actively in the regular, and increasingly digital, day-to-day activities of a modern, democratic society.

Only through respect for the rights and values we cherish will Canadians be able to safely enjoy the benefits of these technologies. Unfortunately, bad actors have eroded our trust and without law reform, trust will continue to erode.

Lawmakers have an opportunity to restore and build confidence in our digital economy. They can do this by integrating core Canadian values, such as respect for human rights, into our laws.

Our office is committed to working with the government and with Parliament to develop laws that will effectively protect the privacy rights of Canadians in a rapidly evolving digital environment.

Privacy by the numbers

Privacy Act complaints accepted 827
PIPEDA complaints accepted 309
Data breach reports received under PIPEDA 782
PIPEDA complaints closed through early resolution 210
Privacy Act complaints closed through early resolution 441
Advisory engagements with private-sector organizations 13
Privacy Act complaints closed through standard investigation 414
Well-founded complaints under the Privacy Act 64%
PIPEDA complaints closed through standard investigation 86
Well-founded complaints under PIPEDA 73%
Bills, legislation and parliamentary studies reviewed for privacy implications 17
Data breach reports received under the Privacy Act 280
Advice provided to public-sector organizations following PIA review or consultation 136
News releases and announcements 54
Privacy impact assessments (PIAs) received 81
Public interest disclosures by federal organizations 491
Information requests 7,090
Tweets sent 443
Advisory consultations with government departments 109
Parliamentary committee appearances on private- and public-sector matters 3
Speeches and presentations 32
Twitter followers 18,616
Visits to website 2,491,736
Blog visits 26,754
Publications distributed 1,260

The Privacy Act: A year in review

Our office’s public sector work in the past year continued to have a strong focus on privacy issues related to the COVID-19 pandemic. Collaborating with many partners within Canada and abroad, we contributed significant efforts to help ensure privacy principles would be considered and applied in tandem with efforts to support public health goals, for example in relation to contact tracing technologies, vaccine passports, border issues, the role of telecommunications companies and the workplace. We experienced a significant increase in requests for advice and engagement, receiving more privacy impact assessments and consultations on a variety of key issues, including in the area of public safety and national security. At the same time, we saw fewer public sector breaches reported to us, despite concerns that breaches continue to take place. We exceeded our backlog reduction goals and improved complaint processes to create greater efficiencies, for example by improving front-end information gathering to support our investigations and by employing strategies such as early resolution.

The following section highlights key initiatives under the Privacy Act in 2020-21.

Spotlight on pandemic-related work

From the beginning of the COVID-19 pandemic, our office’s approach has focused on how personal information can, and should, remain protected through a flexible and contextual approach during a grave national health emergency. Privacy is not an impediment to public health, and in 2020 we demonstrated how measures to protect privacy can build public trust in government public health initiatives. Throughout the pandemic, Canadians have shown they continue to care about the protection of their personal information in this context, and we now know that public health and privacy can go hand in hand.

The pandemic has resulted in a wide range of federal government initiatives with a potential impact on privacy.

The federal government has engaged with our office on a range of files, including numerous initiatives related to COVID-19 infection tracking and tracing, border controls and initiatives to provide benefits during the economic crisis. This work included ongoing engagement on the government’s COVID-19 exposure notification application.

In general, the government has consulted with our office on key COVID-19 initiatives such as COVID Alert, benefit programs and, more recently, proof of vaccine credentials. As always, we recommend that we be consulted as soon as possible on initiatives that may impact the privacy of Canadians and that assessments of any privacy impacts be conducted ahead of implementation.

Health Canada – COVID Alert app

We have continued to provide advice to Health Canada on the national COVID-19 exposure notification application COVID Alert, as it evolved and new features were added.

Our initial review of the app’s privacy risks and measures in place to mitigate such risks was informed by the joint statement issued in May 2020 by federal, provincial and territorial privacy commissioners, entitled Supporting public health, building public trust: Privacy principles for contact tracing and similar apps. This is discussed in detail later in this report.

As noted in our 2019-2020 Annual Report, we supported the app on the condition that it would be voluntary, and as long as its use was found to be effective in reducing transmission of the virus.

During our consultations with Health Canada, we made a number of recommendations that were implemented. We advised the Government of Canada to closely monitor the app and to decommission it if shown not to be effective.

At the time of writing, Health Canada was leading an evaluation of the app. Our office was participating with Health Canada on that evaluation. The app’s design and implementation were being assessed for necessity and proportionality, including effectiveness, and compliance with the principles outlined in the statement made with our provincial and territorial colleagues. From our perspective, a strong demonstration of effectiveness of the app will be a key focus of the evaluation. It is expected that the evaluation will be completed in the fourth quarter of 2021.

Public Health Agency of Canada – Vaccine passports

As the pandemic response evolved, governments in Canada and around the world turned their attention to the idea of a credential or certificate confirming vaccination status, sometimes called “vaccination passports.” Discussions have centred on the creation and use of this tool to enable economic recovery and allow more normal business practices to resume in a secure manner.

Our office has been evaluating the viability of a vaccine passport or certificate in the Canadian context, particularly the necessity and effectiveness of such a measure in enabling secure activities, and the proportionality between its purpose and the collection of sensitive health information.

In May 2021, we issued a joint statement with our provincial and territorial counterparts that outlines fundamental privacy principles that should be adhered to in the development of vaccine passports. More detailed information can be found in the “domestic cooperation” section of this report.

At the time of writing this report, we were in discussions with the Government of Canada and our provincial and territorial counterparts on this initiative.

Public Health Agency of Canada – Canadian border enhancements in response to COVID-19

At the beginning of 2020, the Government of Canada announced new rules on international travel in response to the COVID-19 pandemic. The Public Health Agency of Canada (PHAC) is the lead institution for ensuring quarantine requirements are respected and that compliance checks with travellers are completed post-arrival.

Our office consulted regularly with PHAC and its partner enforcement institutions as new measures under the Quarantine Act related to COVID-19 were implemented at Canadian borders.

This included an early consultation with the Canada Border Services Agency (CBSA) about the paper coronavirus form issued to travellers arriving in Canada from the Hubei province of China. After this initial consultation, measures evolved and expanded quickly. Our office provided advice and recommendations on multiple phases of PHAC’s ArriveCAN mobile device application and web browser platform, which enables international travellers to report on their compliance with mandatory isolation measures. Travellers can use the ArriveCAN app to provide the government with prescribed travel information when entering Canada and during their required isolation period.

In keeping with the principle of minimizing data collection to that which is necessary for the specified purpose, we recommended that PHAC and their partners put measures in place to prevent the over-collection of personal information from travellers. PHAC responded positively and addressed this concern by limiting open-text boxes and eliminating unnecessary sections of the form. We also advised that PHAC ensure privacy notice statements are clear and delivered consistently across all modes of interaction with arriving travellers. As a result of this recommendation, PHAC put measures in place to ensure privacy notices are given verbally by agents when screening travellers by telephone.

We also made recommendations on privacy issues related to processes and procedures allowing for the entry of travellers for compassionate reasons and for release from quarantine under limited and exceptional circumstances. As part of this compassionate entry program, PHAC discloses personal information to partners – including provinces and territories, the CBSA and Immigration, Refugees and Citizenship Canada (IRCC). We recommended that PHAC implement information sharing agreements (ISAs) for these disclosures to ensure adequate protection of personal information and clarity for travellers impacted during the COVID-19 pandemic. PHAC has indicated that an ISA with the CBSA with respect to ArriveCAN was under development, and has committed to providing our office with a copy of the agreement.

Contact-tracing initiatives: Canadian Air Transport Security Authority, Elections Canada, Vancouver Fraser Port Authority, Treasury Board of Canada Secretariat

Our office was consulted by multiple federal institutions regarding various COVID-19 workplace screening and contact-tracing activities. Some examples of these activities are:

  • The Treasury Board of Canada Secretariat (TBS) consulted us regarding tools to monitor and report cases of COVID-19 across the public service
  • The Canadian Air Transport Security Authority (CATSA) consulted us regarding the use of contact-tracing logs in their offices for CATSA employees testing positive for COVID-19
  • Elections Canada consulted us on contact-tracing activities related to 2 Toronto area by-elections
  • The Vancouver Fraser Port Authority submitted a privacy compliance evaluation on its use of a third-party electronic visitor management system to screen employees and visitors for COVID symptoms and to alert individuals in the event of an outbreak

For each of these initiatives, our office stressed the need to limit the purpose of any collection of personal information to the specific activity of contact tracing. We also advised institutions to restrict access to any logs containing personal information.

As well, we recommended that institutions place a time limit on the collection of personal information for these activities, and end such collection when no longer required. In cases where a third-party provider was used to aid in contact tracing efforts, our office recommended strong privacy provisions in the third-party contract, such as establishing clear procedures in the event of a breach and assessing the security controls of third-party systems.

The section in this report describing international and domestic cooperation includes other examples of our COVID-related work.

Virtual recruitment and hiring tools: VidCruiter

Virtual hiring tools and activities are gaining in popularity – particularly in the context of a global pandemic and the concurrent acceleration of digital tools.

In our 2019-2020 Annual Report, we noted consultations with 4 Government of Canada institutions – the Department of Justice, the Canadian Space Agency, Health Canada and Employment and Social Development Canada (ESDC) – on the use of VidCruiter.

VidCruiter is a Canadian company offering video recruitment and other hiring solutions, including applicant screening, scheduling of candidate interviews, a platform for live and recorded video interviews and reference checks.

Platforms like this can help to overcome hiring process challenges such as the scheduling availability and geographic location of candidates. However, their use raises certain risks to privacy that federal institutions must consider before contracting out their staffing processes.

We advised that all government institutions wanting to use this service should complete a privacy impact assessment (PIA). We also published a Privacy Act bulletin with privacy advice for institutions conducting video job interviews.

In 2020-2021, we received 6 PIAs from institutions implementing VidCruiter: ESDC, Canadian Space Agency, Department of Fisheries and Oceans, Canada Revenue Agency (CRA), Infrastructure Canada, and Health Canada.

Our office will continue to engage with institutions as these types of initiatives are implemented.

Other advice and outreach to government departments

Our Government Advisory Directorate (GA) provides advice and recommendations to the federal public sector on identifying, assessing and mitigating risks to personal information collected, used, retained and disclosed for government activities and programs. We provide our guidance through reviews of formal PIAs completed by institutions under the TBS Directive on Privacy Impact Assessment and through less formal consultations with institutions as initiatives develop.

We encourage federal institutions to reach out to us early in the development of programs which may raise privacy concerns. Early consultation allows our office the opportunity to help institutions implement a privacy-by-design approach and provide timely, focused and relevant feedback. We also see that early consultation has a positive impact on the quality of subsequent PIA submissions. This enhances the review process and contributes greatly to building solid relationships between our office and the institutions we work with.

In 2020-21, we were pleased to see an increase in requests for consultation from government institutions on issues such as COVID-19 programs (described above), as well as on national security and law enforcement initiatives. We opened 109 new consultation files, a significant increase from 66 in 2019-20. We welcome the additional opportunities to provide timely feedback on new programs and activities, in particular for potentially privacy-intrusive public safety activities.

We also reached out proactively to federal institutions to offer our consultation services. For example, we contacted the Department of National Defence and the Canadian Armed Forces (DND/CAF) to offer our support as they responded to recommendations issued by the National Security and Intelligence Committee of Parliamentarians in its special Canadian Citizen Information (CANCIT) report on the collection, use, retention and dissemination of information on Canadians in the context of DND/CAF defence intelligence activities. We met with DND/CAF several times and provided feedback on the new Chief of Defence Intelligence Functional Directive: Guidance on the Handling and Protection of Canadian Citizen Information, which governs these activities and was drafted in response to recommendations in the CANCIT report. Our recommendations focused on clarifying the extra-territorial application of the Privacy Act, legal authority for disclosure of certain personal information, as well as limiting collection, safeguards and accountability.

DND accepted our comments and committed to continuing consultations with our office on the Directive. In addition, DND acknowledged the need to engage with our office earlier for the next review of the Directive.

PIAs and advisory consultations

Law enforcement issues

We noted a significant increase in consultations related to the uptake and adaptation of new technologies in the context of public safety and law enforcement. These include the expansion of biometric identification techniques such as facial recognition and the adoption of new data analytics techniques and surveillance tools.

The evolution and expansion of all these technologies can have serious implications for privacy rights.

Body-worn cameras – National roll-out and Nunavut pilot

Our office has been engaging with the RCMP on projects involving body-worn cameras since 2010, and we continue to review and provide advice on use of these cameras and pilot projects.

Body-worn cameras are an inherently privacy-invasive tool, and law enforcement agencies contemplating their use must take steps to ensure that any use is lawfully authorized and that privacy risks are managed appropriately. As we note in guidance we co-developed with provincial and territorial counterparts in 2015, the privacy intrusion must be minimized to the extent possible and offset by significant and demonstrable benefits. The guidance highlights a number of key privacy principles, including necessity and proportionality, data minimization, safeguards, retention, access rights and rights of recourse, governance, accountability and transparency.

Body-worn cameras have been deployed by the RCMP on a discretionary basis in specific situations across the country for several years. Most of these situations have been public protests where the RCMP believed the risk to public order justified the use of cameras.

In June 2020, in the context of widespread international protests against police actions involving racialized individuals and systemic racism, the RCMP indicated it was planning a national roll-out of body-worn cameras for general duty policing. According to the RCMP, the purposes of the body-worn camera program would be to enhance accountability, increase transparency and ensure the accurate capture of evidence.

Our office reached out to the RCMP and advised that a pilot program should be undertaken before a national roll-out. We also discussed the program with the then-Information and Privacy Commissioner for the Northwest Territories and Nunavut, raising concerns about the serious privacy risks that may be posed by the use of body-worn cameras, including the potential for function creep and the possible erosion of public trust if camera footage is used inappropriately. In our view, a pilot program would allow the RCMP to conduct a preliminary risk assessment of its policies and procedures, and would help ensure any national roll-out is designed with privacy protections in mind. In November 2020, the RCMP launched a 7-month trial in Iqaluit, Nunavut.

We met several times with the RCMP to discuss the privacy implications of the pilot and of the planned national roll-out of body-worn cameras. We provided comments on its draft national policy, which will govern the use of cameras for general duty policing.

Among other things, we stressed the need for the RCMP to assess and mitigate privacy risks before rolling out the cameras, including by completing a revised PIA for that activity, and by undertaking a new PIA for the video storage solution. We also made several recommendations to ensure the policy is clear as to its objectives; that it more clearly addresses when officers are to activate and deactivate their cameras; and to ensure the retention periods are reviewed to keep only recordings which are necessary. We also made recommendations to address risks related to limiting use, redacting personal information, accountability, transparency and openness.

Clare’s Law

The RCMP consulted with our office on its project to amend regulations to allow it to participate in Clare’s Law processes in provinces where such legislation has been enacted and where the RCMP is the police force of local jurisdiction.

Clare’s Law is a domestic violence disclosure initiative that was first implemented in England and Wales in 2014 and named for Clare Wood, a woman murdered by a former intimate partner who police knew to be dangerous. The law allows police to divulge information suggestive of a risk of interpersonal violence to a relationship partner or third party.

Saskatchewan was the first Canadian province to bring a version of Clare’s Law into effect with the Interpersonal Violence Disclosure Protocol (Clare’s Law) Act. The Act came into force in 2020. Several other provinces have adopted or are considering similar legislation. The RCMP did not initially participate in provincial disclosure regimes.

Disclosing information about an individual’s past involvement with law enforcement, including accusations and complaints for which no charges were laid or convictions registered, clearly represents an invasion of that individual’s privacy. Without clear limits, the impact of such disclosures could be significant – particularly if the information is inaccurate, used for other purposes or further disclosed.

However, the right to privacy is not absolute and is subject to exceptions provided for in the law. We recognize that governments must take meaningful action to help stem the incidence of interpersonal violence. Addressing the issue is all the more important during a pandemic, where victims may face violence while under stay-at-home orders.

With these issues in mind, we believe disclosure under any Clare’s Law regime should balance the privacy interests of the individual whose information is disclosed with the imperative of protecting an individual at risk of interpersonal violence.

We provided comments and advice to the RCMP as to how it could disclose information under Clare’s Law while meeting its legal obligations under the Privacy Act.

The RCMP has now amended its Regulations to facilitate participation in Clare’s Law provincial regimes. At the time of writing, the RCMP indicated it was in the final stages of developing a PIA to assess privacy implications of participating in Clare’s Law processes and protocols in provinces where such legislation exists.

National security issues

We have seen a rise in requests for consultation from the national security and intelligence community. We welcome this increased desire to engage with us, as our office has stressed that national security and intelligence activities must be balanced and proportionate to ensure the rights of law-abiding Canadians are not put at undue risk.

The law should prescribe clear and reasonable standards for the sharing, collection, use and retention of personal information, and compliance with these standards should be subject to independent and effective review mechanisms, including the courts. The importance of such oversight mechanisms is underscored by the lack of public transparency that, out of necessity, accompanies national security and intelligence activities.

We are pleased to have the opportunity to engage in collaborative work on issues of mutual concern with the National Security and Intelligence Review Agency (NSIRA), Canada’s independent expert review body for all national security and intelligence activities, with which we have signed a Memorandum of Understanding (MOU).

Our office and NSIRA consult regularly to discuss issues of common interest, and to coordinate our review activities to avoid duplication and help ensure productive engagements with institutions. From an enforcement perspective, our MOU will form the basis for carrying out a joint Security of Canada Information Disclosure Act compliance review, which is anticipated to conclude with a report to be tabled later this year.

Our mandates are intertwined because the work of national security agencies depends in large part on personal information.

Our office’s work can benefit from NSIRA’s national security and intelligence expertise, and NSIRA from our office’s privacy expertise. This was recognized by Parliament with amendments to the Privacy Act to allow the Privacy Commissioner to coordinate activities and share information with NSIRA.

With this in mind, our office’s technological experts have launched a long-term technical knowledge-sharing partnership with peers at NSIRA. This work has involved discussions on how our technology analysts can effectively support each other in our overlapping mandates to support the privacy and security of Canadians. We have exchanged information on artificial intelligence, surveillance technologies and our research into the effects of Internet of Things devices.

Public Safety Canada – Updates to the Passenger Protect Program

The Passenger Protect Program is an air security program with the objective of preventing specified individuals who may pose a threat to air security, or who may travel by air to commit a terrorist act, from boarding a plane. The program screens commercial passengers travelling to, from, and within Canada against the Secure Air Travel Act (SATA) list, commonly referred to as the “No Fly List.”

Since 2007, our office has reviewed and provided recommendations on 3 PIAs for the Passenger Protect Program and we have undertaken an audit of the program. Our office has stressed since the inception of the program that national security activities must be undertaken in a manner that respects the rights of the travelling public.

Recent amendments to SATA have enabled changes to the Passenger Protect Program. These changes include the introduction of the Canadian Travel Number, a new component to prevent delays for travellers who have the same, or similar, name as someone on the list, and the development of the SATA Centralized Screening Program under the authority of Public Safety Canada, with assistance from Transport Canada and CBSA.

Our office has reviewed and provided comments on PIAs for both programs.

Under the Canadian Travel Number program, travellers who face difficulties travelling because they have the same or a similar name as someone on the No Fly List can apply for a special identification number for booking flights to, from or within Canada. Public Safety Canada works with federal partner institutions to verify information about applicants for Canadian Travel Numbers.

We recommended that Public Safety Canada update its MOU with these partner institutions to ensure that each institution is adequately protecting the information in its custody, and that regular reviews of audit logs are undertaken to monitor and control access to this sensitive personal information. Public Safety Canada confirmed it updated the MOUs in line with our recommendations.

The Centralized Screening Program gives the Government of Canada more control over the screening and matching process by removing air carriers’ access to the SATA list.

The CBSA’s centralized screening IT solution receives and evaluates passenger information from air carriers to determine whether there are potential matches to the SATA list.

We recommended that the system be tested and evaluated regularly for accuracy in order to ensure that decisions are made fairly and according to information that is as accurate and up-to-date as possible. In response, Public Safety Canada provided evidence that the program had been tested for accuracy. It also indicated that all partner agencies will continuously monitor the program to ensure data quality issues or errors in submission are addressed.

We also recommended that Public Safety Canada provide guidance to air carriers on what to include in privacy notices presented to travellers booking commercial flights. Privacy notices to travellers should clearly indicate that their personal information is being collected, used and disclosed to and within the Government of Canada, and the purposes for which it will be used should be detailed. Public Safety Canada has responded positively, and confirmed it is working with partner institutions to ensure that air carriers receive guidance in this regard.

Social media monitoring

Immigration, Refugees and Citizenship Canada social media monitoring

In 2020-2021, we reviewed a PIA and completed a consultation process with Immigration, Refugees and Citizenship Canada (IRCC) on their social media monitoring activity.

IRCC has been monitoring social media platforms to collect data on posts and commentary relevant to the department’s mandate, policies and activities. The activity is not directed at gathering information about individuals or IRCC clients. IRCC indicated that it is using social media monitoring tools to assess whether inaccurate information about Canada’s immigration policies and practices is being disseminated via social media platforms, and to counter such misinformation with communications messages where necessary.

In line with the OPC’s 2013 recommendations following an investigation into the government’s collection of an Indigenous activist’s personal information from social media, we recommended that IRCC ensure measures are in place to minimize the collection of personal information from social media sites to that which is clearly necessary for legitimate government business and the stated purposes of the program. In addition, we recommended the department regularly evaluate its use of social media monitoring tools to determine whether continued use is justified as necessary, proportional and effective. We also recommended that IRCC provide transparency to the public about this activity by posting a clear privacy notice on its website, by publishing descriptions of use of information collected from social media sites in its Personal Information Bank (PIB), and by publishing a summary of the PIA.

We have provided similar advice to other institutions as government interest in monitoring social media has grown. We remind all institutions that the Privacy Act protects personal information even when it is publicly available, and that institutions are not allowed to collect personal information that is not directly related to an operating program or activity. Institutions also have an obligation to ensure that personal information used in any decision affecting an individual is as up to date and accurate as possible.

The GA Directorate offers outreach sessions on considerations for these activities and engages with federal institutions to help ensure officials understand the risks and implement appropriate mitigations.

Public interest disclosures (under paragraphs 8(2)(m)(i) and (ii) of the Privacy Act and section 37 of the Department of Employment and Social Development Act)

Paragraph 8(2)(m) of the Privacy Act permits the disclosure of personal information where, in the opinion of the head of the institution, the public interest in disclosure clearly outweighs any invasion of privacy that could result. The Act also permits disclosures that would clearly benefit the individual to whom the information relates. It is up to the head of the institution to determine whether the public interest outweighs the right to privacy. Under subsection 8(5) of the Privacy Act, the institution then has a duty to notify the Privacy Commissioner. Similar authorities for disclosure in the public interest, or where the benefit to the individual is clear, and a corresponding obligation to notify our office are found in section 37 of the Department of Employment and Social Development Act.

Our office received 491 notifications of disclosures of personal information in the public interest, or in circumstances that benefited the individual, compared to 611 in 2019-20.

This represents a decrease of roughly 20%, but continues the trend of a large number of public interest disclosures seen over the past few years.

As in previous years, the majority (74.5%) of notifications we received were from ESDC. These included cases where Service Canada clients in distress threatened to harm themselves or made threats against others. In such cases, disclosures are made to the police of local jurisdiction for wellness checks on the individuals involved and/or to ensure the safety of others. ESDC also assists police in locating missing persons and in notifying next of kin of deceased individuals.

DND accounted for the next largest number of notifications (63), followed by IRCC (20) and the RCMP (18).

Compliance issues

Privacy Act enforcement

In 2020-21, we accepted 827 complaints under the Privacy Act, an 8% increase from the previous year, when we accepted 761 complaints.

Similar to previous years, most complaints accepted in 2020-21 were related to access to personal information (28% in both 2020-21 and 2019-20) and to institutions failing to respond to access requests within the time limit required under the Act (38% in 2020-21 and 45% in 2019-20).

Intake improvements

For complaints submitted using our office’s online complaint form, we continue to reduce the number of steps needed to gather additional information from both complainants and respondents. This is the result of our efforts to better direct complainants and to request the necessary information and documents at the time the complaint is made.

We received fewer complaints at the onset of the pandemic. As well, some institutions were seized with operational priorities related to the pandemic and were unable to engage with us on investigations. This was only temporary, as individuals, government and our own office adapted to the new reality.

In 2020-21, we closed 855 complaints under the Privacy Act, compared to 1,335 complaints in the previous year. The change was mostly due to changes in our counting methodology aimed at enhancing accuracy and consistency.

As we noted in last year’s annual report, we have adjusted how we track and report on complaints and investigation findings to more accurately represent the number of individuals raising privacy concerns.

Since April 1, 2019, when an individual’s complaint about a single matter represents potential contraventions of multiple sections of the Privacy Act, or when an individual complains following multiple access requests made to one institution, we track and report these as a single complaint. This allows us to provide a more consistent picture of our work related to both the Privacy Act and PIPEDA.

Complaint backlog

Of particular note in 2020-21 was our focus on reducing our backlog of cases over 12 months old. In 2019, our office committed to reducing the backlog of investigations related to the Privacy Act and PIPEDA by 90%.

Notwithstanding the operational challenges of the pandemic, we met and slightly exceeded this target in 2020-21. Our focus on closing the backlog explains why our average treatment time for investigations rose to 16.3 months (compared to 12.4 months last year).

We achieved these results in part through temporary funding our office received in 2019, as well as by enhancing our processes and continuing to capitalize on effective enforcement tools such as early resolution.

Time limit investigations: Deemed refusals

We have also continued to use the deemed refusal approach for time limit complaints, as described in our 2018-19 and 2019-20 annual reports.

Under the deemed refusal approach, if a federal institution does not grant access to a complainant’s personal information within a set period of time, our office will rule that there has been a deemed refusal, enabling the matter to be brought before the Federal Court. The deemed refusal process generates positive outcomes for the privacy rights of individuals, encouraging institutions to respond to access requests in a more reasonable time frame and empowering complainants to pursue the matter in Federal Court when institutions do not.

This process continues to be an efficient way of resolving complaints. Our deemed refusal approach has significantly sped up time limit investigations where we face resistance or a lack of responsiveness from institutions, ensuring an investigation will not exceed a year.

Time limit investigations treatment times
Fiscal Year    Average treatment time in months
2020-21        5.04
2019-20        7.50
2018-19        6.98
2017-18        6.28

Early resolution

Our Compliance, Intake and Resolution Directorate, of which the early resolution and intake team is a part, is responsible for ensuring that straightforward complaints are resolved as efficiently and effectively as possible.

Early resolutions apply to situations where an issue is resolved to the satisfaction of the complainant early in the investigation process and our office does not issue a finding. Early resolution continues to be an integral part of our operations.

We note that 2020-21 was a high-water mark for early resolution: we closed more than half of all complaints through this process.

Percentage of complaints closed in early resolution
Fiscal Year    Percentage of all complaints closed in early resolution
2020-21        52%
2019-20        25%
2018-19        32%
2017-18        37%

In addition to maximizing our use of early resolution for Privacy Act complaints, we continue to use summary investigations to expedite the conclusion of investigations into straightforward matters. Summary investigations are shorter investigations that conclude with the issuance of a brief report or letter of findings. During this last year, we have made changes within our case management system to capture our use of summary investigations, which will allow us to better measure their use.

Early resolution success story:

Employee exit procedures reviewed and reinforced by Indigenous Services Canada to prevent risk of inappropriate access to employee work email accounts

A temporarily reassigned employee submitted a Privacy Act complaint against Indigenous Services Canada (ISC) after they learned that former colleagues were able to access their ISC work email account.

The employee reported this incident to ISC and submitted their complaint to our office. We contacted the complainant who agreed to attempt an Early Resolution of this matter outside of a formal investigation.

After we notified ISC of this complaint, the department completed an internal review to assess the scope of the reported incident and followed up with corrective action to limit access to the employee’s work email. This included changing the work email password as recommended by their IT department.

The ISC indicated that the request and approval for access to the employee’s email went through the appropriate channels. It explained that managers require email access to gather information on working files that a departed employee may not have saved to a central repository. Further, the ISC noted that while it was standard practice to share an employee’s work email inbox with their manager during an extended absence, sharing access with other ISC employees would have been outside the scope of normal operations. The complainant’s email inbox was shared only with their manager, and not with other employees.

The complainant requested, and ISC management agreed, to review and remind staff about related e-mail security policies and procedures. To mitigate any future risk, the ISC management team also reviewed exit procedures with all directorate employees to reinforce the requirement to store all relevant Government of Canada information in the ISC central document repository.

As per procedure, file records from departing employees are to be reassigned to active employees, so that this information is still accessible to ISC post-departure. Finally, the management team has communicated to their employees that government email accounts are not personal email accounts and should be treated accordingly.

Through the early resolution process, the complainant and the ISC were able both to address the complainant’s concerns and to mitigate the risk of recurrence through awareness and email hygiene training for management and staff alike.


Compliance monitoring unit

We often make recommendations to institutions in our investigative reports of findings, but it is incumbent on those organizations to implement our recommendations. Our compliance monitoring unit follows up with federal institutions to verify whether the recommendations we have made during key investigations under the Privacy Act are being addressed. This allows us to assess whether federal institutions are meeting their commitments to our office and to Canadians.

In 2020-21, 10 complaints were directed to the compliance monitoring unit. This included, for example, complaints against DND, Correctional Service Canada (CSC), ESDC, the CRA and the RCMP.

Statistics Canada follow up to ensure project redesign respects privacy principles

In 2018, we investigated allegations surrounding 2 administrative Statistics Canada data projects involving the collection of credit histories and the proposed mass collection of line-by-line financial transaction information from banks without the knowledge or consent of affected individuals.

Our office had received more than 100 complaints about the projects. We found the concerns raised by Canadians were clearly justified given the scale of the proposed collection, the highly sensitive nature of the information and the fact that the information in question would paint an intrusively detailed portrait of a person’s lifestyle, consumer choices and private interests.

Our investigation concluded that, had Statistics Canada proceeded with the collection of financial transaction data, this would have exceeded its legal authority to collect personal information and we expressed significant privacy concerns regarding the necessity and proportionality of both projects.

In response, Statistics Canada agreed to follow our recommendations not to implement the projects as originally designed and to work with our office to redesign the initiatives to respect the principles of necessity and proportionality.

Statistics Canada has made progress on this front. In particular, it is now proposing a 3-phase approach to the projects. The first phase will test for feasibility, which will inform the shape of the final projects. It has also reduced the amount of personal information to be gathered in the feasibility phase.

During our engagement, Statistics Canada also implemented a number of privacy-enhancing measures, such as the creation of internal and external ethics bodies and a research initiative that will lead to the development of a sensitivity scale to help guide managers as they gather data for statistical programs. As well, the Statistics Canada website now provides more information related to privacy and the collection of data.

Despite the progress, we found project plans lacking in a number of areas, including:

  • the projects’ public goals are not described with a level of specificity and precision commensurate with the privacy impacts;
  • the effectiveness of the projects has not yet been demonstrated, making it difficult to assess whether less intrusive means were given sufficient consideration; and
  • privacy impacts have not been given sufficient consideration.

We made a number of recommendations to Statistics Canada to address these outstanding areas of concern and believe a course correction is still possible. We have asked Statistics Canada to resubmit project plans for our review.

Summaries of key reviews and investigations

Workplace issues
An institution shared an employee’s sensitive personal information without authorization

We investigated a complaint against a federal institution by an employee who alleged that her privacy was breached when the institution shared with her supervisor and colleagues the fact that she is transgender, despite an explicit expectation of confidentiality.

The disclosure in question related to a request by the complainant to transfer to another position because she had experienced workplace harassment and discrimination relating to her gender identity.

She asked that the reasons for her transfer not be shared with her new supervisors or coworkers, and her employer assured her that the matter would be dealt with in a discreet and confidential manner, given the sensitive nature of her case.

Nevertheless, when she started in her new position, it was clear to the complainant that her new supervisor and several of her new co-workers were aware she was transgender and knew of the reasons for her transfer.

The institution conducted an internal review of the matter and determined that some of the managers involved in effecting the complainant’s transfer disclosed the reasons for it to several staff members because they thought it necessary to support her and her new supervisor. The internal review ultimately found that this was done in error, and was not handled in accordance with the institution’s internal policies.

We found that these actions clearly contradicted the explicit expectation of confidentiality assured by the institution. The institution did not argue that the disclosure was permissible under any exceptions to disclosure under subsection 8(2) of the Act.

Accordingly, we found the complainant’s personal information had been disclosed without consent, contrary to subsection 8(1) of the Act.

In response to this clear privacy breach, the institution stated that it recognized the need for better adherence to policies and procedures, as well as further transgender awareness education. We recommended that the institution act quickly to update its policies and procedures with the goal of preventing a similar occurrence in the future. The institution responded by creating a new guidance document to support its transgender employees and making it available to staff so they can better understand their roles in protecting the privacy and confidentiality of transgender employees.

Accordingly, the complaint was well-founded and resolved.

Intersection between public and private sectors

The line between private businesses and government institutions is becoming increasingly blurred, given the interconnectedness of the digital economy and the ease with which information is generated, used, and disclosed.

We are increasingly seeing issues that span both public and private sectors, raising important privacy questions and illustrating the need for some consistency in updated privacy legislation. For example, this intersection appeared in our investigation of the RCMP’s collection of information from Clearview AI.

Investigation into the RCMP’s use of Clearview AI facial recognition technology

In June 2021, our office tabled a Special Report to Parliament to share our findings in an investigation regarding the RCMP’s use of a facial recognition technology database created by Clearview AI, a technology company that was itself the subject of a previous OPC investigation.

As described elsewhere in this report, Clearview AI was earlier found to have violated Canada’s federal and provincial private sector privacy laws by creating a databank of more than 3 billion images scraped from internet websites without users’ consent.

Following our investigation of the RCMP under the Privacy Act, we affirmed that a government institution cannot collect personal information from a third party if that third party collected the information unlawfully.

Our investigation found that the RCMP’s use of Clearview AI’s facial recognition technology to conduct hundreds of searches of a database compiled illegally by Clearview AI was a violation of the Privacy Act.

We also found serious and systemic gaps in the RCMP’s policies and systems to track, identify, assess and control novel collections of personal information through new technologies.

The RCMP agreed to implement our recommendations to improve its policies, systems and training. This includes conducting fulsome privacy assessments of third-party data collection practices to ensure any personal information is collected and used in accordance with Canadian privacy legislation.

The RCMP also agreed to create a new oversight function intended to ensure new technologies are on-boarded in a manner that respects individuals’ privacy rights.

The case was another example of how public-private partnerships and contracting relationships involving digital technologies are creating new complexities and risks for privacy.

While the OPC maintains the onus was on the RCMP to ensure the database it was using was compiled legally, the RCMP argued doing so would create an unreasonable obligation and that the law does not expressly impose a duty to confirm the legal basis for the collection of personal information by its private sector partners.

Our view is that activities of federal institutions must be limited to those that fall within their legal authority and respect the general rule of law. The Commissioner encouraged Parliament to amend the Privacy Act to clarify that federal institutions have an obligation to ensure that third-party agents from which they collect personal information have acted lawfully.

In an effort to provide some clarity to police agencies that are increasingly looking to facial recognition technology to solve crime or find missing persons, our office, along with our provincial and territorial privacy counterparts, issued draft guidance to assist police in ensuring any use of facial recognition technology complies with the law, minimizes privacy risks and respects privacy rights.

We launched a public consultation to help establish clearer rules and consider whether new laws are desirable.

The nature of the risks posed by facial recognition technology calls for collective reflection on the limits of acceptable use of the technology. Canadians must be free to participate in the increasingly digital, day-to-day activities of a modern society without the risk of their activities being routinely identified, tracked and monitored. We believe that it is necessary to carefully consider issues related to facial recognition technology as Canada looks to modernize federal privacy laws.

Further Reading

RCMP commits to privacy improvements for vulnerable sector checks

Three individuals filed complaints after the RCMP included non-conviction information about them during “vulnerable sector” checks, which may be required when applying for certain jobs or volunteer positions. The RCMP had indicated on completed vulnerable sector check forms that it had located “adverse information” concerning the complainants. This information related to occurrences in which the complainants were either not charged or not convicted of an offence. In one case, the complainant had a mental health crisis that involved police intervention.

We found that the RCMP failed to obtain informed consent from the complainants for the vulnerable sector checks in 2 of the 3 cases. In particular, the consent forms signed by the complainants suggested that “adverse information” was limited to information about suspects who were charged with an offence – which the 2 complainants had not been.

Further, we found that the issue of consent did not address all of the privacy issues raised by the complainants. Applicants for jobs or volunteer opportunities that require vulnerable sector checks have little choice but to accept the terms offered by the RCMP. While we are of the view that non-conviction information can be relevant to vulnerable sector checks and can serve a legitimate public purpose in some cases, the RCMP’s policy allowing for the broad use of non-conviction information also raised concerns about the presumption of innocence and the stigmatization of mental health issues.

Although these concepts are not legal requirements under the Act, we assessed the RCMP's practices by applying the concepts of necessity, effectiveness, proportionality and minimal intrusiveness. We accepted that in some cases, non-conviction information may be relevant and recent enough to merit disclosure to a potential employer or volunteer organization. Non-conviction information can, in certain circumstances, be effective in meeting the important need of assessing the risk posed by an individual working with vulnerable persons.

Nevertheless, we were not persuaded that the loss of privacy entailed by the RCMP’s policy of reporting non-conviction information broadly, including mental health incidents, in vulnerable sector checks was proportional or minimally intrusive. We noted that other jurisdictions in Canada, including Alberta and Ontario, had adopted more restrictive criteria for the use of non-conviction information in vulnerable sector checks.

In response to our investigation and recommendations, the RCMP agreed to revise its vulnerable sector check consent form within 6 months. The RCMP will provide more specific details about the types of information that will be considered during a vulnerable sector check and will inform applicants of their right to request an independent review of a decision to include non-conviction information.

Additionally, the RCMP committed to implementing a policy of only considering non-conviction information relevant if it meets certain specific criteria for “exceptional disclosure.” The policy will also specify that mental health related occurrences should not be considered relevant unless they meet the criteria for exceptional disclosure. We are monitoring the RCMP’s implementation of these measures.

We therefore consider this matter to be well-founded and conditionally resolved.

Investigation finds no privacy contraventions in administration of Canada Student Service Grant program

ESDC funded the WE Charity Foundation, through a contribution agreement, to implement the Canada Student Service Grant program. The contribution agreement was with the WE Charity Foundation but permitted the participation of other WE Charitable entities (such as WE Charity Canada, WE Well-being Foundation, and ME to WE Foundation) in the delivery of the program.

The now-cancelled grant program encouraged young people to take part in service activities to gain experience and support the COVID-19 response.

In the course of administering the program, WE Charity Canada collected the personal information of student applicants. The information was not shared with ESDC.

We received complaints against the WE Charitable entities (WE Charity Foundation, WE Charity, WE Well-being Foundation, and ME to WE Foundation) under PIPEDA and against ESDC under the Privacy Act. The complaints noted that the personal information appeared to be stored outside of Canada, and that the website’s terms appeared to allow collected information to be shared with partners for unclear purposes.

One of the complaints against ESDC concerned the outsourcing of the collection and use of students' personal data by the WE Charitable entities under the auspices of a government initiative, and the potential impacts on the security of that data and its disclosure to third parties.

In our investigation of these complaints, we found no indication that personal information had been shared or used for purposes other than administering the Canada Student Service Grant program, or any other indications of inappropriate handling. Neither PIPEDA nor the Privacy Act prevents the storage of this type of personal information outside of Canada.

Given the lack of indications of inappropriate handling of the personal information in question, we discontinued our investigation under PIPEDA and closed our investigation under the Privacy Act. In doing so, we refrained from making a finding as to whether the Privacy Act applied to ESDC in this matter. However, we noted there may be important privacy risks to Canadians, and related Privacy Act considerations, when activities funded by the Government of Canada for the purposes of implementing a government program involve the collection, use, and disclosure of personal information. We encouraged ESDC to carefully consider privacy risks when contemplating similar arrangements in the future and to consult with our office in advance of undertaking such initiatives to ensure that both the spirit and the letter of the Privacy Act are respected.

Statutory Review of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act

Our office completed its fourth biennial review of measures taken by the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC) to protect the information it receives or collects, as required by the Proceeds of Crime (Money Laundering) and Terrorist Financing Act. The review objective was to assess whether FINTRAC has appropriate controls in place to protect the personal information it collects. The review examined governance, risk management and control practices for managing the security of personal information at FINTRAC and assessed progress made by FINTRAC in response to our 2017 report.

While our review found some areas of strength, we are concerned that several recommendations made in our 2017 report remain outstanding, despite commitments by FINTRAC to implement them. The outstanding commitments relate to: (i) collection and retention of personal information not meeting reporting thresholds, and (ii) monitoring of activity logs to identify security risks. We have called on FINTRAC to address these issues on a priority basis.

Privacy Act breaches

Our office continues to be concerned with breach reporting practices in the public sector, where breach reporting is mandatory only at the policy level, unlike in the private sector, where it is required by law.

We remain convinced that under-reporting by federal government organizations represents a systemic problem.

We saw an 18% decline in reports received this year (280 in 2020-21 compared to 341 in 2019-20).

Just one institution, ESDC, made up 59% of reported breaches in 2020-21. In fact, breach reports from ESDC have accounted for at least half of all breaches reported in the public sector since 2017-18. It is evident that ESDC has invested in privacy awareness and developed breach reporting procedures. We would encourage ESDC to continue its efforts in this regard, and for other institutions to do the same. However, many other government institutions handle large amounts of personal information, or highly sensitive information, but do not report breaches to our office.

As discussed in last year’s annual report, we are concerned that several large institutions have been conspicuously absent from the breach reports we receive.

In addition, we continue to see few reported breaches involving cyber attacks, although we know that government institutions – which are increasingly using digital tools and delivering online services that involve personal information – are not immune to them. The total number of such reports received this past year was just 9, compared with 5 in 2019-20. Despite this increase, we are concerned that overall reports of cyber attacks to our office remain low. This is especially evident when comparing these numbers to breach reports under PIPEDA, where 42% of breaches over the previous year were related to cyber attacks.

In fact, we have an ongoing investigation into the cyber attack on the GCKey system, which is used by approximately 30 government departments.

Breach reports are important tools – they allow us to ensure that organizations are putting in place appropriate measures to reduce risks to Canadians following a breach, as well as to prevent future incidents. They are a valuable source of information to inform how our office, and other stakeholders, can best address privacy risks.

Over the years, our office has repeatedly called for mandatory privacy breach reporting under the Privacy Act to help combat systemic under-reporting in the federal public sector. We are pleased to see that this change is being proposed as part of Justice Canada’s proposals for Privacy Act modernization.

In the meantime, we continue to work with TBS on updating our breach reporting form as a way of providing increased clarity to institutions on reporting practices.

PSPC breach affects almost 70,000 public servants

We received a breach report from Public Services and Procurement Canada (PSPC) about disclosures of pay-related information for about 70,000 public servants.

Upon receiving related complaints, we launched an investigation to determine whether the disclosure was a contravention of the Privacy Act. We also examined how PSPC responded to the breach.

Our investigation revealed that the breach resulted from a preventable, even predictable, error when producing reports about overpayments to public servants. The error led to mismatched data in a spreadsheet, which in turn led to PSPC sending reports with information about individuals in other departments to 61 institutions. We concluded that the disclosure of the information in these reports was in contravention of obligations under the Privacy Act.

In response, PSPC put in place measures to prevent a recurrence of this type of breach, which include verifying the data when producing the reports and removing employees’ home addresses from certain reports.

We were encouraged by how promptly PSPC responded to the breach, issuing breach notices directly to the large number of affected public servants. For that notification process, PSPC relied on cooperation from the home departments of the affected public servants as well as the departments that received information in error.

We concluded that the matter was well-founded and resolved.

We are pleased by the manner in which PSPC responded to this particular breach. Nonetheless, it is concerning that the institution did not appear to learn from our 2017 investigation of the Phoenix pay system that it must treat pay information with due care.

Review of passport management practices by 4 institutions

IRCC is accountable for the Passport Program, which issues Canadian passports in collaboration with partner institutions.

Every year, our office receives reports of passports lost or stolen from federal institutions. This prompted us to examine the measures that IRCC has put in place to protect passports. We also examined the practices of 3 other partner institutions involved in the issuing of passports – ESDC, Global Affairs Canada (GAC) and Canada Post Corporation (CPC).

We found that the 4 institutions generally had reasonable measures in place to prevent unauthorized disclosures of passports.

However, we found a few areas for potential improvement.

In terms of remediation for individuals, we found that lost and stolen passports were not consistently deemed "material" breaches (that is, serious enough to be reported to our office and the Treasury Board of Canada Secretariat), despite the persistent risk of identity theft that could ensue from a passport falling into the wrong hands. Given the information a passport contains and its intrinsic sensitivity and value, we see any loss or theft of a passport as a material breach that should be reported to our office, and to the affected individuals when they are unaware of it.

We also found that individuals were, on average, not notified for months, and that individuals were not offered concrete assistance, such as credit monitoring, to help them manage the risk of identity theft.

To address this, we recommended that GAC, IRCC and ESDC jointly implement and provide:

  • consistent written guidance on how to assess whether a passport breach is “material”
  • reasonable service standards for timely notification to affected parties of lost or stolen passports
  • appropriate advice, and consistently offered mitigation measures such as credit monitoring, to help protect affected individuals from the longer-term risks of identity theft

In terms of lessons learned after passport breaches, we found that IRCC monitors passport incidents reported to it, and reviews incidents when there is a marked deviation from the norm.

However, ESDC, the department reporting most of the breaches, could provide no indication that it collects information aimed at reducing the risk of recurrence for incidents that occur while passports are in transit through the mail with Canada Post. As a result, it is unclear what data the IRCC Passport Program uses for passport incident analysis, and how the results are communicated to relevant stakeholders to reduce the risk of unauthorized disclosures.

To address this, we recommended that IRCC strengthen the current incident assessment and investigation processes and inform the relevant stakeholders accordingly.

The Personal Information Protection and Electronic Documents Act: A year in review

Our office's work under PIPEDA in the past year included a significant focus on breaches, as detailed below, which have been reported to us in increasingly high numbers since reporting became a legal requirement. We are particularly concerned by a growing trend in private sector breaches involving ransomware attacks. To keep up with the high volume of reports, we launched technological solutions to make it easier for organizations to report incidents to our office. In the spirit of ongoing efficiency, greater certainty for businesses and better service to Canadians, we used a variety of tools to conclude investigation files as effectively as possible, closing more than 70% of PIPEDA files through early resolution.

As noted earlier, with the help of federal funding, we met and surpassed our backlog reduction goal of 90% (which is why the treatment times for some files are longer this year), and we anticipate that the investment and the efficiency lessons learned will lead to much shorter treatment times going forward. As we did in the public sector, we were pleased to be in a position to provide proactive advice to organizations to improve their privacy practices.

The year marked some important collaborations with a wide variety of partners, including our provincial and territorial colleagues, with whom we conducted our highest ever number of joint investigations. It was also the first time a joint investigation was conducted by our office together with all 3 provincial offices with substantially similar legislation. We look forward to more collaborative initiatives such as these, generating superior privacy protections and benefits for Canadians and organizations.

The following section highlights key initiatives under PIPEDA in 2020-21.

Spotlight on breaches

Breaches continue to be a significant area of concern for our office. Last year, our office received 782 breach reports, affecting at least 9 million Canadian accounts.

This represents a 15% increase in reports received over the previous year. Since mandatory breach reporting obligations came into effect in 2018 under PIPEDA, our office has seen a 600% increase in reports.

The continuing large volume and scope of breach incidents is extremely concerning. Theft and unauthorized access to individuals’ personal information can result in serious harms, like fraud, financial loss, reputational damage, and identity theft. Despite the serious stakes, we continue to see too many cases, like in the Desjardins and BMO breaches discussed below, where even large sophisticated organizations handling vast amounts of highly sensitive information have failed to implement information security safeguards that adequately protect the personal information in their care.

The majority of breach reports received by our office continued to involve 3 main industry sectors, with 22% from the financial sector, 14% from telecommunications and 10% from sales and retail. This split has remained consistent over the last several years.

Top 5 sectors by percentage of total breaches reported
Industry sector 2017-18 2018-19 2019-20 2020-21
Financial 23% 22% 19% 22%
Telecommunications 6% 17% 17% 14%
Sales/Retail 9% 18% 14% 10%
Insurance 6% 8% 11% 9%
Services 9% 5% 9% 6%

The leading cause of reported breaches was unauthorized access (64%), which includes, among others, external actors gaining access to systems through malware, ransomware or social engineering. It also includes scenarios where employees viewed information without authority and used the information for inappropriate purposes.

We also saw that 28% of breaches were caused by unauthorized disclosures, including employee errors involving misdirected communication and disclosures resulting from a failure of technical safeguards and system vulnerabilities.

Our office continues to see an elevated proportion of incidents originating from cyber attacks, with 42% (328) of the breaches reported in 2020-21 attributed to malware, ransomware, password attacks, credential stuffing attacks, and other cyber threats. Of particular concern are the ransomware attacks and credential stuffing attacks described further in the following section.

Percentage of breaches reported by type
Breach Type 2017-18 2018-19 2019-20 2020-21
Unauthorized disclosure 29% 26% 21% 28%
Unauthorized access 52% 57% 59% 64%
Theft 16% 9% 9% 5%
Loss 3% 9% 11% 3%*
* Figures may not add to 100 due to rounding

Ransomware and other attacks

Cyber-attackers' methods are ever evolving, and it is important that organizations remain vigilant and implement safeguards that protect against common and emerging online threats like ransomware and credential stuffing.

Ransomware attacks occur when malicious actors gain access to an organization's system and encrypt its data. Once the data is encrypted, the attackers demand a ransom in exchange for restoring the organization's access, and often threaten to release the data on the dark web if the ransom isn't paid.

Paying the ransom does not guarantee the files will be decrypted or recovered. In some cases, attackers will continue to demand more money without giving anything in exchange. Furthermore, paying the ransom encourages cyber criminals to continue infecting devices with ransomware: it shows that the crime pays and that threat actors can use it to generate revenue.

Even if the files are released, organizations still have to deal with a data breach. The attacker had access to the files, most likely made copies, and could leak them or use them to try to access other exposed online accounts.

Ransomware is reportedly on the rise globally as cyber criminals continue to exploit the increased use of digital transactions and new commercial online platforms, particularly since the pandemic.

We began to collect information about these types of attacks in 2020-21. Reports we received reflect the global trend, with 101 ransomware attacks reported to our office, representing 20% of all breaches related to unauthorized access.

It would appear that no sector is immune. While the not-for-profit and professional sectors were hit hard, such attacks were also felt across the financial, transportation, manufacturing and retail sectors, which together accounted for 27% of all ransomware-related breach reports submitted to us. These attacks compromised sensitive personal information, including social insurance numbers (SINs) as well as financial and credit information, and impacted 54,962 Canadian accounts. In one case, an unauthorized actor was able to gain access to and encrypt an organization's system, successfully exfiltrating the personal information of approximately 12,000 Canadian accounts.

Credential stuffing attacks are also on the rise globally. These occur when bad actors take advantage of valid credentials (usernames and passwords) obtained through unrelated breaches and sold, often on the dark web, and then use them to gain access to user accounts on other online sites. These attacks succeed when individuals reuse the same or similar usernames and passwords across multiple online services, and when organizations do not require users to create strong passwords, or fail to implement and keep up with emerging technology for user authentication, such as multi-factor authentication.

Successful attacks can give bad actors access to personal or corporate accounts. Attacks on personal accounts can lead to identity theft, as bad actors gain access to more personal identifying information. Fraud, financial loss and reputational damage are some of the common harms caused by credential stuffing. When successful against organizations, these attacks can lead to ransomware attacks, data leaks and the compromise of personal information.
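One widely used defence against credential stuffing is to screen passwords, at registration or sign-in, against corpora of credentials already exposed in earlier breaches. As a minimal sketch, the Python below checks a candidate password against the public Pwned Passwords range API using its k-anonymity lookup; the surrounding policy (when to run the check and what to do on a match) is an assumption added for illustration, not a practice drawn from this report.

```python
# Minimal sketch: screen a password against known breach corpora via the
# Pwned Passwords k-anonymity range API. Only the first 5 hex characters
# of the SHA-1 hash ever leave the client, so the full password is never
# disclosed to the service. The policy around the check is illustrative.
import hashlib

import requests


def breach_count(password: str) -> int:
    """Return how many times this password appears in known breaches."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    # The response is one "HASH_SUFFIX:COUNT" pair per line.
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0


if __name__ == "__main__":
    # A hypothetical registration flow would reject any password with a hit.
    hits = breach_count("password123")
    print("reject" if hits else "accept", hits)
```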

Given the extent of the harm that can result from this type of attack, in our view these incidents meet the threshold for creating a real risk of significant harm (RROSH). The RROSH test is set out in PIPEDA and triggers requirements to report the breach to our office and notify affected individuals.

Our office has started to receive reports of attacks using stolen credentials, submitted from various sectors including the retail, financial and telecommunications industries. More than 60,000 Canadians were affected by these attacks, and the compromised data included highly sensitive personal information such as SINs, dates of birth and banking details. Our office is continuing to monitor this trend and has developed guidance for businesses and consumers.

Given the global concern with respect to credential stuffing, a collaborative compliance initiative was commenced through the Global Privacy Assembly’s International Enforcement Cooperation Working Group, of which the OPC is a chair. The aim of this initiative is to conduct research on credential stuffing and develop guidance to assist both organizations and the public in preventing and/or mitigating the risk from such attacks.

Credential stuffing is not the only way that bad actors can cause harm. Identity theft can also be the result of malicious employees deliberately misusing personal information and insufficient technological safeguards to monitor and detect threats. As the investigation summaries further in the report illustrate, organizations need to develop and enforce strong administrative safeguards, such as policies and training, and implement technical safeguards to protect the personal information of their customers.

Breach reporting and law reform

Breach reporting obligations are legislated in the private sector. Our experience dealing with mandatory breach reporting helped us provide well-informed recommendations in this area in our submission on Bill C-11.

We advocated for legislated timelines for breach reporting under a reformed law. Our office relies on timely and accurate information from organizations to assess the level of risk a specific breach presents and to ensure that appropriate mitigating measures are applied.

However, timeliness of reporting remains an ongoing challenge. It is not uncommon for breach reports to be received long after the breach was first detected.

In 2019, we received about 31% of breach reports more than 3 months after the breach occurred. In 2020, this figure increased to approximately 40% of reports received more than 3 months following detection.

The pandemic may have contributed to some of the delays we saw this year. However, the issue of delayed reporting is concerning. The sooner we are notified, the sooner we can ensure that a breach is properly contained and managed and that appropriate actions have been taken.

In our Bill C-11 submission, we recommended a standard by which breaches should be reported without unreasonable delay, but also within a defined time period. Globally, this period can vary, for example from 72 hours (the EU under the GDPR; Egypt; Philippines; Uruguay) to 5 days (Costa Rica) to 14-15 days (Indonesia, Colombia). We recommended a middle ground: reporting breaches to us without unreasonable delay, but within 7 calendar days.

Bringing efficiencies to our breach processes

Our office has worked to implement a number of measures to streamline processes to ease reporting and expedite our review of breach reports, allowing us to give more timely feedback to institutions.

In the last year, our office launched a secure portal for reporting breaches that allows businesses to easily submit their breach reports and instantly receive a file number, which facilitates future communication regarding the breach.

Following the publication of a report from our first breach record inspections in 2019, our office published new resources to help businesses manage breaches and follow mandatory reporting and other requirements related to the safe storage of personal information. This includes a new series of easy-to-follow videos designed to help businesses generate discussions with their staff on what they should do to protect the personal information of customers, clients and their own employees, and to ensure their business is prepared in the event of a breach.

Organizations subject to PIPEDA are required to report to the Privacy Commissioner of Canada breaches of security safeguards involving personal information that pose a real risk of significant harm (RROSH) to individuals. To assist with our assessment of compliance with this reporting requirement, our office began the process of developing an innovative tool, based on risk science, to assess harm in breaches. The tool, which is scheduled for launch in 2021-22, considers factors such as the sensitivity of personal information involved, and the probability that the information has been, is being, or will be misused.
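The OPC's tool itself is not public, but the factors described above (the sensitivity of the information and the probability of misuse) lend themselves to a simple scoring illustration. The sketch below is a hypothetical, much-simplified screen: the categories, weights and threshold are invented for illustration and do not represent the OPC's actual methodology.

```python
# Hypothetical illustration of a RROSH-style screen. All weights,
# categories and the threshold are invented for this sketch.
from dataclasses import dataclass

# Rough sensitivity weights by category of personal information.
SENSITIVITY = {"contact": 1, "financial": 3, "sin": 4, "health": 4}


@dataclass
class BreachReport:
    data_types: list[str]   # categories of personal information exposed
    malicious: bool         # theft or cyber attack vs. accidental loss
    contained: bool         # has the information been recovered or destroyed?


def rrosh_indicated(report: BreachReport) -> bool:
    """Crude screen: does the breach likely pose a real risk of significant harm?"""
    sensitivity = max(SENSITIVITY.get(t, 2) for t in report.data_types)
    probability = 0.9 if report.malicious else 0.4
    if report.contained:
        probability *= 0.5
    return sensitivity * probability >= 1.5  # illustrative threshold


print(rrosh_indicated(BreachReport(["sin", "financial"], malicious=True, contained=False)))  # True
```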

Further Reading

Breach investigations

Security deficiencies at Bank of Montreal lead to large-scale breach

Our office received complaints from 2 Bank of Montreal (BMO) customers alleging the bank had failed to adequately safeguard their personal information and that a breach of its systems had resulted in sensitive personal information falling into the hands of malicious third parties and being posted online.

Between June 2017 and January 2018, a vulnerability in BMO’s in-house online banking application allowed attackers to breach security safeguards, take over individual online accounts and collect personal information associated with those accounts. These breaches occurred in 2 waves.

In the first wave, attackers stole personal information from BMO's systems between June 2017 and November 2017, breaching approximately 36,000 accounts during this period without being detected.

BMO became aware of the vulnerability only after the larger second wave, a major cyber attack against its systems in December 2017, in which an attacker initiated a mass takeover of approximately 76,000 accounts.

However, even after identifying and patching the vulnerability, BMO remained unaware of the full extent of the attack. It was not until May 2018, when it received a ransom letter from an attacker, that BMO realized the large number of compromised accounts and the fact that personal information had been stolen.

In total, some 113,000 BMO customers were affected by the breach. The attackers collected personal information, including customers’ contact details as well as banking history. More than half of these customers also had their social insurance numbers and dates of birth compromised. After BMO refused to pay a ransom, the attackers posted the personal information of more than 3,000 customers on various public websites.

Our investigation identified significant deficiencies in BMO’s security practices and tools, which contributed to the breach, including:

  • inadequate software developer security testing and evaluation – resulting in the online banking application being deployed with a critical, high-risk vulnerability in place
  • inadequate vulnerability management – leaving the vulnerability undetected for approximately 6 months, and the impact to customers’ personal information unknown for a further 6 months
  • inadequate oversight and monitoring – resulting in major gaps in BMO’s monitoring and detection systems that contributed to the long exposure period, the delays in the detection of a major cyberattack and BMO’s failure to grasp the scope of the attack for several months

We noted that with proper monitoring, the first wave of data thefts would have been detected and mitigated, and the vulnerability patched much sooner. This could have allowed BMO to avoid the December 2017 breach of approximately 76,000 accounts altogether. Further, BMO lacked a solution for managing bots, a common method of cyber attack, leaving it vulnerable to the December 2017 attack.

BMO was responsible for highly sensitive personal information, including information that could be used by malicious actors to perpetrate identity theft. As a result, the strength of its security safeguards should have been higher. BMO failed to meet this standard.

During the course of our investigation, BMO made a variety of substantial changes to its policies, procedures, resourcing and technical safeguards to improve security and prevent future breaches. We determined that these improvements resolved the deficiencies we identified. As a result, we considered the matter to be well-founded and resolved.

Combination of weaknesses led to massive data breach at Desjardins

In December 2020, along with the Commission d’accès à l’information du Québec, we released the results of our investigations of Desjardins. We found that a series of gaps in administrative and technological safeguards at Desjardins allowed a malicious employee to inappropriately access and/or exfiltrate the data of close to 9.7 million individuals, making this incident the largest data breach ever in the Canadian financial services sector.

The compromised personal information included first and last names, dates of birth, social insurance numbers, residential addresses, telephone numbers, email addresses and transaction histories.

The breach, which spanned a period of at least 26 months, raised the question as to whether Desjardins’ security safeguards were appropriate and whether it met accountability requirements with respect to the personal information entrusted to it.

Our investigation revealed that Desjardins had failed to meet several of its obligations under PIPEDA. For example:

  • Desjardins failed to ensure the proper implementation of its policies and procedures for managing personal information
  • Employee training and awareness were lacking in light of the sensitive nature of the personal information the organization was entrusted with
  • Desjardins had not implemented retention periods or procedures regarding the destruction of personal information
  • From a technological standpoint, the access controls and data segregation of the databases and directories were inadequate

Desjardins agreed to implement our recommendations to improve its information security and protection of personal information, including its data destruction practices. It committed to providing progress reports to our office every 6 months. The financial institution also agreed to engage external auditors to assess and certify its programs and to submit an assessment report to our office. Regarding our recommendations on the retention schedule and the destruction of personal data, Desjardins proposed a plan spanning 18 months, to be completed by June 2022. Our office is monitoring Desjardins' progress, and will continue to do so until it has demonstrated that it has met the terms of the recommendations outlined in our final report.

Further Reading

PIPEDA enforcement

General complaint and investigations statistics and trends

In 2020-21, our office accepted 309 complaints under PIPEDA, a 7% increase over the 289 complaints accepted in 2019-20.

As in the previous year, we received the most complaints against businesses in the financial (24%), service (15%) and telecommunications (10%) industries. Access (34%) remained the issue most complained about by individuals, followed by use and disclosure of personal information (23%), safeguards (14%) and collection of personal information (14%).

As noted earlier, for complaints that have been submitted using our office’s online complaint form, we continue to find efficiencies by better directing complainants and requesting necessary documentation and information at the outset of the process, reducing the back-and-forth with complainants and respondents.

Similar to what we experienced with Privacy Act complaints, we initially received fewer PIPEDA complaints at the start of the pandemic, but this was only temporary as individuals, businesses and our own office adapted.

Our office closed 296 complaints in 2020-21, a 7% decrease from the 318 closed in 2019-20.

Early resolution

As with its application under the public sector law, early resolution continues to be an invaluable tool for resolving straightforward complaints. Complainants typically see an outcome in a few months, compared to a much lengthier formal investigation. We appreciate that many organizations work with our office to resolve matters up front, to the mutual satisfaction of all involved parties, without the need for a full investigation.

We closed 71% (or 210) of all PIPEDA complaints using early resolution in 2020-21, the highest proportion ever.

In addition to maximizing our use of early resolution, we continue to use summary investigations to conclude straightforward matters.

Percentage of all complaints closed in early resolution
Fiscal Year Percentage closed in early resolution
2020-21 71%
2019-20 69%
2018-19 63%
2017-18 66%
Early resolution success stories:

Electronic commerce company addresses gaps in privacy complaint escalation processes

An e-commerce company failed to address an individual’s privacy concerns submitted through its complaint escalation process and failed to acknowledge or respond to the individual’s request for access to their personal information within the legislated timeline.

The complainant had expressed concerns about their inability to opt-out of the collection of proof of identity information that the e-commerce company requested. They also could not close their recently opened online merchant account without first providing that personal information.

After our office contacted the organization about this complaint, the company acted swiftly to respond and resolve all privacy concerns. The company also committed to take steps to review and address gaps identified in the complaint escalation process to ensure that all client queries relating to personal information would be acknowledged and responded to in a more timely and efficient manner.

Financial services company updates forms to remove option to collect beneficiaries’ social insurance numbers

An individual complained to our office about a financial services company’s forms for designating third party beneficiaries.

The company had to transfer an individual’s retirement savings plan contract due to a company merger and asked the individual to complete a form to designate a beneficiary. The form included a mandatory field to input the SIN for any named beneficiary.

We noted that there was no legal or business requirement for the company to collect a beneficiary’s SIN. Our office notified the company of this concern and shared our longstanding guidance on best practices for the use of SINs in the private sector. As a result of the intervention, the company removed the SIN field section from the form.

Further engagement was conducted between the company and our Business Advisory Directorate, which provides advisory services to the private sector and participates in outreach and engagements aimed at encouraging compliance with PIPEDA. These discussions concluded satisfactorily, with the organization accepting our recommendations. The complainant was very pleased with both how promptly we addressed the issue and with the outcome. We considered this matter to be early resolved.


Investigations

Telco employees’ failure to follow policies allowed fraudsters to repeatedly access account

A Fido customer filed a complaint with our office after fraudsters accessed and changed the personal information on his account, through several calls to the company over a number of days, even after the complainant had added a security PIN and secret questions to the account.

We reviewed recordings of 5 phone calls and found that, in each case, fraudsters pretending to be the complainant were granted access to the complainant’s account, even though they failed Fido’s authentication protocols.

In responding to our requests for information, Fido’s parent company Rogers indicated that it had various safeguards in place to prevent unauthorized access.

In our view, the repeated failures by multiple customer service representatives illustrated a systemic safeguards concern.

Although Fido had implemented protocols requiring that employees authenticate callers using multiple authentication methods, the protocols were bypassed by every customer service representative who dealt with the complainant’s account over the period in question.

Our guidelines for identification and authentication note that, for authentication measures to be effective, employees must be aware of and adhere to the policies established to prevent individuals from gaining unauthorized access to customers' personal information. It is important that organizations implement measures to ensure their processes are actually followed, particularly where employees may have motives, such as meeting sales targets, that could tempt them to bypass protocols.
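One way to make such protocols harder to bypass is to compute and log the access decision centrally, in software, rather than leaving it to each agent's judgment. The sketch below is a minimal, hypothetical illustration of that design; the function, the two-check policy and the method names are assumptions, not Fido's or Rogers' actual system.

```python
# Minimal, hypothetical sketch: the grant/deny decision for caller
# authentication is computed and logged centrally, so an individual
# agent cannot quietly waive a failed check. Not an actual telco system.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("caller-auth")


def authenticate_caller(checks: dict[str, bool], required: int = 2) -> bool:
    """Grant account access only if enough independent checks pass.

    `checks` maps an authentication method (PIN, secret question,
    one-time code, ...) to whether the caller passed it. Every decision
    is logged so that bypasses are visible in an audit.
    """
    passed = sum(checks.values())
    granted = passed >= required
    log.info("caller auth: %d/%d checks passed -> %s",
             passed, len(checks), "GRANTED" if granted else "DENIED")
    return granted


authenticate_caller({"pin": False, "secret_question": True})   # DENIED
authenticate_caller({"pin": True, "secret_question": True})    # GRANTED
```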

Fido agreed to implement our recommendations aimed at ensuring that authentication protocols are understood and followed by staff. These recommendations included making staff aware of the consequences associated with not following authentication protocols.

We therefore found the safeguards aspect of the complaint to be well-founded and conditionally resolved.

The complainant also alleged Fido failed to provide him access to transcripts of calls between the fraudsters and the company.

Although we found Fido was not obligated to provide the information regarding the calls in the form of transcripts, the recordings the company allowed the complainant to listen to were of such poor sound quality that we concluded Fido had failed to provide access in a form that was generally understandable.

In the end, Fido agreed to provide transcripts of the calls. We therefore found the complaint with respect to access to be well-founded and resolved.

Further Reading

Default “stay signed in” email feature posed privacy risks for users

Our office investigated a complaint against Yahoo Canada alleging that a setting that kept Yahoo Mail users signed into the email service by default posed privacy concerns for anyone accessing email on a public or shared computer. Our investigation focused on aspects of safeguards and consent related to Yahoo's "stay signed in" setting. We noted that a person's emails can contain highly sensitive information that, if disclosed to a stranger on a public computer or a family member on a shared computer, could cause the individual significant harm. We did not accept Yahoo's assertion that a reasonable person would understand the stay signed in setting to be "on" by default.

We also did not accept that the additional safeguards in place were adequate to mitigate the privacy risk to a user who inadvertently stays signed in on a shared or public computer. These additional safeguards included an algorithm intended to deselect the “stay signed in” setting on public computers and a means of closing an email account remotely.

Our testing found that the algorithm to protect Yahoo Mail users on public computers did not work. Meanwhile, the feature allowing users to close an open Yahoo Mail session by changing their password via another device did not work for Rogers Yahoo Mail users. In any event, this would be available to a user only after realizing they had remained logged in unintentionally, at which point their information may already have been exposed.

On the issue of consent, in our view, given the risks to users, opt-out consent was inadequate. Yahoo was required to obtain express opt-in consent for its stay signed in setting.

We also noted Yahoo had failed to obtain meaningful consent because it did not make clear to users, at the point of deciding whether or not to stay signed in, that any person who subsequently visits Yahoo’s website on that device would be able to access all of the potentially sensitive information in the previous user’s emails.
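Server-side, the difference between opt-out and opt-in persistence can be as small as whether the session cookie is given an expiry by default. The sketch below is a minimal illustration of an opt-in design using a hypothetical session handler; it is not Yahoo's implementation.

```python
# Minimal sketch of opt-in session persistence: with no max-age set, the
# browser discards the session cookie when it closes, protecting the
# next user of a shared or public computer. Names are illustrative.
from http.cookies import SimpleCookie


def session_cookie(session_id: str, stay_signed_in: bool = False) -> str:
    """Build a Set-Cookie header; persistence requires explicit opt-in."""
    cookie = SimpleCookie()
    cookie["session"] = session_id
    cookie["session"]["httponly"] = True
    cookie["session"]["secure"] = True
    if stay_signed_in:
        # Persist only when the user actively opted in.
        cookie["session"]["max-age"] = 14 * 24 * 3600  # 14 days
    return cookie.output(header="Set-Cookie:")


print(session_cookie("abc123"))                       # ephemeral by default
print(session_cookie("abc123", stay_signed_in=True))  # persistent, opted in
```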

Following our recommendations, Yahoo committed to change the stay signed in setting from opt-out to opt-in, and to display clear and prominent information to better inform users about the privacy implications of opting in to the setting. Our office also engaged with Rogers to ensure Rogers Yahoo Mail users will, like other Yahoo Mail users, have a way to force a log-out where the user remains signed in on a public or shared device. Accordingly, we found the complaint to be well-founded and conditionally resolved.

At the time of publishing this Report, we continue to work with Yahoo on the final implementation of our recommendations. We also intend to follow up with other email service providers to share our position and encourage them to review and change, where appropriate, their own practices in line with the recommendations in our report.

Payday loan company requires borrowers to provide online banking passwords

Ontario’s Ministry of Government and Consumer Services alerted our office that a short-term loan company, CashHere, was requesting that its clients provide their online banking passwords, usernames and security questions and answers when applying to borrow money.

We launched a Commissioner-initiated investigation and determined that CashHere was collecting and using banking login credentials for a purpose that a reasonable person would not consider appropriate.

We understand that the requested online banking credentials can help validate identity and obtain income history to make lending decisions and manage loan repayments. However, there are far less privacy-invasive means to achieve these purposes. For example, a lender could examine redacted hard copy bank statements or confirm income with employers.

Furthermore, we found that the potential privacy harms associated with applicants providing CashHere with access to their full financial statement history, as well as the unfettered ability to make financial transactions on their accounts, were not proportionate to the commercial benefits for CashHere.

During the course of the investigation, CashHere stopped responding to our office and its website was listed for sale.

We discovered, however, an organization operating as MoneyHome under a different website. This company had the same online application as CashHere and was also requesting banking login credentials. Its website listed CashHere's previous physical address, as well as CashHere's payday loan licence certificate. Despite these links, MoneyHome denied any connection to CashHere.

We found this matter to be well-founded, and not resolved.

At the conclusion of this investigation, we shared our findings with MoneyHome, as well as our expectation that it cease collecting online banking login credentials, but received no response; we note that its website, too, is now listed for sale. We subsequently shared our concerns with Ontario's consumer services ministry, which has since added MoneyHome to its Consumer Beware List.

Consumer shocked when computer services help desk remotely accesses new laptop

An individual filed a complaint with our office after buying from a computer services company a new laptop that came with a help desk service. The individual alleged that, during an appointment, a help desk technician used pre-installed remote access software to access his laptop without his consent.

In our view, remote access provided technicians with the ability to view personal information on customer computers, including sensitive information. Because of this, the company should have obtained express opt-in consent for such access.

The computer services company claimed the technician obtained express oral consent to use remote access software. However, the individual contested this and the company was unable to prove it.

The complainant questioned whether the computer services company adequately protected his and other customers' personal information from unauthorized access. The company pointed to its preventive security measures; however, we did not accept that these measures were adequate.

The company said that remote access required that the customer give a unique customer ID to the technician. We observed that once a technician knew the number, which had been provided by the company at the time of purchase, it could be used for access on multiple occasions.

The company also said that technicians could not access the customer’s computer unless it was turned on and connected to the Internet, such that the user would notice the remote access. We noted that individuals sometimes leave their computer unattended and would not necessarily witness remote access.

The company further claimed that the remote access software did not allow technicians to copy or record personal information. We noted that a technician could copy information, for example, by using a cellphone camera.

We also noted that the company could have used another kind of remote access tool that would require the user to actively confirm, through their computer, consent to third-party access. Instead, it relied on obtaining oral consent which it was unable to demonstrate.
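One common pattern for that kind of active confirmation is a single-use, time-limited access code generated for each support session, in contrast to the static customer ID described above that could be reused indefinitely. The sketch below is a minimal illustration under that assumption; it is not the company's actual tooling.

```python
# Minimal, hypothetical sketch: a single-use, time-limited access code
# issued per support session. Redeeming the code is an active step the
# customer takes, and the code cannot be replayed later.
import secrets
import time

_active: dict[str, float] = {}  # code -> expiry timestamp


def issue_access_code(ttl_seconds: int = 600) -> str:
    """Issue a one-time code the customer reads to the technician."""
    code = secrets.token_urlsafe(8)
    _active[code] = time.time() + ttl_seconds
    return code


def redeem(code: str) -> bool:
    """Valid once and only before expiry; redemption consumes the code."""
    expiry = _active.pop(code, None)
    return expiry is not None and time.time() <= expiry


code = issue_access_code()
print(redeem(code))  # True: first use, within the time window
print(redeem(code))  # False: the code cannot be reused
```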

In light of this, we found that the computer services company did not obtain valid consent or implement appropriate safeguards, commensurate to the sensitivity of the information at risk.

During the course of our investigation, the company underwent corporate restructuring, closing its help desk service and ending the use of the remote access software. We therefore found that the matters were well-founded and resolved.

Education software firm addresses security vulnerabilities

An investigation into a parent’s complaint found that a Canadian education technology company lacked a comprehensive information security framework to adequately protect the sensitive personal information of hundreds of thousands of students.

The investigation was launched following a complaint filed by a parent after they discovered security vulnerabilities in a software application adopted by their children’s school board.

The software application, called Edsby, is owned by CoreFour, Inc., a company based in Richmond Hill, Ontario. CoreFour handles the personal information of hundreds of thousands of children from across Canada and other countries. The sensitive personal information entrusted to CoreFour included student grades, absence details, learning disabilities and health information related to, for example, allergies and medication. This information should be protected with heightened security safeguards commensurate with the volume and sensitivity of the information.

The investigation found that CoreFour had implemented some effective security practices, but had not developed a robust, overarching information security framework.

Such a framework could have potentially avoided the security vulnerabilities identified by the complainant and our office’s own testing. In particular, the investigation identified weak password requirements for certain Edsby parental accounts, and inadequate safeguards to protect against unauthorized access to thumbnail images of student profile pictures. The investigation also highlighted the need to scan for malware when uploading content into Edsby from third-party apps.
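For context on the password issue specifically, a stronger policy can be enforced with a few lines of validation at account creation. The sketch below is illustrative only; the length threshold, character classes and deny-list are assumptions, not Edsby's actual requirements.

```python
# Illustrative password policy check; the length threshold, character
# classes and deny-list are assumptions, not Edsby's actual rules.
import re

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}


def acceptable_password(pw: str, min_length: int = 12) -> bool:
    """Require length, character variety, and absence from a common-password list."""
    if len(pw) < min_length or pw.lower() in COMMON_PASSWORDS:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, pw)) for c in classes) >= 3


print(acceptable_password("password"))           # False
print(acceptable_password("Blue-Otter-Grade7"))  # True
```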

CoreFour was co-operative, addressed the identified safeguards vulnerabilities and agreed to implement all of our office’s recommendations.

This was our office’s first investigation into “Ed Tech,” which has become increasingly prevalent in the context of remote learning during the pandemic.

The investigation highlighted the importance, particularly for SMEs, of implementing information security and privacy management frameworks that will keep pace with organizational growth, to adequately protect personal information and meet legislative requirements.

During the investigation, we shared information with the Office of the Information and Privacy Commissioner of Ontario, which was conducting a related investigation under Ontario’s Municipal Freedom of Information and Protection of Privacy Act into a complaint against a school board using the Edsby application to manage student attendance.

Further Reading

Canada’s Anti-Spam Legislation update

The principal provisions of Canada’s Anti-Spam Legislation, or CASL, came into effect in 2014 to protect consumers and businesses from the misuse of digital technology, including spam and other electronic threats. CASL also aims to help businesses stay competitive in a global and digital marketplace. Our office shares responsibility for enforcing CASL with the Canadian Radio-television and Telecommunications Commission and the Competition Bureau.

In November 2020, the 3 enforcement agencies conducted their first tripartite CASL-related compliance initiative, with the issuance of an awareness-raising letter to 36 companies involved in the mobile app industry in Canada.

The letter reminded the companies of their compliance obligations under CASL and those related to the promotion, installation and use of mobile apps under PIPEDA and the Competition Act.

It also encouraged the companies to take action to help prevent activities that raise concerns, such as:

  • apps that collect or use personal information, such as electronic addresses, without consent
  • apps designed to spam users’ friends and contacts
  • apps that make false or misleading claims to promote a product, a service or a business interest
  • apps that do not properly identify their functions (such as allowing information sharing with other computers or automatically downloading other programs on the user’s devices) in order to obtain informed consent from the user prior to installation

These activities put Canadians at risk of fraud, identity theft and financial loss, among other things. Mobile app companies are therefore in a unique position to detect, prevent and stop such practices from harming consumers.

We encouraged the companies to review their practices and take preventative and corrective measures where necessary.

Two complaints concerning unsolicited marketing emails and unsubscribe functions were closed in 2020-21, one through early resolution and the other due to a lack of jurisdiction. Two additional cases potentially related to our office's responsibilities under CASL were ongoing at the time of writing this report.

Our Information Centre received 37 inquiries related to CASL from individuals and businesses in 2020-21. Most calls from individuals were related to the collection and use of their email address without consent or the absence of an unsubscribe option in emails they received. Inquiries from businesses concerned the obtaining of email lists and the use of such email addresses.

We again produced an insert on CASL that was mailed through the Canada Revenue Agency in April and May 2020, with a potential reach of 477,350 businesses in Canada. The insert provided businesses with information on their obligations related to e-marketing.

Other CASL-related efforts involved creating and sharing helpful tips and information related to these issues on social media on a regular basis, with a particular emphasis during Fraud Prevention Month in March 2021.

New technology analysis lab

Evolving technologies constantly create new privacy risks, and it is critical for our office to keep up with new developments. This year, we assessed our technology analysis laboratory to determine whether it was properly equipped with the tools needed to keep pace with the rapid evolution of technology.

Following this assessment, we concluded that our lab infrastructure needed improvement. We performed a complete upgrade of the lab’s IT infrastructure and acquired new state-of-the-art tools that allow us to better analyze technology components.

Additionally, these new tools have allowed us to better analyze the technology behind an innovative medical platform, powered by AI, as part of a research collaboration with the National Research Council.

Our office is also in the process of relocating the lab to a space that better meets our current and future needs – a flexible new lab that will allow us to adapt to ever-evolving advances in technology. This will allow us to deploy a new lab configuration based on a diversified multi-zone design. The new design will expand our capabilities for analyzing malware, hardware components, mobile applications and IoT devices, and for conducting digital forensic analyses. It will also extend our capabilities for analyzing emerging technologies and for engaging in interdepartmental collaboration within the Government of Canada.


Advice and outreach to businesses

Our office’s Business Advisory Directorate engages with businesses to assist them in assessing the privacy implications of their practices and initiatives – and to help them better comply with PIPEDA as they adopt new technologies and innovative business models.

Addressing privacy considerations and issues early helps mitigate privacy risks, offers organizations a measure of regulatory predictability as they innovate and grow, and allows Canadians to benefit from innovation with confidence.

Businesses subject to PIPEDA can voluntarily request an advisory consultation with our office, and we may also proactively offer advisory services to individual organizations or broader groups.

This year, our Technology Analysis Directorate (TAD) also led an initiative to provide privacy advice to a Toronto-based company developing an AI-powered software platform designed to improve care for people with developmental disabilities.

Company seeks advice on upgrading device for people with vision impairment

A Canadian SME (small or medium-sized enterprise) requested our business advisory services, seeking privacy advice for its wearable optic-enhancement device for low-vision individuals. The device was conceptualized and built in Canada and has enhanced the quality of life and daily functioning of its users.

To increase functionality and introduce more features, the company decided to upgrade its device and sought privacy advice on the more digitally enhanced and connected version of its fourth-generation product.

This proactive engagement enabled the organization to access compliance advice on a range of privacy-related areas and to innovate with a measure of regulatory predictability that mitigates future re-engineering costs. As it grows its business in a privacy-compliant manner, the company can offer more advanced versions of its products to Canadian and international consumers and create well-paying jobs in Canada.

Digital media firm seeks advice on video analytics technology product for retailers

An organization providing digital services contacted our Business Advisory Directorate and voluntarily sought PIPEDA compliance advice on its Anonymous Video Analytics (AVA) service. This request came on the heels of our office’s investigative findings in Cadillac Fairview Corporation Limited. The organization distinguished its technology and application from that in the reported case, and sought advice to ensure it remained compliant with PIPEDA. Based on information presented in the consultation, we made recommendations for compliance and privacy improvements in certain areas, such as accountability, purpose specification, consent, limiting collection and use of personal information, as well as safeguards.

The company expressed appreciation for the opportunity to engage with our office proactively and for the valuable advice it received to ensure better privacy compliance and avoid privacy problems.

Innovative Solutions Canada

For the past 2 years, we have been working to establish collaborative technical research partnerships with peers in other federal institutions. These efforts have led to a successful research partnership involving members of TAD and the National Research Council’s Medical Devices Research Centre (MDRC).

Our collaborators at the MDRC invited us to undertake a joint technical analysis of an innovation that had received funding through Innovation, Science and Economic Development Canada’s Innovative Solutions Canada program.

The innovation – Reveal by Awake Labs – is an AI-powered software platform that aims to facilitate improved care for people who have an intellectual or developmental disability, including autism.

Reveal uses a smart watch and mobile phone app to help people with intellectual disabilities manage stress. It provides real-time feedback about stress levels as well as a note-taking function to help track what might have influenced the changes in stress levels.

We performed privacy-focused cybersecurity testing on the Reveal platform with an emphasis on PIPEDA’s 10 fair information principles. Our research included network traffic analysis, vulnerability testing and an interface design assessment, among other tests.
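
To illustrate one element of such testing – network traffic analysis – the following is a minimal, hypothetical sketch of the kind of check a privacy-focused lab might run; it is not the OPC’s actual test suite. It assumes the scapy packet-analysis library and a previously captured file named capture.pcap, and simply flags email addresses travelling over unencrypted HTTP:

  import re
  from scapy.all import rdpcap, TCP, Raw  # scapy: a widely used packet-analysis library

  EMAIL_RE = re.compile(rb"[\w.+-]+@[\w-]+\.[\w.]+")  # rough email-address pattern

  def flag_plaintext_email(pcap_path):
      # Inspect every TCP packet with a payload in the capture.
      for pkt in rdpcap(pcap_path):
          if pkt.haslayer(TCP) and pkt.haslayer(Raw):
              payload = bytes(pkt[Raw].load)
              # Port 80 (HTTP) is unencrypted, unlike 443 (HTTPS), so any
              # personal information found here is exposed in transit.
              if pkt[TCP].dport == 80 and EMAIL_RE.search(payload):
                  print("Possible plaintext personal data:", pkt.summary())

  flag_plaintext_email("capture.pcap")  # hypothetical capture file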

The research introduced members of TAD to a new ecosystem of tools and also helped Awake Labs’ product development process by identifying several privacy and security issues, which the company quickly addressed.

Guidance development

An important area of our work is the development of guidance to help support organizations in meeting their obligations under PIPEDA.

In August 2020, we published guidance for manufacturers of IoT devices that collect personal information – such as doorbells, alarms, TVs, appliances, toys or health trackers.

Such devices, which are quickly becoming commonplace in our lives, involve a complex and often opaque ecosystem in which many components and actors, such as social media platforms, third-party applications and service providers, can potentially collect, use and disclose personal information.

Our guidance provides practical information to help manufacturers ensure their IoT devices and business practices are privacy protective and compliant with PIPEDA.

We also published a companion piece aimed at individuals to help them understand how to protect their privacy while enjoying the benefits of smart devices.

Contributions program

Our Contributions Program funds independent privacy research and related knowledge translation initiatives.

Each fall, our office issues a call for proposals to fund independent privacy research and knowledge translation initiatives. Academic institutions and non-profit organizations, such as industry and trade associations as well as consumer, voluntary and advocacy organizations, are eligible to receive funding. The program’s annual budget is $500,000, and its goal is to support arm’s-length, non-profit research on privacy, further privacy policy development and promote the protection of personal information in Canada’s private sector. Since the program was created in 2004, we have allocated approximately $7 million to some 160 projects.

Our 2020-21 call for proposals expressed a particular interest in funding research projects on AI and its repercussions on privacy rights. Our office was also interested in funding the planning of a new “Pathways to Privacy” symposium.

We received 43 proposals, which we evaluated based on merit, and selected 11 projects to receive funding. We awarded up to $50,000 per project and a maximum of $100,000 per organization. These projects examine a variety of topics, such as AI and machine learning, the implications of AI on the privacy of children and young people, protecting health information in the context of AI, and examining AI from the consumer and ethical perspectives.

Our calls for legislative reform in the public sector include adding a specific mandate for research and public education, like the one that exists in PIPEDA. This would create the possibility of expanding the focus of our Contributions Program beyond private sector privacy issues.

Advice to Parliament

Appearances and submissions

Every year our office engages with Parliament through various channels. Parliamentarians primarily seek our input at committee, to advise on legislation that could impact Canadians’ privacy and to seek our expertise for committee studies on privacy-related issues. Given the COVID-19 pandemic, we participated in fewer committee appearances than in previous years.

However, the pandemic brought forward numerous considerations for the protection of personal information in a digital environment that resulted in engagement with Parliament.

The Commissioner appeared before the Standing Committee on Industry, Science and Technology as part of its ongoing study of the Canadian response to the COVID-19 pandemic. His appearance focused on contact tracing applications.

The Commissioner noted that when properly designed, tracing applications could protect both public health and privacy simultaneously. He emphasized that an appropriate design of technologies such as tracing applications depends on respecting the key privacy principles recommended in our office’s framework for the assessment of privacy-sensitive pandemic initiatives. The Commissioner also stressed the importance of laws that would allow technologies to produce benefits in the public interest without creating risks that fundamental rights such as privacy will be violated.

We also appeared before the Standing Committee on Procedure and House Affairs in the context of the committee’s study on how members can best fulfill their parliamentary duties during the COVID-19 pandemic. Our appearance focused on privacy issues related to web-based video conferencing platforms. We noted that we often see a connection between the privacy concerns and the cybersecurity risks and vulnerabilities of these platforms. We offered several recommendations to protect privacy when web-based video conferencing platforms are used for public meetings. Among our suggestions, we noted that the services’ privacy policies and terms should be reviewed closely, that caution should be exercised when using private messaging features and that meeting hosts should apply the controls that are often available.

In addition to our work to support federal parliamentarians, we also participated in 2 provincial privacy law modernization initiatives.

As noted in an earlier chapter on law reform, the Commissioner appeared before a National Assembly of Quebec committee regarding Bill 64, An Act to modernize legislative provisions as regards the protection of personal information.

The Commissioner emphasized that modern privacy laws should ensure that new technologies are being used in a way that respects the privacy of individuals. The starting point for reform should be to ensure that privacy laws recognize the fundamental nature of that right and implement it in a modern, sustainable way.

The Commissioner noted that a number of elements contained in Bill 64 are consistent with the detailed proposals to reform federal privacy legislation that our office presented last year. For example, the Bill includes provisions that address profiling and protect the right to reputation. It also subjects political parties to the private sector act. The Commissioner has suggested that Bill C-11 could be improved by adopting certain approaches found in Bill 64.

Our office also participated in the public consultation conducted by the Special Committee to Review the Personal Information Protection Act of British Columbia.

Our submission expressed support for recommendations put forth by the Information and Privacy Commissioner for British Columbia and highlighted our own experiences and perspective with respect to some of the key legislative measures he outlined. For example, we strongly agree the pandemic has made the need for robust privacy laws fit for purpose in the digital age even more apparent. Our submission also stressed the importance of mandatory breach reporting and agreed that it is essential that data protection regulators have the authority to impose administrative monetary penalties.

International and domestic cooperation

The global pandemic has illustrated the inter-connectedness of our world like never before. As the COVID-19 virus has circled the globe, so has scientific and technological knowledge that has made it possible, for example, to develop and begin distributing COVID-19 vaccines within the space of a year.

At the same time, we have seen similar privacy issues arising across Canada and in countries around the world as governments looked to stop the spread of the virus and individuals and organizations turned to technology to allow essential activities such as education, work and socialization to continue.

The pandemic has underscored the importance of collaboration with both international and domestic counterparts in order to effectively protect Canadians’ privacy rights.

Over the past year, this collaboration continued to prove crucial to protecting the privacy rights of Canadians in a borderless society. While world economic output and trade were hard hit by the pandemic, individuals and businesses became increasingly reliant on information and communication technologies. At the same time, international commerce continued to rely on cross-border flows of personal information and online global risks to individuals’ privacy persisted.

As our office pivoted to working online, we were able to continue to seamlessly engage with our domestic and international counterparts.

Through our interaction with partners in Canada and around the world, we shared best practices, learned from others, and leveraged their experience and expertise to inform and support our domestic activities. By adopting and communicating joint positions on issues that have a significant impact on privacy, we can speak with a common voice and have a greater effect on businesses and governments.

A number of our collaboration initiatives over the last year involved issues related to the pandemic, but we also continued to work with partners on other issues.

Domestic cooperation

Federal, provincial and territorial collaboration

Although the 2020 annual meeting of Canada’s federal, provincial and territorial information and privacy commissioners was cancelled due to the pandemic, this group continued to work together throughout the year in a variety of ways.

In May 2020, Canada’s federal, provincial and territorial (FPT) privacy commissioners issued a joint statement, Supporting public health, building public trust: Privacy principles for contact tracing and similar apps, calling on governments to ensure that COVID-19 contact-tracing applications observe key privacy principles.

We emphasized the need for a flexible application of privacy laws. We highlighted that with proper care and attention to privacy principles and appropriate design, respect for citizens’ rights and public trust can be maintained while, in tandem, advancing important public health goals.

The group statement served as a reminder that while the pandemic has accelerated the broader trend of government digitization, the choices made now will shape the future of our country. The commissioners therefore set out key principles with respect to contact tracing applications – including consent and trust, legal authority, necessity and proportionality, purpose limitation, transparency and accountability – and encouraged governments to consider them. They also highlighted the importance of technical protections, such as de-identification, aggregate data and security safeguards, to build public trust in these initiatives aimed at supporting public health.

In May 2021, FPT commissioners gathered virtually and endorsed a joint statement with our counterparts on privacy issues related to COVID-19 vaccine passports, followed in June 2021 by a joint resolution on access and privacy during and after pandemic recovery.

In the statement, signatories called on governments and businesses to ensure that privacy is front and centre as COVID-19 vaccine passports are considered as a tool to help Canadians return to normal life.

The statement emphasized that vaccine passports must be developed and implemented in compliance with applicable privacy laws, and that they should also incorporate privacy best practices in order to achieve the highest level of privacy protection commensurate with the sensitivity of the personal health information that will be collected, used or disclosed.

The group also outlined fundamental privacy principles that should be adhered to in the development of vaccine passports. In particular, the necessity, effectiveness and proportionality of vaccine passports must be established for each specific context in which they will be used.

They also noted that any personal health information collected through vaccine passports should be destroyed and vaccine passports decommissioned when the pandemic is declared over by public health officials or when vaccine passports are determined not to be a necessary, effective or proportionate response to address public health purposes.

The group also took the occasion to call on their respective governments to show leadership and apply fundamental principles in the implementation and the necessary modernization of governance regimes around freedom of information and protection of privacy.

In the joint resolution resulting from the June 2021 meeting, they highlighted that access to government information and respect for privacy are essential for governments to be held accountable for their actions and decisions, increasing confidence in decision-making. They noted that this can help maintain the public’s well-being and trust in times of widespread crisis.

The resolution pointed to the need to incorporate privacy by design principles to ensure emergency, recovery and resumption initiatives, supported by innovation, technology and digitization, demonstrate transparency and respect for individuals’ rights.

Enforcement collaboration with provincial and territorial counterparts

The Domestic Enforcement Collaboration Forum promotes and facilitates federal-provincial cooperation. The forum brings together representatives from our counterpart offices in Alberta, British Columbia and Quebec and our office, which also acts as chair. Through this forum, we identify new complaints or data breaches of potential interest for collaboration, and provide updates and strategic advice on ongoing joint investigations. We also discuss investigation techniques, and share lessons learned and best practices. The forum has also created opportunities to discuss enforcement strategies in the public sector, where we share similar mandates but oversee distinct groups of government agencies.

In 2020-21, we collaborated with our provincial and territorial colleagues on an unprecedented number of joint investigations. We have continued to expand our collaboration with our provincial partners, completing 3 joint or coordinated investigations, launching a new one and regularly sharing information with provincial counterparts.

In October 2020, the federal, British Columbia and Alberta offices released findings into Cadillac Fairview’s use of cameras embedded inside its digital information kiosks, which collected 5 million shoppers’ images and used facial recognition technology to guess their age and gender.

In December 2020, along with our counterparts at the Commission d’accès à l’information du Québec (CAI), we published the findings of our separate but coordinated investigations into a privacy breach at Desjardins. This represented the first time the OPC and the CAI collaborated on an investigation.

In February 2021, we announced the findings of a joint investigation into Clearview AI’s facial recognition app, with provincial commissioners in Alberta, British Columbia and Quebec, notably the first investigation pursued jointly with all 3 provinces with private-sector privacy legislation.

Along with these 3 offices, in June 2020 we launched a joint investigation into a Tim Hortons mobile application after media reports raised concerns about how the app may be collecting and using data about people’s movements as they go about their daily activities. At the time of writing, this investigation was ongoing.

As noted earlier, in the spirit of collaboration with our provincial and territorial colleagues, we also shared information with our counterparts from the Office of the Information and Privacy Commissioner of Ontario as part of our investigation into an educational technology platform.

Cross-regulatory cooperation

In a number of recent investigations, we have identified intersections between issues raised with our office and the mandates of other domestic regulatory authorities such as provincial consumer protection authorities/agencies, the Office of the Superintendent of Financial Institutions and the Canadian Human Rights Commission.

However, our ability to engage and share information with those institutions is limited such that we have not been able to fully explore those intersections or collaborate on the investigation of matters that span our regulatory spheres.

Our work through the Global Privacy Assembly, described later in this report, highlights global efforts and changes we are calling for in order to advance cross-regulatory enforcement cooperation in practice.

Civil society roundtable

For over a decade, our office has met each year with a representative group of Canadian privacy rights, consumer association and advocacy organizations. These meetings include representatives from the Canadian Civil Liberties Association, International Civil Liberties Monitoring Group, British Columbia Civil Liberties Association, British Columbia Freedom of Information and Privacy Association, Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC), Ligue des droits et libertés, Public Interest Advocacy Centre, Right to Know, Open Media, Consumer Interest Alliance and the Consumers’ Association of Canada.

The OPC organizes this forum to give representatives of civil society groups a venue to exchange ideas and discuss issues of common concern. We share updates on our work, consult the group on various issues, and receive important insights into emerging issues from across Canada, which help inform our thinking.

Topics of discussion at our 2020-21 meetings in April and June 2020 included privacy and pandemic responses, contact-tracing applications, employee health testing, facial recognition technologies, police usage of body-worn cameras, and privacy law reform.

International cooperation

Over the past year, collaboration with our international partners continued to prove crucial to protecting the privacy rights of Canadians in a borderless world. Through continued interaction with our international counterparts, we shared best practices, learned from others and leveraged their experience and expertise to inform and support our domestic activities. By joining compliance forces and adopting and communicating joint positions on issues that have a significant impact on privacy, together we were able to expand our collective enforcement capacity and increase our global impact.

Most international privacy forums continued to meet virtually during the pandemic, and our office has actively participated in those online discussions.

Global Privacy Assembly

The Global Privacy Assembly (GPA) first met in 1979 as the International Conference of Data Protection and Privacy Commissioners. The assembly works to provide international leadership in data protection and privacy by connecting more than 130 data protection and privacy authorities from around the world.

The GPA plays a central role in fostering international collaboration and the sharing of best practices within the global privacy community. To this end, our office is actively involved in the work of the GPA through various channels, including our participation on working groups and our involvement in drafting and sponsoring resolutions. We consider the work of the GPA a key priority and will continue to support its efforts going forward.

2020 GPA closed session overview

The 42nd meeting of the GPA took place from October 13-15, 2020 and was held entirely online as members and observers joined virtually to discuss key data protection and privacy issues. Its theme focused on the increased importance of privacy and data protection during the COVID-19 pandemic. Over the course of the 3-day conference, members discussed issues related to children’s privacy, facial recognition and the future of the GPA conference.

The GPA’s COVID-19 taskforce provided members with an overview of its activities, presented its compendium of best practices, which was adopted by GPA members, and proposed a resolution (which passed) to continue its work in 2021.

At the conference, GPA members adopted 4 other resolutions, including ones on the use of facial recognition technology, the ethical development of AI, and the role of data protection in international development aid. Our office supported and co-sponsored all the resolutions.

COVID-19 Working Group

In April 2020, we joined the GPA’s COVID-19 Taskforce, led by the National Privacy Commission of the Philippines. This international group, made up of more than 40 data protection offices and NGO observers from around the world, has facilitated vital information exchange and jurisdictional updates from affected countries, set up discussion seminars on emerging issues tied to privacy and the pandemic, and conducted other research.

We also shared with the group our framework for protecting privacy in the context of initiatives to respond to COVID-19. The GPA COVID-19 Working Group soon after assembled a public repository for all similar guidance work, statements and assessments produced by its membership, which is now available for all regulators or policy makers worldwide to consult.

In March 2021, the working group developed and provided the GPA Executive Committee with a joint statement on the use of health data for domestic or international travel purposes, addressing the widely debated question of vaccine passports.

Our office contributed recommendations to the UK Information Commissioner’s Office (ICO), the primary author of that statement. The GPA statement was circulated to a wide range of international bodies working on global travel checks and vaccination certification, such as the World Health Organization (WHO), the International Air Transport Association (IATA), the Organisation for Economic Co-operation and Development (OECD) and the Council of Europe. These are some of the key policy-makers and stakeholders informing the regulation of travel and movement.

International Privacy and Human Rights Working Group

In 2021, we entered our second year as chair of the GPA’s Policy Strategy Working Group 3. Established as part of the GPA’s 2019-2020 policy strategy, this group brought together over 40 individuals representing more than 20 data protection authorities from every region of the world. The Working Group has developed a narrative report that explores and highlights the relationship between privacy, data protection and other fundamental rights, as well as democratic and political rights.

The purpose of this narrative is to support GPA members in their call for the recognition of privacy and data protection as fundamental human rights, something that Commissioner Therrien and previous Canadian commissioners have emphasized is necessary in updated privacy laws. The narrative was presented at the annual GPA conference in October 2021.

Digital Citizen and Consumer Working Group

Co-chaired by our office and the Office of the Australian Information Commissioner, the Digital Citizen and Consumer Working Group’s 2019-2021 mandate focuses primarily on exploring and understanding the complex intersection between privacy and antitrust. The group is conducting an in-depth assessment of the substantive complements and tensions between these 2 regulatory spheres, which has included commissioning an independent academic study and interviewing competition authorities from around the globe. It is also working to track and facilitate cross-regulatory cooperation, while raising awareness of the issue through engagement with other privacy and competition enforcement networks. This is a topic of high interest in both the privacy and competition regulatory spaces. Our office represented the Working Group and its work at numerous domestic and international events examining this intersection.

International Enforcement Working Group

Our office served as co-chair of the GPA’s International Enforcement Working Group along with the UK ICO and the US Federal Trade Commission.

This group’s mandate is to foster proactive and practical enforcement cooperation on current critical issues of mutual interest to the international privacy enforcement community.

During the last year, the working group facilitated several virtual meetings amongst privacy authorities to discuss and share perspectives on issues such as COVID-19 tracing apps, facial recognition technology and credential stuffing (an increasingly common cyberattack in which credentials stolen from one service are tried en masse against other services). These sessions led to the pursuit of collaborative compliance activities, such as the global video teleconferencing initiative discussed below.

The group also worked to enhance existing practical tools, like an online repository to facilitate sharing of enforcement knowledge and experience, and the Enforcement Cooperation Handbook, which provides guidance to data protection authorities wishing to cooperate on enforcement matters.

The co-chairs adopted an engagement strategy to encourage new members to join the working group, to ensure regional, cultural and linguistic diversity. At the time of writing, the working group consisted of 26 members, including new members from Africa, the Middle East and Latin America.

Global video teleconferencing initiative

Further to our joint statement on global privacy expectations of video teleconferencing companies, published in July 2020 and discussed in our last annual report, our office and 5 other members of the International Enforcement Working Group proactively engaged Microsoft, Cisco, Google and Zoom.

The companies each replied to the open letter, highlighting various privacy and security best practices, measures and tools implemented or built into their video teleconferencing services.

We further engaged, fittingly through virtual meetings, with each of these companies to discuss their video teleconferencing platforms and better understand their privacy practices.

Other GPA working groups

Our office also regularly participates in several other GPA working groups.

In 2020-21, as part of the Ethics and Data Protection in Artificial Intelligence Working Group, we contributed to the development of a resolution on accountability and AI.

A new Facial Recognition Technology Working Group initiated a project aiming to develop and promote a set of principles and expectations that developers and users of facial recognition technology should abide by to help ensure that personal information is used in a manner that respects privacy.

We continued to be an active member of the Digital Education Working Group. The group drafted a submission to a public consultation by the United Nations Committee on the Rights of the Child on how the UN Convention on the Rights of the Child should be implemented in the digital environment. The submission noted that children are especially vulnerable in the online environment, particularly in situations where profiling, automated decision-making and behavioural advertising come into play.

As well, we participated in GPA’s Future of the Conference Working Group, which aims to lay foundations for the Assembly itself, through the establishment of plans for a funded and stable GPA Secretariat. The group examined funding models from other international networks and explored the establishment of the GPA Secretariat as a separate legal entity.

Global Privacy Enforcement Network

The Global Privacy Enforcement Network (GPEN) is a platform for privacy enforcement authorities to share knowledge, experience and best practices on the practical aspects of privacy enforcement and cooperation. It also coordinates joint enforcement initiatives to foster greater compliance with global privacy laws.

Our office is a member of the GPEN executive committee, and hosts the GPEN website, the primary vehicle for information sharing among network members.

In March 2021, GPEN worked with the International Consumer Protection Enforcement Network (ICPEN) to hold the first-ever joint best practices workshop. The virtual event brought together enforcement practitioners from both regulatory spheres from around the world. Participants discussed the extensive intersection between privacy and consumer protection, and explored strategies to advance cross-regulatory enforcement cooperation in practice, including the need to press for legislative change to allow for greater cross-regulatory cooperation; develop relationships with potential cross-regulatory partners; and seek out opportunities to engage in further cross-regulatory cooperation (including between the 2 networks).

We are pleased to note that the first-ever collaborative compliance activity between ICPEN and GPEN, whereby GPEN endorsed a 2019 letter from 27 ICPEN agencies to both Apple and Google, has now resulted in those 2 companies implementing changes to enhance the information provided by apps to users about their collection and use of data.

GPEN members continue to engage through various initiatives, including monthly teleconferences and other events. For example, in response to the COVID-19 pandemic, GPEN carried out a virtual roundtable discussion and subsequent survey under the theme “resetting privacy.” The initiative was aimed at gauging the impact of the COVID-19 pandemic on the operations of data protection authorities and sharing approaches for enforcement during, and coming out of, the pandemic.

Asia-Pacific Privacy Authorities Forum

Our participation as a member of the Asia-Pacific Privacy Authorities (APPA) forum focuses on building international capacity for collaboration in the Asia-Pacific region through the sharing of relevant expertise and investigative findings with members.

While respecting the limitations of our confidential information-sharing agreements regarding public sector investigations, we present on findings from both our private and public sector investigations, and gain lessons learned from other authorities’ investigations.

Our office has also shared research and guidance developed in response to the COVID-19 pandemic with our Asia-Pacific colleagues. At the September 2020 APPA COVID-19 webinar, we presented on our techniques to leverage virtual tools and continue our collaboration work through the International Enforcement Working Group during the pandemic.

OECD Working Party on Data Governance and Privacy

Our office attends meetings of the Organisation for Economic Co-operation and Development (OECD) Working Party on Data Governance and Privacy as a member of the Canadian delegation. In 2020-21, we also participated in the Working Party’s review of the Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data (OECD Privacy Guidelines) as a member of the ad hoc group of 60 experts.

This expert group helped guide the Working Party’s review of the guidelines by, for instance, contributing to the identification of present-day privacy challenges. The Working Party held 2 virtual meetings in 2020, during which it discussed the review as well as issues concerning the nature and frequency of government requests for access to personal data held by private sector organizations.

Our office – upon invitation from the Government of Canada – also attended OECD virtual roundtables on data localization and information sessions related to issues the OECD is exploring, such as data portability and regulatory sandboxes.

International Organization for Standardization

Our office often uses well-established standards from the International Organization for Standardization (ISO), among others, as frameworks when investigating breaches that involve a security or privacy failing, and encourages organizations to look to these standards as examples of best practice.

We have also been active in ISO’s efforts to develop standards. Our participation has primarily been in the identity management and privacy technologies working group (SC 27/WG 5), which focuses on the development of standards and guidelines addressing security aspects of identity management, biometrics and privacy.

Our interest in privacy enhancement technologies led experts from our Technology Analysis Directorate, with the support of other countries, to propose developing a de-identification framework standard, based on best practices, for protecting personal information in shared and released datasets.

This standard would provide guidance on assessing and mitigating risks of re-identification throughout the de-identification lifecycle. In 2020-21, as co-editor, our office’s technical experts continued to participate actively in the standard’s development, collaborating with multiple data protection authorities and international privacy experts. The hope is that this initiative will eventually lead to the publication of an effective and universally accepted international privacy-enhancing standard and potential code of practice.
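
While the standard itself is still under development, the kind of risk assessment it concerns can be illustrated with k-anonymity, a widely used re-identification risk metric. The following minimal Python sketch (the records and quasi-identifiers are hypothetical, not drawn from the draft standard) computes the smallest equivalence class in a dataset:

  from collections import Counter

  # Each record lists quasi-identifiers: attributes that, in combination,
  # could re-identify someone (postal prefix, birth year, sex).
  records = [
      ("K1A", "1985", "F"),
      ("K1A", "1985", "F"),
      ("K1A", "1990", "M"),
      ("G2B", "1972", "F"),
  ]

  def smallest_class(rows):
      # k in k-anonymity: the size of the smallest group of records that
      # share identical quasi-identifier values.
      return min(Counter(rows).values())

  print(smallest_class(records))  # 1 here: at least one record is unique,
                                  # so the dataset needs further
                                  # generalization or suppression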

Before the courts

Cases in which our office was a participant

Google Reference, T-1779-18

This is an application by the Privacy Commissioner of Canada pursuant to section 18.3 of the Federal Courts Act referring 2 questions for hearing and determination. These questions are as follows:

  • Does Google LLC, in the operation of its search engine service, collect, use or disclose personal information in the course of commercial activities within the meaning of paragraph 4(1)(a) of PIPEDA when it indexes web pages and presents search results in response to searches of an individual’s name?
  • Is the operation of Google’s search engine service excluded from the application of Part I of PIPEDA by virtue of paragraph 4(2)(c) of PIPEDA because it involves the collection, use or disclosure of personal information for journalistic, artistic or literary purposes and for no other purpose?

The questions arose in the context of a complaint from an individual alleging that Google is contravening PIPEDA by continuing to prominently display links to online news articles concerning him in search results when his name is searched using Google’s search engine service. The complainant requested that Google remove the articles in question from results for searches of his name.

In its initial response to the complaint, Google took the position, in part, that PIPEDA does not apply to it in the circumstances. In order to resolve, as a first step, this jurisdictional issue, the Privacy Commissioner referred the above 2 questions regarding whether PIPEDA applies to Google’s operation of its search engine to the Federal Court for determination before continuing with the investigation.

After a number of interlocutory motions were decided by the Court, the Federal Court heard the Reference questions on January 26 and 27, 2021. The OPC, Google LLC, the Attorney General of Canada, and the complainant participated as parties in the hearing; the CBC and CIPPIC participated as interveners.

On July 8, 2021, the Federal Court issued its decision on the merits of the Reference questions. The Court agreed with the OPC’s position that PIPEDA applies to Google’s search engine service.

The Court’s answer to the first question was “Yes” – Google is collecting, using, and disclosing personal information in the course of commercial activities when operating its search engine service.

The Court found that Google collects personal information when it uses its automated “crawlers” to continuously access publicly available webpages, copy their contents, including personal information, and transmit them to its servers for indexing. Further, Google also uses personal information: it needs as much information as possible to make its search comprehensive and valuable for users, and consequently for advertisers. Lastly, Google also discloses this information; it has control over its “snippets” (automatically generated textual excerpts from the websites it indexes) and the order in which the information appears in Google search results.
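
As a purely illustrative aside, the collect-copy-index pattern the Court describes is the basic mechanic of any web crawler. A minimal sketch using only the Python standard library (the starting URL is hypothetical, and real crawlers are vastly more sophisticated):

  import urllib.request
  from collections import defaultdict
  from html.parser import HTMLParser

  class TextExtractor(HTMLParser):
      # Accumulates the visible text of a page as the parser encounters it.
      def __init__(self):
          super().__init__()
          self.words = []

      def handle_data(self, data):
          self.words.extend(data.split())

  def crawl(url, index):
      # Collect (fetch), copy (parse) and index (store) a page's content.
      html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
      parser = TextExtractor()
      parser.feed(html)
      for word in parser.words:
          index[word.lower()].add(url)  # names on the page become searchable

  index = defaultdict(set)
  crawl("https://example.com", index)  # hypothetical starting page
  print(sorted(index)[:5])  # a few of the indexed terms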

The Court dismissed the arguments that Google’s search engine service was not commercial because it is free to use and because there was no evidence that advertisements appear alongside the search results generated by a search of the complainant’s name.

The Court held that every component of Google’s search engine service is a commercial activity as contemplated by PIPEDA. The Court noted that Google has a flagrant commercial interest in connecting users and web content providers. In exchange for the information displayed in search results, users provide a variety of personal information (including their location, interests, and consumption patterns). Google then uses this information for profit, as the bulk of Google’s revenue comes from advertising, and most of Google’s advertising revenue is in turn generated from Google’s search engine and other online services.

The Court’s answer to the second question was “No” – Google’s search engine service is not exempt from the application of PIPEDA by virtue of the journalistic exemption found in paragraph 4(2)(c) of the Act, because it does not operate for a journalistic purpose, and certainly not for an exclusively journalistic purpose.

The Court noted that an ordinary understanding of the word journalism encompasses content creation and content control, as shown by the test for determining whether an activity qualifies as journalism adopted by the Federal Court in A.T. v Globe24h.com. According to that test, an activity will qualify as journalism where (1) its purpose is to inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a “self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation.”

Applying the test to Google, the Court found that, first, Google makes information universally accessible, which is much broader than informing a community about issues the community values; second, Google does not create or produce anything, but only displays search results; and third, there is no effort on the part of Google to determine the fairness or the accuracy of the search results. Publishers are accountable for the accuracy of content found in search results, not Google.

Although the Court accepted that Google’s search engine helps facilitate access to information, it found that it does not contain any other defining feature of journalism. The primary purpose of Google’s search engine service is to index and present search results. Even though Google returns some journalism in its search results, its search results clearly extend beyond journalism.

At the hearing, Google sought to add a third question for the Court’s determination, which was whether the Court should decline to answer the Reference questions or dismiss the Reference because the questions could not or should not be answered without addressing the issue of whether a potential requirement to remove links from its search results would violate section 2(b) of the Canadian Charter of Rights and Freedoms, and/or because there was an inadequate evidentiary record before the Court.

The Court declined to do so, finding that Google’s proposed issue contained a contradiction, because courts should in fact refrain from addressing constitutional issues without an adequate evidentiary record. These questions are left to the OPC, which will have the benefit of a complete evidentiary record and will be better placed to assess whether PIPEDA can be applied in the way the complainant wants it to be applied, without violating Charter values. The issue of a right to de-indexing inaccurate or out-of-date information therefore remains to be examined once the OPC resumes its investigation.

Google confirmed at the end of September that it is appealing the Court’s decision.

Privacy Commissioner of Canada v Facebook, Inc. (T-190-20) (Federal Court) (Facebook 1), Facebook, Inc. v Privacy Commissioner of Canada (T-473-20) (Federal Court) (Facebook 2)

The OPC’s Facebook litigation continued this year.

Facebook 1 is a Federal Court application brought by the Privacy Commissioner of Canada in February 2020, under paragraph 15(a) of PIPEDA, following an investigation and issuance of a report of findings regarding a complaint concerning the personal information-handling practices of the respondent, Facebook Inc.

The 2019 joint investigation by the Privacy Commissioner of Canada and the Office of the Information and Privacy Commissioner for British Columbia found major shortcomings in the social media giant’s privacy practices. Facebook disputed the findings and refused to implement recommendations to address the deficiencies identified.

The Privacy Commissioner of Canada then filed a Notice of Application in the Federal Court seeking a declaration that Facebook had contravened PIPEDA, and various other remedies. Among other powers, the Federal Court can impose binding orders requiring an organization to correct or change its practices and comply with the law.

The OPC’s 2018-2019 Annual Report provides details of the investigation and the April 2019 Report of Findings.

The OPC’s 2019-2020 Annual Report summarizes the Office of the Privacy Commissioner’s Notice of Application and the relief that it is seeking.

In March 2020, our office served Facebook with our affidavit evidence in support of the application. In response, Facebook brought a motion to strike portions of our affidavit.

In April 2020, Facebook also brought an application for judicial review under s. 18.1 of the Federal Courts Act of our Report of Findings (Facebook 2). In this matter, Facebook is seeking judicial review of our decision to investigate and continue to investigate, and the investigation process, and seeks to quash the resulting report of findings.

In response, our office brought a motion to strike Facebook’s application for judicial review on the basis that Facebook is out of time to bring such a challenge and has an adequate alternative remedy in its legal right to respond to our office’s ongoing application under section 15 of PIPEDA (Facebook 1).

The 2 motions were heard on January 19 and 21, 2021. On June 15, 2021, the Federal Court released its decision on these 2 motions. With respect to Facebook’s motion to strike large portions of the OPC’s affidavit, the Court was largely unpersuaded that the OPC’s affidavit evidence was inadmissible and held that only a limited number of paragraphs and exhibits to the affidavit should be struck. The Court also dismissed our motion to strike Facebook’s application for judicial review, finding that there was at least a debatable issue as to whether the PIPEDA application provides Facebook with an adequate alternative remedy, such that Facebook’s arguments were not so bereft of any chance of success as to justify striking out its application at this stage.

As a next step, the parties are to agree on a schedule for advancing both the section 15 application and the judicial review application.

Cain v Canada (Minister of Health) (T-645-20 and T-641-20) and Hayes v Canada (Minister of Health) (T-637-20)

Provided certain conditions are met and upon registering with Health Canada, medical users may grow their own cannabis, or designate someone to grow it for them. Health Canada received requests for information about these registrations under the Access to Information Act, including requests for information such as the first 3 digits of the postal codes of registered personal producers. Health Canada’s position is that it should release only the first digit of a postal code, because releasing more, when used in conjunction with other available pieces of information, would unacceptably increase the risk of disclosing information about identifiable individuals.

If information is about an identifiable individual it is considered personal information and cannot be released unless certain exceptions apply. Information is personal information where there is a serious possibility that an individual could be identified through the use of that information, alone or in combination with other available information (Gordon v Minister of Health, 2008 FC 258 at para 34).

The Information Commissioner of Canada did not agree with Health Canada’s position and has brought applications on behalf of the complainants in Federal Court requesting the release of the first 3 digits of the postal codes of registered personal and designated producers where there is no serious possibility of identification.

Our office is intervening in this case to recommend a framework for operationalizing the test for determining whether there is a serious possibility that an individual could be identified.
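
One common building block for operationalizing such a test is a minimum cell size rule: a value is released only if enough individuals share it. The sketch below is purely illustrative; the counts and the threshold of 5 are hypothetical and do not represent the framework our office recommended:

  # Hypothetical counts of registered producers per forward sortation area
  # (FSA, the first 3 characters of a postal code).
  registrants_per_fsa = {"K1A": 240, "X0A": 3, "T5K": 57}

  CELL_SIZE_THRESHOLD = 5  # minimum number of individuals per released value

  def releasable(counts, threshold=CELL_SIZE_THRESHOLD):
      # An FSA is released only when enough registrants share it that no
      # individual is singled out; small cells are suppressed.
      return {fsa for fsa, n in counts.items() if n >= threshold}

  print(releasable(registrants_per_fsa))  # X0A is suppressed; K1A and T5K
                                          # can be released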

Capital One Bank (Canada Branch)

This proceeding involves an application against Capital One Bank (Canada Branch) (“Capital One”) following the release of our office’s Report of Findings concerning a complaint against the company. The Applicant is seeking relief primarily in the form of damages against Capital One.

Although our office is not named as a party to this proceeding, the Applicant is seeking our office’s investigation file in support of her application pursuant to Rule 317 of the Federal Courts Rules. Rule 317 permits a party to request “material relevant to an application that is in the possession of a tribunal whose order is the subject of the application […].” Our office objected to the Rule 317 request on the basis that the Applicant’s application is a request for a hearing pursuant to section 14 of PIPEDA for damages against Capital One for alleged contraventions of PIPEDA, not a judicial review of the Privacy Commissioner’s report of findings or investigation and Rule 317 is therefore not available. In response, the Applicant filed a motion seeking to compel our office to produce the investigation file.

In August 2020, a Prothonotary of the Federal Court issued an order dismissing the Applicant’s motion to compel production of our office’s investigation file on the basis that the thrust of the application is a request for a de novo hearing pursuant to section 14 of PIPEDA and therefore outside the scope of a Rule 317 request. The Applicant subsequently appealed the Prothonotary’s order. On May 21, 2021, the Federal Court dismissed the Applicant’s motion to appeal the Prothonotary’s order, finding that he did not commit a reviewable error in dismissing the motion to compel production under Rule 317. The Applicant has since filed an appeal of the Federal Court’s order to the Federal Court of Appeal. This latest appeal remains pending at the time of writing.

Cases our office followed with interest

In addition to the cases in which our office played an active part as a party or intervener, there were a number of other cases touching on privacy issues before the courts, which we followed with interest. Among others that attracted our attention were 2 decisions issued by the Federal Court of Appeal.

Constitutionality of Canada’s Anti-Spam Legislation

In 3510395 Canada v Canada (Attorney General) (2020 FCA 103), the Federal Court of Appeal dismissed a challenge to the constitutionality of Canada’s Anti-Spam Legislation (CASL) and held that the legislation – specifically, the rules governing the distribution of commercial electronic messages – was a valid exercise of Parliament’s power over general trade and commerce. The Court noted that e-commerce permeates Canada’s economy, so the legislation is concerned with trade as a whole, which provided a valid basis for its enactment.

The Supreme Court of Canada subsequently declined to hear an appeal of this case.

This case was of particular interest to our office given our role under CASL.

Decision clarifies rules around Section 41 applications

Canada (Minister of Public Safety and Emergency Preparedness) v Gregory (2021 FCA 33) involved an application under section 41 of the Privacy Act. The respondent complained to our office after the RCMP failed to provide him with access to a video he requested within the 30-day time limit prescribed by the Privacy Act. Our office investigated and ultimately issued a well-founded report of findings.

After receiving our report, the respondent commenced an application under section 41 of the Privacy Act.

However, shortly after the respondent filed his application, the RCMP provided a response to his initial request for access to the video, refusing to disclose it on the basis of an exemption found in section 22 of the Privacy Act.

The Federal Court of Appeal held there must be a report from the Privacy Commissioner specifically on the validity of the RCMP’s claimed exemption in order for the Federal Court to have jurisdiction over the matter.

Therefore, in a section 41 application, the Federal Court cannot rule upon the application of any exemption to access claimed under the Privacy Act before our office has investigated and reported on the claimed exemption.

R v Canfield (2020 ABCA 383) (Border searches of electronic devices)

This case concerns 2 individuals who were each convicted of possession of child pornography. The evidence against them included photos and videos retrieved by Canada Border Services Agency (CBSA) agents when their personal electronic devices (a cellphone and laptop computer, respectively) were searched at the Edmonton International Airport.

Both individuals were referred for secondary inspection upon re-entering Canada and their electronic devices were searched pursuant to section 99(1)(a) of the Customs Act, which gives CBSA officers routine authority to examine any “goods” that have been imported into Canada. The trial judge concluded that section 99(1)(a) of the Customs Act was valid and that the evidence obtained from the devices was admissible and had not been obtained in breach of the Canadian Charter of Rights and Freedoms.

On appeal, the Alberta Court of Appeal concluded that section 99(1)(a) of the Customs Act was unconstitutional to the extent that it imposes no limits on the searches of personal electronic devices at the border and was not saved by section 1 of the Charter. The Court declared that the definition of “goods” in section 2 of the Customs Act is of no force or effect insofar as the definition includes the content of personal electronic devices for the purpose of section 99(1)(a). In its decision, the Court recognized that the search of a computer or cellphone has the potential to be a significant intrusion on personal privacy. Therefore, in order for such a search to be reasonable, it must be subject to a threshold requirement.

Although the Court accepted that the national security interests of Canada in policing its borders and enforcing its customs and other laws at the border are important objectives, it found that it was not at all clear that these objectives could not be met if additional safeguards were put in place to protect individuals from unnecessarily intrusive searches of their personal electronic devices. The Court found that the appropriate threshold, whether it be reasonable suspicion or something less than that having regard to the unique nature of the border, will have to be determined by Parliament.

The Court suspended the declaration of invalidity for one year to give Parliament the opportunity to amend the legislation. The Supreme Court of Canada declined to hear an appeal of this case, which means that Parliament has until October 2021 (one year from the Court of Appeal’s decision) to amend the legislation accordingly.

In 2019, our office reported on complaints we received from individuals whose personal electronic devices were searched at the border. In our report, we recommended that the Customs Act be updated to recognize that personal electronic devices contain sensitive personal information and are thus not mere “goods” within the meaning of the Act. We also recommended that the Act be updated to include a clear legal framework for the examination of digital devices at the border and that the threshold for examinations of digital devices at the border be elevated to “reasonable grounds to suspect” a contravention of the Act.

 


Appendices

Appendix 1: Definitions

Complaint Types

Access
The institution/organization is alleged to have denied one or more individuals access to their personal information as requested through a formal access request.
Accountability
Under PIPEDA, an organization has failed to exercise responsibility for personal information in its possession or custody, or has failed to identify an individual responsible for overseeing its compliance with the Act.
Accuracy
The institution/organization is alleged to have failed to take all reasonable steps to ensure that personal information that is used is accurate, up-to-date and complete.
Challenging compliance
Under PIPEDA, an organization has failed to put procedures or policies in place that allow an individual to challenge its compliance with the Act, or has failed to follow its own procedures and policies.
Collection
The institution/organization is alleged to have collected personal information that is not necessary, or has collected it by unfair or unlawful means.
Consent
Under PIPEDA, an organization has collected, used or disclosed personal information without valid consent, or has made the provision of a good or service conditional on individuals consenting to an unreasonable collection, use or disclosure.
Correction/notation (access)
The institution/organization is alleged to have failed to correct personal information or has not placed a notation on the file in the instances where it disagrees with the requested correction.
Correction/notation (time limit)
Under the Privacy Act, the institution is alleged to have failed to correct personal information or has not placed a notation on the file within 30 days of receipt of a request for correction.
Extension notice
Under the Privacy Act, the institution is alleged to have not provided an appropriate rationale for an extension of the time limit, to have applied for the extension after the initial 30 days had elapsed, or to have applied a due date more than 60 days from the date of receipt.
Fee
The institution/organization is alleged to have inappropriately requested fees in an access to personal information request.
Identifying purposes
Under PIPEDA, an organization has failed to identify the purposes for which personal information is collected at or before the time the information is collected.
Index
Info Source (a federal government directory that describes each institution and the information banks – groups of files on the same subject – held by that particular institution) is alleged to not adequately describe the personal information holdings of an institution.
Language
In a request under the Privacy Act, personal information is alleged to have not been provided in the official language of choice.
Openness
Under PIPEDA, an organization has failed to make readily available to individuals specific information about its policies and practices relating to the management of personal information.
Retention and disposal
The institution/organization is alleged to have failed to keep personal information in accordance with the applicable retention period, having either destroyed it too soon or kept it too long.
Safeguards
Under PIPEDA, an organization has failed to protect personal information with appropriate security safeguards.
Time limits
Under the Privacy Act, the institution is alleged to have not responded within the statutory limits.
Use and disclosure
The institution/organization is alleged to have used or disclosed personal information without the consent of the individual or outside permissible uses and disclosures allowed in legislation.

Dispositions

Well-founded
The institution or organization contravened a provision of the Privacy Act or PIPEDA.
Well-founded and resolved
The institution or organization contravened a provision of the Privacy Act or PIPEDA but has since taken corrective measures to resolve the issue to the satisfaction of the OPC.
Well-founded and conditionally resolved
The institution or organization contravened a provision of the Privacy Act or PIPEDA. The institution or organization committed to implementing satisfactory corrective actions as agreed to by the OPC.
Not well-founded
There was no or insufficient evidence to conclude the institution/organization contravened the privacy legislation.
Resolved
Under the Privacy Act, the investigation revealed that the complaint is essentially a result of a miscommunication, misunderstanding, etc., between parties; and/or the institution agreed to take measures to rectify the problem to the satisfaction of the OPC.
Settled
Our office helped negotiate a solution that satisfied all parties during the course of the investigation, and did not issue a finding.
Discontinued
Under the Privacy Act: The investigation was terminated before all the allegations were fully investigated. A case may be discontinued for various reasons, but not at the OPC’s behest. For example, the complainant may no longer be interested in pursuing the matter or cannot be located to provide additional information critical to reaching a conclusion.

Under PIPEDA: The investigation was discontinued without issuing a finding. An investigation may be discontinued at the Commissioner’s discretion for the reasons set out in subsection 12.2(1) of PIPEDA.
No jurisdiction
It was determined that federal privacy legislation did not apply to the institution/organization, or to the complaint’s subject matter. As a result, no report is issued.
Early resolution (ER)
Applied to situations in which the issue is resolved to the satisfaction of the complainant early in the investigation process and the office did not issue a finding.
Declined to investigate
Under PIPEDA, the Commissioner declined to commence an investigation in respect of a complaint because the Commissioner was of the view that:
  • the complainant ought first to exhaust grievance or review procedures otherwise reasonably available;
  • the complaint could be more appropriately dealt with by means of another procedure provided for under the laws of Canada or of a province; or,
  • the complaint was not filed within a reasonable period after the day on which the subject matter of the complaint arose, as set out in subsection 12(1) of PIPEDA.
Withdrawn
Under PIPEDA, the complainant voluntarily withdrew the complaint or could no longer be practicably reached. The Commissioner does not issue a report.

Appendix 2: Statistical tables

Statistical tables related to the Privacy Act

Table 1 - Privacy Act dispositions of access and privacy complaints by institution
Respondents | Discontinued | No Jurisdiction | Not Well-Founded | Resolved | Settled | Well-Founded | Well-Founded - Conditionally Resolved | Well-Founded - Resolved | Early Resolved | Total
Administrative Tribunals Support Service of Canada     3           1 4
Agriculture and Agri-Food Canada     1           1 2
Bank of Canada                 1 1
Canada Border Services Agency 1   11   1   5 4 27 49
Canada Energy Regulator                 1 1
Canada Mortgage and Housing Corporation                 1 1
Canada Post Corporation     2   1     2 8 13
Canada Revenue Agency 1   6 1 1 1 1 1 21 33
Canada School of Public Service       1       1   2
Canadian Air Transport Security Authority             1     1
Canadian Food Inspection Agency     2     1     1 4
Canadian Heritage               1   1
Canadian Security Intelligence Service     7   1       8 16
Canadian Transportation Agency     1             1
Civilian Review and Complaints Commission for the Royal Canadian Mounted Police 1   2             3
Communications Security Establishment Canada     1         1 1 3
Correctional Service Canada     9 1     1 2 23 36
Crown-Indigenous Relations and Northern Affairs Canada     8           3 11
Department of Justice Canada 1   2           2 5
Elections Canada / Office of the Chief Electoral Officer                 1 1
Employment and Social Development Canada     5 1       2 15 23
Environment and Climate Change Canada                 1 1
Export Development Canada                 1 1
Federal Public Service Labour Relations and Employment Board                 1 1
Financial Transactions and Reports Analysis Centre of Canada     1             1
Fisheries and Oceans Canada     1           2 3
Global Affairs Canada     1 5   1   1 4 12
Health Canada     3         1 5 9
Immigration and Refugee Board of Canada                 2 2
Immigration, Refugees and Citizenship Canada     2   1       22 25
Indigenous Services Canada     1           5 6
Innovation, Science and Economic Development Canada                 2 2
Military Police Complaints Commission 31                 31
National Defence     4 1 1 2 3 2 6 19
Office of the Public Sector Integrity Commissioner of Canada     1             1
Parole Board of Canada   1             3 4
Privy Council Office     1           1 2
Public Health Agency of Canada                 1 1
Public Prosecution Service of Canada                 1 1
Public Safety Canada     8 1   1 1 3 3 17
Public Service Commission of Canada     2         1 4 7
Public Services and Procurement Canada     2       2 2 73 79
Royal Canadian Mounted Police 2   26     1 2 6 42 79
Royal Canadian Mounted Police External Review Committee     1         1   2
Service Canada             1   1 2
Social Sciences and Humanities Research Council of Canada     2         1   3
Statistics Canada                 6 6
Transport Canada     2           1 3
Treasury Board of Canada Secretariat                 1 1
Veterans Affairs Canada     5         1 10 16
Total 37 1 123 11 6 7 17 33 313 548

Table 2 - Privacy Act treatment times – Early resolution cases by complaint type
Complaint type | Count | Average treatment time (months)
Privacy 185 3.51
  Accuracy 2 0.75
  Collection 29 2.44
  Retention and Disposal 4 0.57
  Use and Disclosure 150 3.83
Access 128 5.74
  Access 119 5.95
  Correction - Notation 9 2.89
Time limits 129 1.00
  Extension Notice 2 4.62
  Time Limits 127 0.94
Total 442 3.42

Table 3 - Privacy Act treatment times – All other investigations by complaint type
Complaint type | Count | Average treatment time (months)
Privacy 107 28.79
  Collection 17 32.20
  Retention and Disposal 11 39.03
  Use and Disclosure 79 26.63
Access 128 21.65
  Access 119 21.63
  Correction – Notation 5 29.98
  Denial of Access 4 11.72
Time Limits 178 5.04
  Correction – Time Limits 1 22.75
  Extension Notice 2 0.74
  Time Limits 175 4.99
Total 413 16.34

Table 4 - Privacy Act treatment times – All closed files by disposition
Disposition | Count | Average treatment time (months)
Early resolved 441 3.41
All other investigations 414* 16.32
  Discontinued 43 38.38
  No Jurisdiction 1 2.79
  Not Well-Founded 125 18.38
  Resolved 13 7.73
  Settled 6 24.53
  Well-Founded 10 14.46
  Well-Founded - Conditionally Resolved 45 16.96
  Well-Founded - Deemed Refusal 68 7.02
  Well-Founded - Resolved 103 11.39
Total 855 9.66
* The statistics on total number of complaints closed in 2020-21 are lower than previous years due to a change introduced in 2019-2020 in the way our office counts the number of complaints. As noted in last year’s annual report, we adjusted how we track and report on complaints and investigation findings. Since April 1, 2019, when an individual’s complaint about a single matter represents potential contraventions of multiple sections of the Privacy Act, or when an individual complains following multiple access requests made to one institution, we track and report these as a single complaint.

Table 5 - Privacy Act breaches by institution
Respondent | Incidents
Canada Border Services Agency 3
Canada Council for the Arts 1
Canada Economic Development for Quebec Regions 1
Canada Energy Regulator 5
Canada Post Corporation 1
Canada Revenue Agency 5
Canada School of Public Service 1
Canadian Institutes of Health Research 2
Canadian Museum for Human Rights 1
Canadian Security Intelligence Service 1
Correctional Service Canada 40
Department of Justice Canada 1
Employment and Social Development Canada 164
Environment and Climate Change Canada 1
Global Affairs Canada 8
Great Lakes Pilotage Authority Canada 1
Health Canada 1
Immigration, Refugees and Citizenship Canada 6
International Development Research Centre 1
National Defence 4
Non-Public Property and Staff of the Non-Public Funds, Canadian Forces 1
Privy Council Office 1
Public Health Agency of Canada 2
Public Sector Pension Investment Board 1
Public Service Commission of Canada 9
Royal Canadian Mounted Police 13
Shared Services Canada 1
Statistics Canada 1
Transport Canada 1
Veterans Affairs Canada 1
Windsor-Detroit Bridge Authority 1
Total 280

Table 6 - Privacy Act complaints and breaches
Category Total
Accepted
Privacy 281
Access 234
Time Limits 312
Total accepted 827
Closed through early resolution
Privacy 185
Access 128
Time limits 128
Total 441
Closed through all other investigations
Privacy 107
Access 128
Time limits 179
Total 414
Total closed 855
Breaches received
Unauthorized disclosure* 126
Loss 105
Theft 9
Unauthorized access 40
Total received 280
* In previous years, “Accidental Disclosure” was used by this office to reflect instances where personal information was disclosed outside of the provisions of the Privacy Act. This term has been changed to “Unauthorized Disclosure” to reflect the wording in TBS Guidelines for Privacy Breaches, but the meaning remains unchanged.

Table 7 - Privacy Act complaints accepted by complaint type
Complaint type | Early Resolution (Number / Percentage*) | Investigation (Number / Percentage*) | Total Number | Total percentage*
Access
Access 180 34% 41 14% 221 27%
Correction - Notation 12 2%     12 1%
Denial of Access     1 0% 1 0%
Time limits
Correction - Time Limits 1 0%     1 0%
Extension Notice 2 0% 2 1% 4 0%
Time Limits 128 24% 179 59% 307 37%
Privacy
Accuracy 3 1%     3 0%
Collection 34 6% 13 4% 47 6%
Retention and Disposal 7 1%     7 1%
Use and Disclosure 158 30% 66 22% 224 27%
Total 525 100% 302 100% 827 100%
* Figures may not sum to total due to rounding.

Table 8 - Privacy Act top 10 institutions by complaints accepted
Respondent | Privacy (Early Resolution / Investigation) | Access (Early Resolution / Investigation) | Time Limits (Early Resolution / Investigation) | Total
Royal Canadian Mounted Police 25 10 44 10 52 45 186
Correctional Service Canada 25 7 22 2 10 64 130
National Defence 5 5 11 2 12 16 51
Canada Border Services Agency 9 2 15 4 4 14 48
Immigration, Refugees and Citizenship Canada 22 2 11   7 5 47
Public Services and Procurement Canada 21 3 11 1 3 3 42
Employment and Social Development Canada 12 5 15 2 4 3 41
Canada Revenue Agency 9 5 10 2 9 5 40
Canada Post Corporation 10 2 6 1 3   22
Veterans Affairs Canada 9 2 3 2 3 1 20
Total 147 43 148 26 107 156 627

Table 9 - Privacy Act top 10 institutions by complaints accepted and fiscal year
Respondent 2017/18 2018/19 2019/20 2020/21
Royal Canadian Mounted Police 232 273 176 186
Correctional Service Canada 440 426 155 130
National Defence 93 121 33 51
Canada Border Services Agency 76 109 42 48
Immigration, Refugees and Citizenship Canada 29 59 44 47
Public Services and Procurement Canada 49 27 70 42
Employment and Social Development Canada 24 39 25 41
Canada Revenue Agency 63 79 63 40
Canada Post Corporation 33 29 4 22
Veterans Affairs Canada 12 20 12 20
Total 1051 1182 624 627

Table 10 - Privacy Act complaints accepted by institution
Respondent Early resolution Investigation Total
Administrative Tribunals Support Service of Canada 4 5 9
Agriculture and Agri-Food Canada 1 1 2
Atlantic Canada Opportunities Agency   1 1
Atomic Energy of Canada Limited 1   1
Bank of Canada 2 1 3
Canada Border Services Agency 28 20 48
Canada Employment Insurance Commission 1   1
Canada Energy Regulator   1 1
Canada Post Corporation 19 3 22
Canada Revenue Agency 28 12 40
Canada School of Public Service 1 6 7
Canadian Air Transport Security Authority   1 1
Canadian Broadcasting Corporation 1 1 2
Canadian Food Inspection Agency 2 1 3
Canadian Heritage 1 1 2
Canadian Human Rights Commission 2 1 3
Canadian Nuclear Safety Commission 1   1
Canadian Radio-Television and Telecommunications Commission   1 1
Canadian Security Intelligence Service 10 6 16
Canadian Transportation Agency   1 1
Civilian Review and Complaints Commission for the Royal Canadian Mounted Police 1 1 2
Communications Security Establishment Canada 3 2 5
Correctional Service Canada 59 74 133
Crown-Indigenous Relations and Northern Affairs Canada 4   4
Department of Justice Canada 6   6
Elections Canada / Office of the Chief Electoral Officer 3 1 4
Employment and Social Development Canada 31 10 41
Environment and Climate Change Canada 3 4 7
Export Development Canada 2   2
Federal Economic Development Agency for Southern Ontario   1 1
Federal Public Service Labour Relations and Employment Board 1   1
Fisheries and Oceans Canada 2 1 3
Global Affairs Canada 7 11 18
Health Canada 6 1 7
Immigration and Refugee Board of Canada 5   5
Immigration, Refugees and Citizenship Canada 40 7 47
Indigenous Services Canada 7 2 9
Innovation, Science and Economic Development Canada 2 4 6
Library and Archives Canada 2 1 3
National Defence 28 23 51
Natural Resources Canada   1 1
Office of the Information Commissioner of Canada   1 1
Parks Canada Agency 2 1 3
Parole Board of Canada 8 1 9
Privy Council Office 1 2 3
Public Health Agency of Canada 1 1 2
Public Prosecution Service of Canada 1   1
Public Safety Canada 3 2 5
Public Service Commission of Canada 4 1 5
Public Services and Procurement Canada 35 7 42
Royal Canadian Mounted Police 121 65 186
Service Canada 1   1
Shared Services Canada 3 1 4
Statistics Canada 9   9
Telefilm Canada 1   1
Trans Mountain Corporation   1 1
Transport Canada 4 2 6
Transportation Safety Board of Canada   1 1
Treasury Board of Canada Secretariat 2 3 5
Veterans Affairs Canada 15 5 20
Western Economic Diversification Canada   1 1
Total 525 302 827

Table 11 - Privacy Act complaints accepted by province, territory or other
Province/territory or other | Early resolution (Number / Percentage*) | Investigation (Number / Percentage*) | Total Number | Total percentage*
British Columbia 136 25.90% 71 23.51% 207 25.03%
Alberta 36 6.86% 21 6.95% 57 6.89%
Saskatchewan 15 2.86% 1 0.33% 16 1.93%
Manitoba 21 4.00% 11 3.64% 32 3.87%
Ontario 166 31.62% 96 31.79% 262 31.68%
Quebec 86 16.38% 69 22.85% 155 18.74%
New Brunswick 22 4.19% 12 3.97% 34 4.11%
Nova Scotia 30 5.71% 10 3.31% 40 4.84%
Prince Edward Island 2 0.38%   0.00% 2 0.24%
Newfoundland and Labrador 2 0.38% 3 0.99% 5 0.60%
Nunavut   0.00% 2 0.66% 2 0.24%
Northwest Territories   0.00% 2 0.66% 2 0.24%
Yukon 1 0.19%   0.00% 1 0.12%
United States 2 0.38% 2 0.66% 4 0.48%
Other (Not US) 3 0.57%   0.00% 3 0.36%
Not specified 3 0.57% 2 0.66% 5 0.60%
Total 525 100% 302 100.00% 827 100%
* Figures may not sum to total due to rounding.

Table 12 - Privacy Act dispositions by complaint type
Complaint type | Discontinued | No Jurisdiction | Not Well-Founded | Resolved | Settled | Well-Founded | Well-Founded - Conditionally Resolved | Well-Founded - Deemed Refusal | Well-Founded - Resolved | Early Resolved | Total
Privacy
Accuracy                   2 2
Collection 5   10 1     1     29 46
Retention and Disposal 7   2     1     1 4 15
Use and Disclosure 9   41 6 1 5 7   10 150 229
Access
Access 15 1 63 4 5 1 9   21 119 238
Correction – Notation     4           1 9 14
Denial of Access 1   3               4
Time Limits
Correction – Time Limits                 1   1
Extension Notice       1         1 2 4
Time Limits 6   2 1   3 28 68 68 126 302
Total 43 1 125 13 6 10 45 68 103 441 855

Table 13 - Privacy Act dispositions of time limits by institution
Respondent | Discontinued | Not Well-Founded | Resolved | Well-Founded | Well-Founded - Conditionally Resolved | Well-Founded - Deemed Refusal | Well-Founded - Resolved | Early Resolved | Total
Administrative Tribunals Support Service of Canada         1     2 3
Bank of Canada         1   1   2
Canada Border Services Agency 1       1 2 4 4 12
Canada Post Corporation             1 3 4
Canada Revenue Agency   1     2   6 9 18
Canadian Human Rights Commission         1       1
Canadian Security Intelligence Service     1         2 3
Communications Security Establishment Canada             2 1 3
Correctional Service Canada 2   1 3 5 52 9 11 83
Crown-Indigenous Relations and Northern Affairs Canada             1 1 2
Department of Justice Canada           1   1 2
Elections Canada / Office of the Chief Electoral Officer               1 1
Employment and Social Development Canada           1 2 3 6
Environment and Climate Change Canada           1   1 2
Export Development Canada               1 1
Global Affairs Canada         2 1   1 4
Health Canada               2 2
Immigration and Refugee Board of Canada               2 2
Immigration, Refugees and Citizenship Canada             4 7 11
Indigenous Services Canada             1 1 2
Innovation, Science and Economic Development Canada         1       1
Library and Archives Canada             1 1 2
National Defence   1     8 2 7 12 30
Natural Sciences and Engineering Research Council of Canada         1   1   2
Parole Board of Canada               2 2
Public Services and Procurement Canada 2         1   2 5
Royal Canadian Mounted Police 1       5 7 29 53 95
Shared Services Canada               1 1
Transport Canada               1 1
Treasury Board of Canada Secretariat               1 1
Veterans Affairs Canada             1 2 3
Total 6 2 2 3 28 68 70 128 307

Statistical tables related to PIPEDA

Table 1 - PIPEDA complaints accepted* by industry sector
Industry sector | Number | Proportion of all complaints accepted*
Accommodations 12 4%
Construction 2 1%
Entertainment 1 0%
Financial 73 24%
Food and beverage 1 0%
Government 1 0%
Health 1 0%
Insurance 16 5%
Internet 27 9%
Manufacturing 9 3%
Not for profit organizations 1 0%
Not specified 1 0%
Professionals 10 3%
Publishers (except internet) 12 4%
Sales/Retail 38 12%
Services 47 15%
Telecommunications 32 10%
Transportation 22 7%
Utilities 3 1%
Total 309 100%
* Figures may not sum to total due to rounding.

Table 2 - PIPEDA complaints accepted* by complaint type
Complaint type | Number | Percentage of all complaints accepted*
Access 105 34%
Accuracy 3 1%
Collection 42 14%
Consent 35 11%
Correction/Notation 2 1%
Retention 8 3%
Safeguards 43 14%
Use and Disclosure 71 23%
Total 309 100%
* Figures may not sum to total due to rounding.

Table 3 - PIPEDA investigations closed by industry sector and disposition
Sector category | Early Resolved | Discontinued (under 12.2) | No Jurisdiction | Not well-founded | Settled | Well-Founded | Well-founded - conditionally resolved | Well-founded - resolved | Withdrawn | Total
Accommodations 7 1   1         1 10
Construction 1                 1
Entertainment 1                 1
Financial 53 1   4 3 3 3 7 3 77
Food and beverage 2                 2
Government 1                 1
Health 2           1     3
Insurance 7   1 1 1     1   11
Internet 22 1   1 1 3 2   1 31
Manufacturing 6             1 4 11
Not for profit organizations   1         1     2
Professionals 4 1   1 1       2 9
Publishers (except Internet) 3           1   1 5
Rental 2                 2
Sales/Retail 25     1 1 1   2   30
Services 30     1   2   2 2 37
Telecommunications 24     3     2 3 2 34
Transportation 17     1   2   4 1 25
Utilities 3     1           4
Total 210 5 1 15 7 11 10 20 17 296

Table 4 - PIPEDA investigations closed by complaint type and disposition
Complaint type | Early Resolved | Discontinued (under 12.2) | No Jurisdiction | Not well-founded | Settled | Well-Founded | Well-founded - conditionally resolved | Well-founded - resolved | Withdrawn | Total
Access 79 2 1 4   3 2 7 3 101
Accountability         1         1
Accuracy 2                 2
Appropriate purposes           1   1 1 3
Collection 29 1   2   1   2 2 37
Consent 24 1   7 3 3 2 4 2 46
Correction/Notation 1                 1
Openness 1                 1
Retention 11     1 2       1 15
Safeguards 16     1 1   6 5 6 35
Use and disclosure 47 1       3   1 2 54
Total 210 5 1 15 7 11 10 20 17 296

Table 5 - PIPEDA investigations – Average treatment time by disposition
Disposition | Number | Average treatment time in months
Early resolved 210 7.5
Discontinued (under 12.2) 5 15.2
No Jurisdiction 1 10.5
Not well-founded 15 20.1
Settled 7 28.4
Well-founded 11 21.5
Well-founded - conditionally resolved 10 28.7
Well-founded - resolved 20 28.0
Withdrawn 17 22.8
Total 296  
Overall weighted average   12.2

Table 6 - PIPEDA investigations - Average treatment times by complaint and disposition types
Complaint type | Early resolved (Number of cases / Average treatment time in months) | Dispositions not early resolved (Number of cases / Average treatment time in months) | All dispositions (Number of cases / Average treatment time in months)
Access 79 8.0 22 20.5 101 10.8
Accountability     1 57.6 1 57.6
Accuracy 2 6.4     2 6.4
Appropriate purposes     3 55.0 3 55.0
Collection 29 6.0 8 22.1 37 9.5
Consent 24 9.9 22 25.9 46 17.5
Correction/Notation 1 8.8     1 8.8
Openness 1 8.3     1 8.3
Retention 11 6.2 4 24.9 15 11.2
Safeguards 16 6.1 19 21.1 35 14.2
Use and disclosure 47 6.9 7 19.7 54 8.6
Total 210 7.5 86 23.9 296 12.2

Table 7 - PIPEDA breach notifications by industry sector and incident type
Sector | Incident type: Loss / Theft / Unauthorized access / Unauthorized disclosure* | Total incidents per sector | Percentage of total incidents**
Accommodation   3 3 3 9 1%
Agriculture, Forestry, Fishing and Hunting     5   5 1%
Construction     4 1 5 1%
Entertainment   1 5   6 1%
Financial 10 11 73 77 171 22%
Food and beverage     7   7 1%
Government 1   6 1 8 1%
Health 1 2 12 14 29 4%
Insurance 7 4 23 38 72 9%
Internet     16 2 18 2%
Manufacturing 1 1 42 3 47 6%
Mining and oil and gas extraction     2 1 3 0%
Not for profit organizations   1 55 15 71 9%
Not specified 1   3 1 5 1%
Professionals   6 44 15 65 8%
Publishers (except Internet)     9 1 10 1%
Sales/retail   4 58 13 75 10%
Services 2 3 32 7 44 6%
Telecommunications     89 22 111 14%
Transportation     14 3 17 2%
Utilities   1   3 4 1%
Total 23 37 502 220 782 100%
* In previous years, “Accidental Disclosure” was used by this office to reflect instances where personal information was disclosed outside of the provisions of PIPEDA, either intentionally or accidentally. This term has been changed to “Unauthorized Disclosure” to reflect the wording of PIPEDA, but the meaning remains unchanged.
** Figures may not sum to total due to rounding.

Table 8 - Number of Canadian accounts affected by incident type
Incident type | Number of Canadian accounts affected
Loss 15,312
Theft 5,495
Unauthorized access 8,957,786
Unauthorized disclosure* 110,514
Total 9,089,107
* In previous years, “Accidental Disclosure” was used by this office to reflect instances where personal information was disclosed outside of the provisions of PIPEDA, either intentionally or accidentally. This term has been changed to “Unauthorized Disclosure” to reflect the wording of PIPEDA, but the meaning remains unchanged.


Appendix 3: Investigation processes

Privacy Act investigation process

Figure 2: Privacy Act investigation process: see text version.

Figure 3: Privacy Act investigation process: see text version.

Intake

Individuals make written submissions to our Office about alleged violations of the Privacy Act. Our Intake Unit reviews the matter to determine whether it constitutes a complaint, i.e., whether the allegations could constitute a contravention of the Act, and the most efficient manner in which to resolve it.

An individual may complain about any matter specified in section 29 of the Privacy Act, for example:

  • denial of access or unacceptable delay in providing access to his or her personal information held by an institution;
  • improper collection, use or disclosure of personal information; or
  • inaccuracies in personal information used or disclosed by an institution.

It is sometimes possible to immediately address issues, eliminating the need for our Office to pursue the matter as a standard investigation. In these cases, we simply resolve the matter through early resolution. The Privacy Commissioner may also initiate a complaint if satisfied there are reasonable grounds to investigate a matter.

  • Complaint
    • No:
      The individual is advised, for example, that the matter is not in our jurisdiction.
    • Yes:
      An investigator is assigned to the case.
      • Early resolution
        A complaint may be resolved before a standard investigation is undertaken if, for example, the issue has already been fully dealt with in another investigation and the institution has ceased the practice or the practice does not contravene the Act.
      • Standard investigation
        The investigation provides the factual basis for the Commissioner to determine whether the individual’s rights under the Privacy Act have been contravened.

        The investigator writes to the institution, outlining the substance of the complaint. The investigator gathers the facts related to the complaint through representations from both parties and through independent inquiry, interviews of witnesses, and review of documentation.

        Through the Commissioner or his delegate, the investigator has the authority to receive evidence, enter premises where appropriate, and examine or obtain copies of records found on any premises.
        • Discontinued
          A complaint may be discontinued if, for example, a complainant decides not to pursue it, or a complainant cannot be located.
        • Settled
          The OPC seeks to resolve complaints and to prevent contraventions from recurring. The Commissioner encourages resolution through negotiation and persuasion. The investigator assists in this process.
      • Analysis
        The investigator analyzes the facts and prepares recommendations to the Commissioner or his delegate. The investigator will contact the parties as necessary and review the facts gathered during the course of the investigation. The investigator may also tell the parties what he or she will be recommending, based on the facts, to the Commissioner or his delegate. At this point, the parties may make further representations.

        Analysis will include internal consultations with various directorates, for example, Legal Services, Policy, Research and Parliamentary Affairs and Technology Analysis, as appropriate.
        • Findings
          The Commissioner or his delegate reviews the file and assesses the report. The Commissioner or his delegate, not the investigator, decides what the appropriate outcome should be and whether recommendations to the institution are warranted.

          The Commissioner or his delegate sends letters of findings to the parties. The letters outline the basis of the complaint, the relevant findings of fact, the analysis, and any recommendations to the institution. The Commissioner or his delegate may ask the institution to respond in writing, within a particular timeframe, outlining its plans for implementing any recommendations.

          The possible findings are:
          • Not well-founded: The evidence, on balance, does not lead the Commissioner or his delegate to conclude that the complainant’s rights under the Act have been contravened.
          • Well-founded: The institution failed to respect a provision of the Act.
          • Well-founded, resolved: The investigation substantiated the allegations and the institution has agreed to take corrective measures to rectify the problem.
          • Resolved: The evidence gathered in the investigation supports the allegations raised in the complaint, but the institution agreed to take corrective measures to rectify the problem, to the satisfaction of this Office. The finding is used for those complaints in which “well-founded” would be too harsh to fit what essentially is a miscommunication or misunderstanding.

            In the letter of findings, the Commissioner or his delegate informs the complainant of his or her rights of recourse to the Federal Court on matters of denial of access to personal information.
            • Where recommendations have been made to an institution, OPC staff will follow up to verify that they have been implemented.
            • The complainant or the Commissioner may choose to apply to the Federal Court for a hearing of the denial of access. The Federal Court has the power to review the matter and determine whether the institution must provide the information to the requester.

PIPEDA investigation process

Figure 4: PIPEDA investigation process: see text version.

Figure 5: PIPEDA investigation process: see text version.

Intake

Individuals make written complaints to the OPC about violations of the Act. Our Intake Unit reviews these complaints, and, if necessary, follows up with complainants to seek clarification and gather additional information.

If complainants have not raised their concerns directly with the organization, we will ask them to do so in order to try to resolve the issue directly and then to come back to us if they are unsuccessful.

The Intake Unit is also sometimes able to immediately address issues. For example, if we have previously investigated the type of issue being raised and have determined that the activities are compliant with PIPEDA, an intake officer will explain this to the individual. Or, if we have previously determined that we do not have jurisdiction over the organization or type of activity, an intake officer will explain this and, where appropriate, refer the individual to other resources or sources of assistance.

In cases where the Intake Unit is not able to immediately address issues (and once the necessary information is gathered), the matter is accepted by our Office as a formal complaint. The Privacy Commissioner may also initiate a complaint if satisfied there are reasonable grounds to investigate a matter.

  • Complaint declined
    The Commissioner may decide to decline to investigate a complaint if certain conditions under subsection 12(1) of the Act are met. The complainant may request that the Commissioner reconsider this decision.
  • Sent to investigation
    Complaints of a serious, systemic or otherwise complex nature, for example, uncertain jurisdictional matters, multiple allegations or complex technical issues, are assigned to an investigator.
    • Investigation
      Investigations provide the factual basis for the Commissioner to determine whether individuals’ rights have been contravened under PIPEDA.

      The investigator writes to the organization, outlining the substance of the complaint. The investigator gathers the facts related to the complaint through representations from both parties and through independent inquiry, interviews of witnesses, and review of documentation. Through the Commissioner or his delegate, the investigator has the authority to receive evidence, enter premises where appropriate, and examine or obtain copies of records found on any premises.
      • Analysis
The investigator analyzes the facts and prepares recommendations to the Commissioner or his delegate.

        The investigator will contact the parties and review the facts gathered during the course of the investigation. The investigator will also advise the parties of his or her recommendations, based on the facts, to the Commissioner or his delegate. At this point, the parties may make further representations.

        Analysis will include internal consultations with various directorates, for example, Legal Services, Policy, Research and Parliamentary Affairs and Technology Analysis, as appropriate.
        • No jurisdiction
          The OPC determines that PIPEDA does not apply to the organization or activities being complained about.
        • Findings
          The Commissioner or his delegate reviews the file and assesses the report. The Commissioner or his delegate (not the investigator) decides what the appropriate outcome should be and whether recommendations to the organization are warranted.
        • Preliminary report
          If the results of the investigation indicate that there likely has been a contravention of PIPEDA, the Commissioner or his delegate recommends to the organization how to remedy the matter, and asks the organization to indicate within a set time period how it will implement the recommendation.
        • Final report and letters of findings
          The Commissioner or his delegate sends letters of findings to the parties. The letters outline the basis of the complaint, the relevant findings of fact, the analysis, and the response of the organization to any recommendations made in the preliminary report.

          (The possible findings are described in Appendix 1 – Definitions)

          In the letter of findings, the Commissioner or his delegate informs the complainant of his or her rights of recourse to the Federal Court.
          • Where recommendations have been made to an organization but have not yet been implemented, the OPC will ask the organization to keep us informed, on a predetermined schedule after the investigation, so that we can assess whether corrective action has been taken.
          • The complainant or the Commissioner may choose to apply to the Federal Court for a hearing of the matter. The Federal Court has the power to order the organization to correct its practices. The Court can award damages to a complainant, including damages for humiliation. There is no ceiling on the amount of damages.
        • Settled
          The OPC seeks to resolve complaints and to prevent contraventions from recurring. The OPC helps negotiate a solution that satisfies all involved parties during the course of the investigation. The investigator assists in this process.
        • Discontinued
A complaint may be discontinued if, for example, a complainant decides not to pursue it or cannot be located, or if certain conditions, described in subsection 12.2(1) of the Act, are met.
  • Sent to early resolution officer
    Complaints which we believe could potentially be resolved quickly are sent to an early resolution officer. These complaints include matters where our Office has already made findings on the issues; where the organization has already dealt with the allegations to our satisfaction; or where it seems possible that allegations can be easily remedied.
    • Transferred to investigation
      If early resolution is unsuccessful, the case is transferred to an investigator.
    • Early resolution
      Early resolution officers encourage resolutions through mediation, negotiation and persuasion.

Appendix 4: Substantially similar legislation

Subsection 25(1) of PIPEDA requires our office to report annually to Parliament on the “extent to which the provinces have enacted legislation that is substantially similar” to the Act.

Under paragraph 26(2)(b) of PIPEDA, the Governor in Council may issue an Order exempting an organization, a class of organizations, an activity or a class of activities from the application of PIPEDA with respect to the collection, use or disclosure of personal information that occurs within a province that has passed legislation that is “substantially similar” to PIPEDA.

On August 3, 2002, Industry Canada (now known as Innovation, Science and Economic Development Canada) published the Process for the Determination of “Substantially Similar” Provincial Legislation by the Governor in Council, outlining the policy and criteria used to determine whether provincial legislation will be considered substantially similar. Under the policy, laws that are substantially similar:

  • provide privacy protection that is consistent with and equivalent to that in PIPEDA;
  • incorporate the 10 principles in Schedule 1 of PIPEDA;
  • provide for an independent and effective oversight and redress mechanism with powers to investigate; and
  • restrict the collection, use and disclosure of personal information to purposes that are appropriate or legitimate.

Organizations that are subject to provincial legislation deemed substantially similar are exempt from PIPEDA with respect to the collection, use or disclosure of personal information occurring within the respective province. Accordingly, PIPEDA continues to apply to the collection, use or disclosure of personal information in connection with the operations of a federal work, undertaking or business in the respective province, as well as to the collection, use or disclosure of personal information outside the province.

The following provincial laws have been declared substantially similar to PIPEDA:

  • Quebec’s An Act Respecting the Protection of Personal Information in the Private Sector;
  • British Columbia’s Personal Information Protection Act;
  • Alberta’s Personal Information Protection Act;
  • Ontario’s Personal Health Information Protection Act, with respect to health information custodians;
  • New Brunswick’s Personal Health Information Privacy and Access Act, with respect to health information custodians;
  • Newfoundland and Labrador’s Personal Health Information Act, with respect to health information custodians; and
  • Nova Scotia’s Personal Health Information Act, with respect to health information custodians.

Appendix 5: Report of the Privacy Commissioner, Ad Hoc

It has, without question, been a trying year for everyone in all manner of things. Despite these challenges, and some initial delays, the Office of the Privacy Commissioner of Canada (OPC) established safe and secure protocols for the transmission of records, and I continued my work with resolve. In a few cases, individuals sought an “appeal” of the outcomes of OPC privacy complaint investigations involving other public bodies; I could not act in such cases. Nonetheless, to be of service to those individuals, I took the time to explain my role as Ad hoc Commissioner and why I could not entertain their complaints. At the same time, I informed the OPC of their concerns to ensure that the OPC could address them directly where necessary.

I reviewed a total of 6 matters from April 1, 2020 to March 31, 2021: 3 were cases that I could not accept as valid complaints, and 3 were cases that required full investigations and corresponding Reports of Findings. In the complaints investigated, individuals were seeking access to personal information derived from or found in OPC privacy complaint investigations to which they were a party. In each of the Reports of Findings issued this year, I found that the OPC had lawfully refused access to some of the personal information requested, in accordance with the rules under the Privacy Act. This meant there was no need for me to issue recommendations for additional disclosure.

These cases were interesting: the personal information sought by the individuals (the requesters) concerned them, but it also concerned third party individuals who were likewise associated with the same OPC investigation files.

Where a request for access to personal information has been submitted to the OPC and the OPC has refused access, the requester has the right to file a complaint with the Ad hoc Commissioner. At first blush, many complained to me that being refused access did not seem right. As I had dealt with similar issues in the past, I thought it appropriate, for the purpose of this Annual Report, to highlight the rules by which access to the requested personal information is determined.

Personal information as defined in the Privacy Act is recorded information which can identify an individual. Accordingly, its scope is wide-ranging: an individual’s age, race, religion or education; an individual’s financial or employment history; private or confidential correspondence that an individual has sent to a government institution; or even views or opinions about an individual that have been expressed by others, and so on. This definition broadly encapsulates most information that can identify an individual.

The Act grants every person the right to request access to their personal information found in a record under the control of a government institution, which includes the OPC. This is an important and broad right of access to one’s personal information; however, the Act also states that this right is not absolute. In some cases, the right of access can be lawfully limited, thereby allowing the OPC to withhold certain personal information. This is a difficult concept, as it runs counter to our natural inclination to believe that we should be able to obtain any information about ourselves.

In fact, the right of access provision defines the scope of the information an individual is entitled to access: personal information about the individual who is requesting it, but not the personal information of other individuals. Additionally, in specific instances, access can be further restricted. For instance, access to personal information contained in the files of public law enforcement and legal investigation authorities is limited and will depend on the identity of the person requesting it, the subject matter under investigation, and the timing of the request itself (whether during or at the conclusion of the law enforcement process or legal investigation).

The Privacy Act (as does section 16.1 of the Access to Information Act) recognizes this fact in section 22.1:

22.1 (1) The Privacy Commissioner shall refuse to disclose any personal information requested under this Act that was obtained or created by the Commissioner or on the Commissioner’s behalf in the course of an investigation conducted by, or under the authority of, the Commissioner or that was obtained by the Commissioner in the course of a consultation with the Information Commissioner under subsection 36(1.1) or section 36.2 of the Access to Information Act.

Subsection (1) above spells out that any information the OPC obtains and records in its investigation files can never be disclosed. In other words, the OPC has no choice but to refuse access to that type of requested information. Subsection 22.1(2), on the other hand, operates as an exception to that rule:

22.1(2) However, the Commissioner shall not refuse under subsection (1) to disclose any personal information that was created by the Commissioner or on the Commissioner’s behalf in the course of an investigation conducted by, or under the authority of, the Commissioner once the investigation and all related proceedings, if any, are finally concluded.

The key words are “created” and “obtained”: subsection (2) focuses upon information that was created during an investigation, as opposed to information that was obtained, as in subsection (1) above. This means that at the conclusion of an OPC investigation, disclosure of personal information may take place upon request by the person to whom the information relates (provided, of course, that no other exception limits or restricts that access right). The OPC remains the correct source to determine access rights to the information created during investigations.

Another observation worth noting is the source where the requested information is found. The OPC has separate divisions which carry out various work, including investigations. An individual requesting access to their personal information will submit the request to the OPC Director of ATIP (Access to Information and Privacy Division). Meanwhile, the OPC has a separate complaint investigation unit which is referred to as the OPC Compliance Directorate.

As the OPC Director of ATIP processes the request, records received from the other division (OPC Compliance Directorate) will be processed in accordance with the rules for access to personal information. This means that any information the OPC Compliance Directorate obtained during the investigation of the privacy complaint is considered “information obtained”, and for that reason, can never be disclosed to the individual requesting it, even where that information is the requester’s own personal information. In those same records, personal information that the OPC Compliance Directorate created during its investigation is considered “information created” and for that reason, can be disclosed to the requester when the investigation is concluded, provided it is the personal information of the requester and not that of a third party.

Finally, I mention the operation of section 26 of the Act. This exception prohibits access, without consent, to personal information that concerns an individual other than the requester. The giving of consent is a commonly understood prerequisite under privacy laws for the sharing of one’s information. Nonetheless, individuals have complained of being denied access to the personal information of others involved in privacy complaint investigations that relate to them. The OPC has a persistent duty to protect the privacy of third parties, even where doing so means denying access.

With a view to informing the public, here is a summary of the application of the important rules regarding access to personal information found in the investigation files of the OPC:

  • the OPC Director of ATIP considers the OPC Compliance Directorate (complaint investigation division) as a separate source for the purpose of applying access to personal information rules when processing requests;
  • personal information of the requester that was obtained by the OPC Compliance Directorate in the course of an investigation can never be disclosed by the OPC Director of ATIP; and
  • personal information of the requester that was created by the OPC Compliance Directorate can be disclosed by the OPC Director of ATIP, provided that the OPC investigation and all related proceedings are concluded, and further provided it does not constitute the personal information of another individual.

In closing, I hope that my Annual Report has provided some useful insights into the rules regarding access to one’s personal information from the files held by the OPC. I look forward to continuing to be of service to those who will seek my assistance in the coming year.

Respectfully submitted,

Anne E. Bertrand, Q.C.
Ad hoc Commissioner


Appendix 6: Review of the Financial Transactions and Reports Analysis Centre of Canada

Section 72(2) of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act

Final Report 2021



Main points

What we examined

Pursuant to subsection 72(2) of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (“PCMLTFA”), the Privacy Commissioner is required to conduct a biennial review of the measures taken by the Financial Transactions and Reports Analysis Centre of Canada (“FINTRAC”) to protect the information it receives or collects.

We reviewed FINTRAC’s policies and procedures related to administrative, physical and information technology (“IT”) security controls. Because FINTRAC’s electronic data holdings are housed on the Shared Services Canada (“SSC”) IT infrastructure, our review covered relevant IT controls in place at SSC and FINTRAC to protect FINTRAC personal information holdings from inappropriate or unauthorized use and disclosure. We also interviewed FINTRAC and SSC officials responsible for implementing and monitoring security controls. In conducting this review, we assessed FINTRAC’s measures against the Treasury Board Secretariat (“TBS”) Policy on Government Security and the Directive on Security Management.

Our office also assessed the progress made by FINTRAC in response to the recommendations from our 2017 report.

What we found

Based on our examination of key documents describing the processes and practices that support FINTRAC’s integrated security framework, we found no indications of inadequacies in most of the security measures we reviewed to protect the personal information FINTRAC acquires. However, we found two deficiencies that our Office recommends FINTRAC address on a priority basis.

First, in 2017 we observed that activity logs were not regularly monitored for indications of inappropriate access, use or disclosure; we are concerned that this observation has not yet been fully or satisfactorily addressed. Some progress has been made since 2017, but current log management practices do not provide adequate monitoring to detect or prevent potential unauthorized access by FINTRAC employees. We recommend the implementation of measures against insider threats.

Second, FINTRAC’s Business Continuity Plan (“BCP”) does not address the protection of personal information. We recommend FINTRAC update its BCP so that personal information holdings are continuously protected.

In response, FINTRAC accepted the OPC’s recommendations to address these two concerns, though, for the first, on a longer timeline than the OPC recommended. However, we remain concerned because FINTRAC did not complete the implementation of several of the recommendations from our 2017 review, despite its commitments at the time. Further, it has not committed to completing the implementation of the outstanding elements of our 2017 recommendations within 12 months, as we have again recommended.

Introduction

  1. FINTRAC is responsible for facilitating the detection, prevention, and deterrence of money laundering, terrorist activity financing and other threats to the security of Canada. Created in 2001, FINTRAC is an independent agency reporting to the Minister of Finance, who is accountable to Parliament for the activities of the institution. It was established by, and operates under, the legislative authority of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (“PCMLTFA”, or “the Act”) and its regulations.
  2. FINTRAC produces financial intelligence to support investigations by Canada’s police, law enforcement and national security agencies in relation to money laundering, terrorist activity financing and threats to the security of Canada. FINTRAC also generates strategic financial intelligence, including specialized research reports and trends analysis.
  3. Under the PCMLTFA, specified businesses are required to collect information about their clients and financial transactions and to report some or all of this information to FINTRAC under prescribed circumstances. Entities who are required to report to FINTRAC include:
    • Financial entities such as banks, credit unions, caisses populaires, financial services cooperatives, credit union centrals, trust companies, loan companies
    • Life insurance companies, brokers and agents
    • Securities dealers
    • Money services businesses
    • Agents of the Crown that accept deposit liabilities
    • Accountants and accounting firms
    • Real estate brokers or sales representatives
    • Casinos
    • Dealers in precious metals and stones
    • British Columbia notaries
  4. FINTRAC receives the following types of reports from these entities, all of which contain sensitive personal information:
    • Cross Border Currency and Seizure Reports (“CBCR” and “CBSR”)
    • Suspicious Transaction Reports (“STR”)
    • Large Cash Transaction Reports (“LCTR”)
    • Electronic Funds Transfer Reports (“EFTR”)
    • Casino Disbursement Reports (“CDR”)
    • Voluntary Information Records (“VIR”)
    • Terrorist Property Reports (“TPR”)
  5. A variety of large but potentially legitimate purchases might be reported to FINTRAC as a matter of course. LCTRs, EFTRs and STRs submitted to FINTRAC could include transactions such as down payments and mortgage arrangements for home purchases, car, boat and recreational vehicle purchases, funds sent to family members abroad and money received by international students studying in Canada. FINTRAC analyzes this financial information to determine whether there are reasonable grounds to suspect that the information is relevant to the investigation or prosecution of a money laundering or a terrorist financing offence.
  6. If this is the case, FINTRAC must disclose designated information to the appropriate police force at the federal, provincial or municipal level, to be used in the investigation of money laundering and/or terrorist activity financing. When FINTRAC has reasonable grounds to suspect the information to be disclosed is relevant to threats to the security of Canada, it must disclose the information to the Canadian Security Intelligence Service.
  7. Further, if FINTRAC also meets other specific legal thresholds set out under the PCMLTFA, the institution must disclose designated information to additional disclosure recipients, including the Canada Border Services Agency, the Canada Revenue Agency, the Communications Security Establishment, provincial securities regulators, the Department of National Defence and the Canadian Forces, the Competition Bureau, and Revenu Québec. FINTRAC may also disclose information to foreign financial intelligence units in certain circumstances.
  8. Ownership of the infrastructure supporting FINTRAC core systems and electronic data holdings was transferred to Shared Services Canada (SSC) in 2012. The two institutions have a shared responsibility relating to the protection of systems and personal information received, collected and electronically managed in the context of FINTRAC’s mandated activities. SSC is responsible under the Shared Services Canada Act for providing IT infrastructure. FINTRAC is responsible for ensuring the personal information it collects or receives is managed and protected in compliance with both the Privacy Act and the PCMLTFA.

Objective and approach of this review

  1. The PCMLTFA mandates the Privacy Commissioner of Canada to: “review the measures taken by the Centre to protect information it receives or collects.”
  2. Accordingly, we conducted a review of relevant documentation, including policies, standards, guidelines, frameworks and business processes, and interviewed key officials. Consistent with the scope of a review engagement, we did not undertake detailed examinations of the implementation of policies and processes, or conduct any testing of systems.
  3. The review objective was to assess whether FINTRAC has appropriate controls in place to protect the personal information it collects and retains, which resides on SSC’s IT infrastructure. The review examined governance, risk management and control practices for managing the security of personal information at FINTRAC.

Previous findings

  1. As noted above, the PCMLTFA mandates the Privacy Commissioner of Canada to undertake a biennial review of the measures taken by FINTRAC to protect the information it receives or collects and to report on those measures to Parliament. This is the fourth such review undertaken by the Commissioner.Footnote 1
  2. Our 2017 report found the following deficiencies with respect to measures taken to protect information FINTRAC collects and receives:
    • no Security Assessment and Authorization had been completed by SSC for the infrastructure supporting FINTRAC core systems transferred to SSC;
    • the lack of an agreement between FINTRAC and SSC that clearly defined IT security roles and responsibilities and relevant privacy and security clauses for the protection of FINTRAC information holdings; and,
    • FINTRAC did not regularly monitor activity logs for IT systems to identify inappropriate access, use, or disclosure of personal information.
  3. FINTRAC agreed with the recommendations and committed to the implementation of specific actions to address them. Our current review assessed the progress made by FINTRAC in addressing these findings.

Security Assessment and Authorization

  1. We are satisfied that FINTRAC adequately remediated this deficiency with respect to the infrastructure on which FINTRAC data resides. Specifically, FINTRAC formally assessed and accredited its legacy infrastructure, which had not changed significantly since infrastructure operations were transferred to SSC. SSC also completed a Security Assessment and Authorization process to update certification and accreditation of the IT infrastructure on which FINTRAC’s data resides.
  2. FINTRAC then submitted a formal request to SSC and, after a review of the FINTRAC documentation, SSC’s Chief Technology Office, Security Management, determined that the existing security assessment and FINTRAC authorizations were adequate.
  3. With respect to accreditation of the IT infrastructure over which FINTRAC data flows, we noted in our 2017 review that the accreditation letter for the Managed Secure File Transfer (MSFT) service had expired in February 2015. The MSFT is a Shared Services Canada solution that FINTRAC and other federal institutions use for transferring certain information between federal institutions. In response to our 2017 recommendation, FINTRAC took steps to have Shared Services Canada provide it with interim Authority to Operate (iATO) certificates for the MSFT.
  4. While this action addressed the specific recommendation made by the OPC at the time, we are concerned by the continued reliance on an ‘interim’ Authority to Operate for the MSFT, which has now been renewed for more than 5 years. The iATO for the MSFT that we reviewed during this exercise (refer to Annex A) indicates that SSC had assessed the level of risk associated with the MSFT as High, a level “for which the target level of acceptable residual risk is Low.”
  5. We are encouraged by the fact that this iATO was approved on the basis of seven conditions (steps to be taken to address security concerns), and that it mandates that the SSC MSFT service must report on the progress towards satisfying these conditions to the Assessment Authority, the Senior Assistant Deputy Minister of SSC.
  6. We shared our observations on this matter with SSC in the course of this review. SSC has since put in place a further interim ATO (Refer to Annex B). We strongly encourage SSC to complete the steps it has identified to address the risks and put in place a non-interim authority to operate for the MSFT on a priority basis.

Lack of Agreement between FINTRAC and SSC

  1. We are satisfied that FINTRAC adequately addressed this deficiency. Specifically, FINTRAC and SSC established a joint interdepartmental working group to clearly define their respective IT security roles and responsibilities, and relevant privacy and security clauses for the protection of FINTRAC’s information holdings. The “Collaborative Agreement for Partnership Roles and Responsibilities with respect to Privacy and Security between Financial Transactions and Reports Analysis Centre of Canada (FINTRAC) and Shared Services Canada (SSC)” was adopted in June 2019.
  2. The Collaborative Agreement notes that FINTRAC and SSC share the responsibility for the overall IT infrastructure ecosystem that FINTRAC relies upon in order to execute its legislative mandate. Our examination of the Collaborative Agreement, coupled with interviews of SSC and FINTRAC officials, determined that it formally documents the business arrangement between FINTRAC and SSC for IT security roles and responsibilities for their respective programs and services as they relate to privacy and data security.

Lack of regular monitoring of activity logs

  1. We are concerned that despite the passage of time, FINTRAC has not adequately addressed this deficiency. FINTRAC took an initial step of centralizing its logging system and has advanced plans for further monitoring. However, it still does not have tools or processes in place to actively monitor activity logs, including those related to access to personal information records and other IT events that could indicate security risks. This is despite the requirement in Appendix B of the TBS Directive on Security Management to analyze information system audit logs and records.Footnote 2

Concerns with respect to collection and retention of personal information that does not meet reporting thresholds

  1. In addition to examining issues related to the protection of FINTRAC’s holdings of personal information, our 2017 review (conducted under the authority of section 37 of the Privacy Act) also found that FINTRAC continued to receive and retain personal information that did not meet the reporting thresholds set out in the PCMLTFA. FINTRAC is obliged to destroy information that does not meet reporting thresholds when this determination is made in the normal course of its activities.
  2. However, such reports were being received by FINTRAC and retained in its databases, potentially for long periods of time. In February 2021, FINTRAC provided a status update on its commitment to implementing action plan items related to our recommendations. As described below, with the exception of front-end screening, FINTRAC’s reported activities did not include clear evidence of implementation of our recommendations.
  3. Front-end screening: We consider that FINTRAC implemented our recommendation that it continue its efforts to implement robust and comprehensive front-end screening of incoming submissions to ensure the records and reports it retains meet legislated reporting thresholds and do not contain unnecessary or excessive personal information (a simplified code sketch of such a threshold check follows this list). FINTRAC confirmed it continues to employ front-end screening. Though it did not identify any modifications made over the last three years to improve these screening measures, it indicated that it has made efforts to ensure new legislation and regulations will facilitate the adoption of more robust reporting systems. Changes to reporting systems will be implemented in conjunction with pending changes to reporting forms; for example, FINTRAC’s new Large Virtual Currency Transaction Report (LVCTR) forms have been developed with these considerations in mind. Further, FINTRAC notes that the majority of reports it has identified for destruction in the past three years resulted from errors by reporting entities that could not have been detected through front-end screening, as there were no indicators in the reports that would flag over-reporting.
  4. Manually review and delete Terrorist Property Reports (“TPR”) not meeting the reporting thresholds: To address concerns similar to those identified in our 2009 and 2013 reviews about FINTRAC holding reports received based on ‘possible’ matches to terrorist listings, in 2017 we recommended: “FINTRAC should manually review all TPRs, and immediately dispose of those which are identified as not meeting reporting thresholds.” At the time, FINTRAC accepted this recommendation and said: “FINTRAC will conduct a manual review of all Terrorist Property Reports in its data holdings. Should the Centre have, or receive, information from a reporting entity that has submitted a report or the appropriate authorities that a terrorist affiliation is no longer suspected to exist, FINTRAC will immediately segregate and dispose of that report.”
  5. FINTRAC confirmed that each of the 48 TPRs received over the last 4 years was manually reviewed, noting that none, in its view, met the threshold for destruction. However, FINTRAC did not address our recommendation, and its commitment, to review the reports it still holds that were received previously.
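
To illustrate the kind of front-end threshold check referred to in the front-end screening item above, the following is a minimal sketch in Python. It is an illustration only, not a description of FINTRAC’s systems: the report structure (a list of transactions with hypothetical “amount_cad” and “timestamp” fields) and the simplified rolling-window reading of the 24-hour rule are assumptions made for the sake of the example.

    # Illustrative sketch only; the report schema is hypothetical and
    # FINTRAC's actual forms and validation rules differ.
    from datetime import timedelta

    REPORTING_THRESHOLD_CAD = 10_000  # LCTR/EFTR reporting threshold

    def meets_threshold(transactions):
        """True if any single transaction meets the threshold, or if
        transactions falling within a rolling 24-hour window aggregate
        to it (a simplified reading of the 24-hour rule)."""
        txs = sorted(transactions, key=lambda t: t["timestamp"])
        for i, first in enumerate(txs):
            window_end = first["timestamp"] + timedelta(hours=24)
            total = sum(t["amount_cad"] for t in txs[i:]
                        if t["timestamp"] <= window_end)
            if total >= REPORTING_THRESHOLD_CAD:
                return True
        return False

    def screen_report(report):
        """Reject, at intake, a submission that falls below the legislated
        threshold, so that it never enters the Centre's holdings."""
        if meets_threshold(report["transactions"]):
            return "accept"
        return "reject: below threshold and outside the 24-hour rule"

A check of this kind can only catch over-reporting that is visible in the report itself, which is consistent with FINTRAC’s observation that reporting-entity errors without such indicators cannot be detected at the front end.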
Recommendation and FINTRAC Response
  1. We therefore recommended that FINTRAC carefully review our 2017 recommendation and take steps to clearly and fully implement it within 12 months. FINTRAC indicated that it “accepts” this recommendation. However, it went on to indicate that its planned actions in response to the recommendation are, on an ‘ongoing’ basis, to manually review all incoming TPRs upon receipt and delete those reports that do not meet the legislative requirement.
  2. This does not address the existence of previously received reports within FINTRAC’s holdings based on “possible matches” to terrorist listings – as described in our 2017 report. It is also at odds with FINTRAC’s express commitment, in 2017, that it would review all TPRs in its holdings and dispose of those where a terrorist affiliation is no longer suspected to exist. We are concerned by this lack of follow-through by FINTRAC on its express commitments. We do not see this as fulfilling or being responsive to our recommendation.
  3. Dispose of Electronic Funds Transfer Reports and Large Cash Transaction Reports below reporting threshold: To address concerns that, as of July 31, 2016, FINTRAC had received approximately 80,000 EFTRs and 150,000 LCTRs that were under the $10,000 reporting threshold and did not fall under the 24-hour rule, in 2017 we recommended: “FINTRAC should dispose of EFTRs and LCTRs that are below the $10,000 reporting threshold and that do not fall under the 24-hour rule, as the Centre has no legislative authority to retain this information.” (One form an automated sweep for such reports could take is sketched after this list.) At the time, FINTRAC accepted the recommendation and indicated it was in the process of updating its guidelines for reporting entities and would explore options to identify, segregate and destroy information that it should not have received.
  4. In reporting on progress against this recommendation, FINTRAC indicated that: (i) in 2020, it broadened its procedures for segregating reports received that fall outside the legislated threshold, to apply beyond its own compliance examinations (e.g. to self-declarations by reporting entities of over-reporting); (ii) it recently began an internal management review of quality assurance processes for inappropriate information received; and (iii) it has updated the terms of reference of its sub-committee responsible for ensuring that such reports are segregated for the purpose of destruction. It further reported that its policy is to segregate and dispose of reports uncovered during the normal course of business that are found not to meet a legal reporting threshold.
  5. However, despite having accepted our 2017 recommendation to dispose of the reports flagged by the OPC, it did not address the key question of whether such reports have been destroyed.
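
By way of illustration, the sketch below shows one form that an automated sweep of existing holdings, of the kind FINTRAC says it will continue to explore, could take. It reuses a threshold test like the one sketched earlier; the holdings structure, the “id” field and the quarantine step are hypothetical, and any real disposal process would require authorization before records are destroyed.

    # Illustrative sketch only; the holdings interface is hypothetical.
    import logging

    logger = logging.getLogger("disposal_sweep")

    def sweep_holdings(holdings, meets_threshold):
        """Identify and segregate reports below the reporting threshold
        and outside the 24-hour rule. Reports are quarantined for
        disposal review rather than deleted in place, so that the
        destruction itself is approved and logged."""
        quarantine = []
        for report in holdings:
            if not meets_threshold(report["transactions"]):
                quarantine.append(report)
                logger.info("report %s segregated for disposal review",
                            report["id"])
        return quarantine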
Recommendation and FINTRAC Response
  1. We therefore recommended that FINTRAC carefully review our 2017 recommendation and take steps to clearly and fully implement it within 12 months. FINTRAC indicated that it “accepts in principle” this recommendation, and that, as part of upcoming changes to its reporting systems, it will update various reporting forms to, among other things, minimize the receipt of reports that do not fall under the 24-hour rule. It provided no timeline for this activity, despite having indicated in 2017 that it was then “in the process” of updating its guidelines for reporting entities. Further, with respect to actually disposing of reports already received, FINTRAC indicated that: “while a more proactive manual approach is not feasible, FINTRAC will continue to explore the possibility of additional automated solutions to identify and segregate such reports.” It provided no time frame for this ‘exploration’, which it first undertook to do in 2017.
  2. These actions do not meaningfully address the existence of previously received reports within FINTRAC’s holdings that are outside its legislative authority – as described in our 2017 report. We are concerned by this lack of commitment to follow-through by FINTRAC on OPC’s recommendation. We do not see FINTRAC’s response as fulfilling or being responsive to our recommendation.
  3. Avoid collection of unnecessary personal information in the course of compliance exercises (and dispose of it where received): To address concerns we identified that FINTRAC’s compliance program (which ensures reporting entities meet their obligations under the PCMLTFA) was collecting and retaining unnecessary personal information, in 2017 we recommended: (i) that FINTRAC expand its outreach efforts to specifically address the issue of personal information unnecessarily provided by entities during compliance examinations; and (ii) that FINTRAC undertake an internal data minimization and purging exercise in order to dispose of personal information in its compliance examination files that is not needed to support deficiencies.
  4. At the time, FINTRAC accepted the recommendation and indicated: (i) that it was revising its notification letters accordingly and would look to expand its outreach efforts to address the issue; and (ii) that it was implementing a quality assurance process that would underpin a data minimization and destruction exercise aimed at disposing of information received and collected in the context of its compliance examinations that did not support identified deficiencies.
  5. With respect to (i), FINTRAC confirmed that its compliance notification letters now specify that reporting entities should provide the requested information only, and not further personal information of employees or clients, and that it provides related training to its compliance staff. FINTRAC also indicated it had expanded its outreach regarding personal information by developing two new staff training modules and a new policy regarding the handling of Reporting Entity Information. With respect to (ii), FINTRAC subsequently undertook three quality assurance reviews of its compliance activities, and an exercise to purge unnecessary personal information from examinations conducted on the banking sector in Toronto. It also noted that it continues to dispose of information collected during the examination process on an ongoing basis. However, FINTRAC provided no indication that unnecessary personal information has been purged from other pre-existing FINTRAC compliance files.
Recommendation and FINTRAC Response
  1. We therefore recommended that FINTRAC carefully review our 2017 recommendation and take steps to clearly and fully implement it within 12 months. FINTRAC indicated that it “accepts” this recommendation. However, it went on to indicate that its planned actions in response are: (i) to continue existing training and reminders to its compliance officers; (ii) to undertake a validation exercise of a select sample of examination files from the last two fiscal years (2019-20 and 2020-21) to assess adherence to existing policies; and (iii) to implement an ongoing quality assurance program to continually assess adherence to internal policies and procedures in relation to the purging of data from examination files that is not needed to support the citation of deficiencies.
  2. FINTRAC provided no timeline for completing these activities, and still did not address the original recommendation by OPC to dispose of personal information in its pre-existing compliance examination files (i.e. from 2017 and prior) that is not needed to support deficiencies. We are concerned by this lack of commitment to follow-through by FINTRAC on OPC’s recommendation. We do not see FINTRAC’s response as fulfilling our recommendation.

Observations and Recommendations on Security Safeguards in the Current Review

  1. The Proceeds of Crime (Money Laundering) and Terrorist Financing Act (“PCMLTFA”) requires FINTRAC to ensure that appropriate safeguards are in place to protect the personal information it acquires and retains. TBS’s Policy on Government SecurityFootnote 3 (the Policy) prescribes safeguards to protect and preserve the confidentiality and integrity of government assets, including information, and establishes minimum mandatory security requirements.
  2. The Policy is given effect by standards and directives, which must be followed by government institutions. TBS’s Directive on Security ManagementFootnote 4 establishes guidance to achieve efficient, effective and accountable management of security within departments and agencies. The Directive includes mandatory controls for:
    • Security screening
    • Information Technology security
    • Security in contracts and other arrangements
    • Physical security
    • Business continuity management
    • Information management security
    • Security event management
    • Security awareness and training
  3. Accordingly, we expected to find that the safeguards FINTRAC and SSC have in place to protect personal information comply with the PCMLTFA, the Policy on Government Security and the Directive on Security Management, thereby mitigating the risk of breaches and inappropriate disclosures.
  4. With respect to the mandatory controls listed above, our review of key documents and interviews found no indications of inadequacies, with the exception of the two deficiencies described in more detail below.
  5. We note some areas of strength:
  6. FINTRAC has established internal security principles to guide its security-related activities, as follows (a short illustration in code of the first and last of these principles follows this list):
    • Need-to-Know Principle, wherein persons will only be provided access to sensitive information required to conduct their duties;
    • Access Control Criteria, which determine which persons have a need-to-know and the required security screening and authorization before being granted access to sensitive information;
    • Security assessment and change management for approved configurations, such as a new building set-up or an IT system design, wherein changes are evaluated to assess threats, vulnerabilities and risks so that senior management can make informed decisions on assuming the residual risk; and,
    • Separation of Duties, wherein no one person will have complete and independent control over any security function, IT system or critical business process.
  7. Our review also found that FINTRAC’s security control framework document includes continuous threat and risk management to protect its employees, information and assets; rigorous personnel security screening; the establishment of progressively restrictive security zones; electronic access control devices; identification cards; and mandatory security awareness training. We found no indication of inadequacies in these areas.
  8. We also found that, as part of its continuous monitoring regime, FINTRAC has a Vulnerability Assessment and Management Program (VAMP). The program assesses environments and infrastructure owned or used by FINTRAC, including network, server, workstation, and application infrastructure software and hardware. We found no indications of inadequacies in the VAMP framework.
  9. Similarly, we found no indications of deficiencies with respect to FINTRAC’s network security configurations. The institution uses network zoning and has separate networks for its Corporate, Analytics and Classified systems. Our discussions and review of related documents showed that FINTRAC’s personal information holdings on its Internet-connected networks are protected through the consolidation, monitoring and defence of Internet gateways; timely patch management; minimization of administrative privileges; hardening of operating systems and applications; and segmentation and separation of information holdings.
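
Returning to the internal security principles listed above, the short sketch below illustrates how the need-to-know and separation-of-duties principles can be expressed as concrete access checks. The numeric clearance levels, case assignments and field names are assumptions made for illustration, not a description of FINTRAC’s actual systems.

    # Illustrative sketch only; clearance levels and case assignments
    # are hypothetical.
    def may_access(user, record, assignments):
        """Need-to-know: access requires both sufficient clearance and an
        assignment to the case the record belongs to; clearance alone is
        not enough."""
        cleared = user["clearance_level"] >= record["classification_level"]
        assigned = record["case_id"] in assignments.get(user["id"], set())
        return cleared and assigned

    def approve_disposal(requester_id, approver_id):
        """Separation of duties: no one person may both request and
        approve a sensitive action such as the destruction of a record."""
        if requester_id == approver_id:
            raise PermissionError("requester cannot approve their own request")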

Areas of Deficiency

Information System Audit Management

  1. The TBS Directive on Security Management, Appendix B: Mandatory Procedures for Information Technology Security Control includes requirements for institutions relating to monitoring of the threat environment to ensure appropriate security measures are maintained. This includes, specifically, efforts to “create, protect and retain information system audit logs and records to enable monitoring, reporting, analysis, investigation and implementation of corrective actions, as required, for each system, in accordance with departmental practices.”Footnote 5
  2. Based on interviews and briefings from FINTRAC officials, the review team found that activity logs from a variety of systems are regularly inspected for indications of inappropriate access, use or disclosure and that a centralized log management solution has enhanced this capability.
  3. In addition, SSC and the Canadian Centre for Cyber Security monitor FINTRAC’s network perimeter and traffic flow. Reports of anomalies at the perimeter, which might indicate network attack activity, are communicated for FINTRAC personnel to investigate and review. Other monitoring activities include automated alerts and manual monitoring for threat hunting or virus alerts.
  4. However, we found these measures to be generally concerned with the protection of assets from external threats; the current monitoring of authorized users does not adequately protect personal information from insider threats. The TBS Directive on Security Management broadly requires the monitoring of threats and vulnerabilities; it is not limited to external threats.
  5. As noted in paragraph 23 above, since 2017, when we first raised this issue, FINTRAC has taken an initial step of centralizing its logging system and has advanced plans for further monitoring, such as the recent procurement of a software tool. However, we are concerned that despite the recommendation in 2017 to ensure that the activity logs of all IT systems are regularly monitored for indications of inappropriate access, use, or disclosure, FINTRAC has not yet taken adequate steps to do so with respect to internal users (one simple form such internal monitoring could take is sketched below).
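
By way of illustration only, the sketch below shows one simple detection rule that internal monitoring of activity logs could apply: flagging users whose daily volume of record accesses departs sharply from their own historical baseline. The log event fields are hypothetical, and a real monitoring programme would combine several such rules (after-hours access, access outside assigned files, bulk exports) with analyst review.

    # Illustrative sketch only; the log schema is hypothetical.
    from collections import defaultdict
    from statistics import mean, stdev

    def flag_anomalous_users(access_events, sigmas=3.0):
        """Flag users whose record-access count on any one day exceeds
        their own mean daily count by more than `sigmas` standard
        deviations."""
        daily = defaultdict(lambda: defaultdict(int))  # user -> date -> count
        for event in access_events:
            daily[event["user"]][event["timestamp"].date()] += 1
        flagged = set()
        for user, counts in daily.items():
            values = list(counts.values())
            if len(values) < 2:
                continue  # too little history to establish a baseline
            mu, sigma = mean(values), stdev(values)
            if sigma > 0 and max(values) > mu + sigmas * sigma:
                flagged.add(user)
        return flagged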
Recommendation and FINTRAC Response
  1. We therefore recommended that FINTRAC should, within 6 months, implement comprehensive measures to enable user activities to be authoritatively audited, to monitor the acceptable use of government information systems, and to act on potential instances of unacceptable use to ensure users are accountable for their activities.
  2. FINTRAC indicated that it “accepts” this recommendation. However, it went on to indicate that it would not commit to implementing the recommended measures (first raised by the OPC in 2017) within 6 months. Rather, it stated that it has recently enhanced its insider threat program (details not provided) and is moving forward with an initiative to increase its ability to detect and respond to inappropriate or malicious internal activity. Technology for this initiative has been identified and procured, and implementation is planned for the summer of 2022. It noted that “timelines need to take into consideration that FINTRAC is a small organization and implementing new initiatives take time.”
  3. Given the number of years since FINTRAC first accepted this recommendation, and the length of time FINTRAC has already been working on the planning phase of these enhancements, we are concerned by the unclear timeline and level of priority being given by FINTRAC to this matter.

Business Continuity Management – Personal Information Protection

  1. Personal information holdings are an asset that is critical for FINTRAC in carrying out its operational activities. There is a need to ensure continued protection of departmental assets in case of a business disruption. Moreover, ensuring that personal information is not disclosed without authorization is a legal obligation under the Privacy Act.
  2. The protection of Canadians’ personal information held by government institutions is therefore critical to the safe functioning of an effective government. Business continuity management in the federal public sector aims to support the continued availability of services and associated assets that are critical to the safety and security of Canadians or to the effective functioning of the government.Footnote 6
  3. In this review, we therefore examined FINTRAC’s business continuity plan (BCP). We observed that the plan consisted mainly of evacuation procedures for staff. Measures to ensure continued protection of personal information holdings were not a component of the BCP.
Recommendation and FINTRAC Response
  1. We therefore recommended that FINTRAC should, within 6 months, update its BCP to include measures to address the protection of its personal information holdings. FINTRAC accepted this recommendation, indicating that it has already taken steps to update its BCP template with the requirements of the TBS Directive on Security Management. Its Business Continuity Planning unit will work with each sector to ensure that the personal information holdings described in FINTRAC’s Disaster Recovery Plan are incorporated into their BCPs within 6 months.

Conclusions

  1. As noted above, our review identified two protection-related deficiencies, one of which, we are concerned to note, was not adequately remediated despite being identified by OPC in 2017. Similarly, we highlighted outstanding concerns relating to recommendations we made in 2017 with respect to collection and retention that have not been adequately implemented.
  2. We are concerned with FINTRAC’s lack of complete implementation of OPC’s recommendations from our 2017 report. This is compounded by FINTRAC’s unwillingness to commit to completing those recommendations, or to commit to implementing our recommendation on monitoring measures in a timely way. We call on FINTRAC to reconsider its position.

Annex A – MSFT Interim Authority to Operate (Expiry date December 2020)

Interim Authorization of the
Shared Services Canada (SSC)
Managed Secure File Transfer Service

As the Authorizing Official (AO), I hereby accept the security assessment of the SSC Managed Secure File Transfer (MSFT) service at a High level of assessed risk, for which the target level of acceptable residual risk is Low. I grant an interim Authority to Operate (iATO), with an expiry date of December 31, 2020, to process information up to and including PROTECTED B service delivery information with Medium integrity, and Medium availability commencing the day of the Authorizing Official’s approval with the following conditions:

  1. Produce comprehensive system and security documentation, including Build Books, Detailed Design Specification, System Security Plan and a Business Continuity Plan, relating to deploying and maintaining MSFT infrastructure.
  2. Train an alternate technical resource with the required skillset to assist/backup the prime technical resource.
  3. Leverage SSC’s Change Management practices (including a Configuration Management Plan) in accordance with SSC policies and procedures, whereby baseline configurations are documented and all changes including software and security updates are documented and vetted within a team.
  4. Implement a continuous security monitoring strategy and program whereby system security and audit logs are monitored and reviewed on an ongoing basis.
  5. Implement a vulnerability management program wherein vulnerability assessments are performed with security patches and mitigation applied in a timely manner.
  6. Leverage SSC’s operational audit review practice that includes regular review and analysis of audit processing failures, hardware/software failures and capacity management records. Establish real-time alerting sent to specified personnel, and to a separate and distinct environment for audit record storage accessible only by privileged users.
  7. Integrate this service into the SSC Service Authorization process to ensure that the service is listed in SCC’s Service Inventory and appropriately governed through its lifecycle.

The service must report on the progress towards satisfying these conditions, to the Assessment Authority through activities identified in a detailed Plan of Action and Milestones (PoAM). This will be reported on in the SSC ATO dashboard.

Any partner application processing information must be protected by safeguards commensurate to the appropriate categorization level, as assessed and implemented by the partner in consultation with SSC, before being transmitted and/or stored by SSC infrastructure services.

Throughout the iATO period, the service must also meet the following general terms:

  1. Conduct on-going periodic vulnerability scanning;
  2. Seek new authorization for the service:
    1. For each new service release or change that may cause the service to exceed the current level of acceptable risk;
    2. If new evidence is introduced that may invalidate the current risk calculations; or
    3. When requested by Assessment, Authorization, or Compliance Audit authorities.

Failure to meet the conditions/general terms of this iATO may result in actions by the Authorizer up to and including deactivation of the service. The Authorizer is the final arbiter on the meaning of iATO conditions/general terms statements.

 

Original signed by

Greg McKay
Authorizing Official
Senior Assistant Deputy Minister
Shared Services Canada

2020-06-04

Date

Original signed by

Rajagopalan Thuppal
Senior Assistant Deputy Minister
Networks, Security and Digital Services
Shared Services Canada

2020-04-24

Date

Original signed by

Dinesh Mohan
A/Senior Assistant Deputy Minister
Chief Technology Officer Branch
Shared Services Canada

2020-04-14

Date

Annex B – MSFT Interim Authority to Operate (Expiry date March 2022)

Interim Authorization of the
Shared Services Canada (SSC)
Managed Secure File Transfer

As the Authorizing Official (AO), I hereby accept the security assessment of the SSC Managed Secure File Transfer (MSFT) at a Medium level of assessed risk, for which the target level of acceptable residual risk is Low. I grant an interim Authority to Operate (iATO), with an expiry date of March 31, 2022, to process information up to and including Protected B service delivery information with Medium integrity, and Medium availability commencing the day of approval with the following conditions:

  1. SSC service owner must conduct a Business Impact Analysis (BIA) to definitively establish system/service criticality to MSFT’s clients and stakeholders.
  2. SSC service owner to align with the SSC Patch Management Security Standard and to work with the vulnerability scanning service owner to institute regularly scheduled VA Scans.
  3. SSC service owner must integrate this service into the SSC Service Authorization process to ensure that the service is listed in SSC’s inventory and appropriately governed through its lifecycle.

The service must report to the Assessment Authority on the progress towards satisfying these conditions through activities identified in the Plan of Action and Milestones (PoAM). This will be reported on in the SSC ATO dashboard.

Any partner Information must be protected by safeguards commensurate to the appropriate level, as assessed and implemented by the partner in consultation with SSC, before being transmitted, processed and/or stored.

Throughout the authorization period, the service must also meet the following general terms:

  1. Seek new or updated authorization for the service:
    1. For each new service release or change that may cause the service to exceed the current level of acceptable risk;
    2. If new evidence is introduced that may invalidate the current risk calculations; or
    3. When requested by Assessment, Authorization, or Compliance Audit authorities.

Failure to meet the conditions / general terms of this iATO may result in actions by the Authorizer up to and including deactivation of the service. The Authorizer is the final arbiter on the meaning of iATO conditions / general terms statements.

 

Original signed by

Greg McKay
Authorizing Official
Senior Assistant Deputy Minister
Shared Services Canada

2021-06-25

Date

Original signed by

Jacquie Manchevsky
Assistant Deputy Minister
Data Centre Services
Shared Services Canada

2021-06-14

Date

Original signed by

Matt Davies
Deputy Chief Technology Officer
Senior Assistant Deputy Minister
Shared Services Canada

2021-06-14

Date
