Privacy in a pandemic

2019-2020 Annual Report to Parliament on the Privacy Act and Personal Information Protection and Electronic Documents Act

Office of the Privacy Commissioner of Canada
30 Victoria Street
Gatineau, Quebec K1A 1H3


Commissioner’s message

Photo of Daniel Therrien

The need for federal privacy laws better suited to protecting Canadians in the digital age has been a common thread in our annual reports to Parliament for many years.

Last year in this space, I noted how major investigations into Statistics Canada, Facebook and Equifax had all revealed serious weaknesses within the current legislation.

This year, the COVID-19 pandemic makes the significant gaps in our legislative framework all the more striking.

The pandemic has raised numerous issues for the protection of personal information. Around the world, there have been heated debates about contact tracing applications and their impact on privacy. Many of us have been asked to submit to health monitoring measures at the airport, or before we enter workspaces, restaurants and stores.

More broadly, the pandemic has accelerated the digital revolution – bringing both benefits and risks for privacy.

The need for social distancing has meant that even more of daily life takes place via the use of technology. Instead of meeting in person, we have further shifted to working, socializing, going to school and seeing the doctor remotely, through videoconferencing services and online platforms.

Technologies have been very useful in halting the spread of COVID-19 by allowing essential activities to continue safely. They can and do serve the public good.

At the same time, however, they raise new privacy risks. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.

As the pandemic speeds up digitization, basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.

We see, for instance, that the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it mandate app developers to consider Privacy by Design, or the principles of necessity and proportionality.

The law is simply not up to protecting our rights in a digital environment. Risks to privacy and other rights are heightened by the fact that the pandemic is fueling rapid societal and economic transformation in a context where our laws fail to provide Canadians with effective protection.

In our previous annual report, we shared our vision of how best to protect the privacy rights of Canadians and called on parliamentarians to adopt rights-based privacy laws.

We noted that privacy is a fundamental human right (the freedom to live and develop free from surveillance). It is also a precondition for exercising other human rights, such as equality rights in an age when machines and algorithms make decisions about us, and democratic rights when technologies can thwart democratic processes.

Regulating privacy is essential not only to support electronic commerce and digital services; it is a matter of justice.

When the COVID-19 pandemic emerged in Canada, some felt privacy should be set aside because protecting the lives of Canadians was more important. Given the gravity and the immediacy of the situation, that reaction was not surprising. However, it was based on a false assumption: either we protect public health, or we protect privacy.

When I appeared before the House of Commons Standing Committee on Industry, Science and Technology in May 2020, I indicated that, when properly designed, tracing applications could achieve both public health objectives and the protection of rights simultaneously.

Technology itself is neither good nor bad. Everything depends on how it is designed, used and regulated.

Technology can be used to protect both public health and privacy. I firmly believe that privacy and innovation brought about by new technology are not conflicting values and can coexist. We have seen some good design choices in developing certain public health measures – notably with respect to the federal government’s COVID Alert application.

That being said, the fact remains that our existing legislative framework for privacy is outdated and does not sufficiently deal with the digital environment to ensure appropriate regulation of new technologies.

A recovery based on innovation will only be sustainable if it adequately protects the interests and rights of all citizens. These can, and should, be reflected in our laws.

We need a legal framework that will allow technologies to produce benefits in the public interest while also preserving our fundamental right to privacy. This is an opportune moment to demonstrate to Canadians that they can have both.

OPC response to the pandemic

Throughout the pandemic, my office has recognized that the current health crisis calls for a flexible and contextual application of privacy laws. However, because privacy is a fundamental right, it is very important in our democratic country based on the rule of law that key principles continue to operate, even if some of the more detailed requirements are not applied as strictly as they normally would be.

With a view to both achieving greater flexibility and ensuring respect for privacy as a fundamental right, in April we released a framework to assess privacy-impactful initiatives in response to the pandemic.

We also published guidance to help organizations subject to federal privacy laws understand their privacy-related obligations during the pandemic.

In May, we issued a joint statement with our provincial and territorial counterparts, outlining key privacy principles to consider as contact tracing and similar digital applications are developed.

Privacy guardians from across the country felt it was important to issue a common statement because these applications raise important privacy risks.

But the fact that such a statement was necessary is an unfortunate reminder that some of Canada’s privacy laws – certainly its federal laws – do not provide a level of protection suited to the digital environment.

Respect for privacy rights should not be a suggested best practice left to the goodwill of government officials or big tech. It should be a clearly codified and enforceable requirement.

In a joint resolution last fall, our provincial and territorial counterparts also joined us in calling for effective privacy legislation in a data-driven society.

Our response to the pandemic, and how this enormous public health challenge demonstrates the importance of rights-based law reform, are described in more detail in the next section of this annual report, Privacy in a pandemic.

Conclusion

Incorporating good privacy design into COVID-related initiatives will help to build public trust in public health measures, in government and in the digital tools that have become so important to day-to-day life.

The choices our government makes about protecting public health and fundamental values such as the right to privacy will have long-term impacts for all Canadians.

Similarly, the path that the government ultimately chooses to take when it comes to legislative reform will have a significant effect on future generations.

You will read in later chapters of this report about other work done by the Office of the Privacy Commissioner (OPC) during the past year. You will learn of a variety of investigations, of our success in reducing the number of older investigation files, and of the advice we provided to government and businesses to ensure privacy and public health are protected simultaneously. These activities continued as we were faced with unprecedented operational challenges.

On a closing note, as I look back at the past year, I feel fortunate to have talented, dedicated people working alongside me. When the pandemic emerged, the team supported public and private sector organizations. They answered questions from the public and the media on privacy and the pandemic, and helped me respond to requests from parliamentarians related to the health crisis – all while still taking complaints, reviewing breach reports, investigating potential violations of the law, and keeping our IT system running smoothly so as to be able to offer services to Canadians.

They have accomplished all this as they themselves were transitioning to telework and rearranging their lives to comply with public health guidelines. Canadians are very fortunate to have people with such passion and knowledge working to protect their privacy at all times.

Privacy by the numbers

Privacy Act complaints accepted 761
Privacy Act complaints closed through early resolution 338
Privacy Act complaints closed through standard investigation 997
Well-founded complaints under the Privacy Act 82%
Data breach reports received under the Privacy Act 341
Privacy Impact Assessments (PIAs) received 78
Advisory consultations with government departments 66
Advice provided to public-sector organizations following PIA review or consultation 121
Public interest disclosures by federal organizations 611
PIPEDA complaints accepted 289
PIPEDA complaints closed through early resolution 221
PIPEDA complaints closed through standard investigation 97
Well-founded complaints under PIPEDA 78%
Data breach reports received under PIPEDA 678
Advisory engagements with private-sector organizations 19
Bills, legislation and parliamentary studies reviewed for privacy implications 29
Parliamentary committee appearances on private- and public-sector matters 8
Information requests 11,450
Speeches and presentations 74
Visits to website 2,813,127
Blog visits 95,190
Tweets sent 1,054
Twitter followers 17,092
Publications distributed 43,746
News releases and announcements 50

Privacy in a pandemic

Emergency response highlights the importance of protecting rights and ensuring trust

The COVID-19 pandemic has caused tragedy and disruption world-wide. Efforts to contain the virus and cope with its social and economic fallout have prompted abrupt and colossal change.

Technology is playing a central role as the world looks to halt the spread of COVID-19 and adapt regular activities to the need for social distancing. Public health and government officials have turned to digital tools such as contact tracing applications as part of the answer to combatting the deadly virus.

Meanwhile, the pandemic has dramatically accelerated the already rapid adoption of disruptive technologies into our day-to-day lives. More of our interactions take place online – be it for socializing with friends and family, doing business or going to school. By necessity, telework, e-learning, and telemedicine are suddenly far more prevalent.

Videoconferencing and other online services have been helpful in allowing some semblance of normal life to continue in the wake of the pandemic. At the same time, however, they are creating important new risks to our privacy rights.

This is particularly concerning given our current privacy laws do not provide an effective level of protection suited to the digital environment.

In our previous annual report, we urged Parliament to adopt rights-based privacy laws that would better protect Canadians in the face of data-driven technologies.

We noted once again that dated federal privacy laws designed for different times hamper our work and are no longer up to the task of ensuring respect for the privacy rights of Canadians.

With the pandemic accelerating the digitization of just about every aspect of our lives, the future we have long been urging the government to prepare for has arrived in a sudden, dramatic fashion. This rapid societal transformation is taking place without the proper legislative framework to guide decisions and protect fundamental rights.

While we look forward to more in-person activities once the threat of the pandemic eases, the so-called “new normal” will likely include more digital interactions and more unwanted – and often undetectable – intrusions into our privacy.

We are collectively encountering uncertain and extraordinary circumstances during the pandemic. These circumstances reveal just how important it is to have laws anchored on a human rights foundation to guide decision-making. Entrenching privacy in its proper human rights context remains not just relevant, but more necessary than ever.

This chapter addresses how the pandemic reinforces the need for laws that protect rights and reflect fundamental Canadian values. It also discusses our office’s approach to privacy issues raised by the pandemic. This includes our work with government institutions and commercial organizations to ensure important privacy principles are integrated in the design of COVID-19-related initiatives.

Finally, it provides an update on developments related to law reform since we published our blueprint for legislative modernization in our previous annual report.

Response to the fallout of COVID-19

Throughout the pandemic, our office has recognized that the current health crisis calls for a flexible and contextual application of privacy laws. However, because privacy is a fundamental right, it is very important in our democratic country based on the rule of law that key principles continue to operate, even if some of the more detailed requirements are not applied as strictly as they normally would be.

With a view to both achieving greater flexibility and ensuring respect for privacy as a fundamental right, in April the OPC released a framework to assess privacy-impactful initiatives in response to the pandemic.

Following this, in May we issued a joint statement with provincial and territorial privacy commissioners on privacy principles that should be respected in the design and during the use of any contact tracing or similar application.

Both these documents were meant to offer clear guidance on how to incorporate privacy into the design of government programs to address the pandemic, in recognition of the fact that our laws do not provide an effective level of protection suited to the digital environment.

Some of the principles put forth in our guidance documents are not legal requirements in our current privacy laws, yet are considered internationally to be fundamental privacy protective measures.

The framework sets out the most relevant privacy principles in the context of the pandemic, without abandoning others. Some of the key principles we noted were:

  • legal authority: the proposed measures must have a clear legal basis;
  • necessity and proportionality: measures must be science-based and necessary to achieve a specific identified purpose;
  • purpose limitation: personal information must be used to protect public health and for no other purpose;
  • use de-identified or aggregate data whenever possible;
  • exceptional measures should be time-limited and data collected during this period should be destroyed when the crisis ends; and
  • transparency and accountability: government should be clear about the basis and the terms applicable to exceptional measures, and be accountable for them.
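The de-identification and aggregation principle above can be illustrated with a minimal sketch. The data, field names and function below are hypothetical, chosen only to show the idea: individual-level records are reduced to regional counts before any sharing, so no identifiers leave the data holder.

```python
from collections import Counter

# Hypothetical individual-level records (name, health region, test result).
# Sharing these directly would expose personal information.
records = [
    {"name": "A. Tremblay", "region": "Outaouais", "positive": True},
    {"name": "B. Singh", "region": "Outaouais", "positive": False},
    {"name": "C. Lee", "region": "Ottawa", "positive": True},
    {"name": "D. Roy", "region": "Ottawa", "positive": True},
]

def aggregate_by_region(records):
    """Return only per-region positive counts, dropping all identifiers."""
    counts = Counter()
    for record in records:
        if record["positive"]:
            counts[record["region"]] += 1
    return dict(counts)

print(aggregate_by_region(records))  # {'Outaouais': 1, 'Ottawa': 2}
```

Only the aggregate counts are released; names and any other direct identifiers never leave the original dataset.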

Similar to the OPC framework, the joint statement called on governments looking to adopt tracing applications to respect a number of key principles. For example, we said that the use of apps must be voluntary and that their use must be necessary and proportionate, which requires evidence of their likely effectiveness.

Advisory work on COVID-19 initiatives

During the pandemic, a number of government and non-government organizations consulted our office on various initiatives related to the pandemic.

These initiatives included a contact tracing app developed by a non-governmental organization, the federal government’s COVID Alert application, as well as a retailer’s proposal to introduce temperature checks at the front door of its stores.

In the weeks preceding the launch of the COVID Alert exposure notification app, our office and the Office of the Information and Privacy Commissioner of Ontario (IPC) engaged in productive and in-depth discussions with the federal and Ontario governments.

We provided recommendations to our respective governments based on the key privacy principles outlined in our joint federal, provincial and territorial statement on tracing applications. Because the app was being positioned as a national initiative, our office and the IPC also consulted other provincial and territorial privacy commissioners.

Our office and the IPC supported the use of the COVID Alert app by individuals based in part on an understanding that using the app would be voluntary. However, while the use of the app may be voluntary as it relates to the federal and Ontario governments, there is still a risk that third parties may seek to compel app users to disclose information as to their use of the app, including any exposure notifications.

We also supported the use of the COVID Alert app on the basis that it be effective. The governments sufficiently demonstrated that the application, although new and untested, was likely to be effective in reducing the spread of COVID-19, as part of a broader set of measures that includes manual contact tracing. However, because the effectiveness was uncertain, we recommended that the implementation of the app be closely monitored and that the app be decommissioned if new evidence indicated it was not effective in achieving its intended purpose.

Independent oversight will be important to foster public trust. The federal government agreed to involve our office in an audit of the app after it is up and running. The audit will include ongoing analysis of the necessity and proportionality of the app, including its effectiveness, and an assessment of respect for the federal, provincial and territorial joint statement principles in the design and implementation of the app.
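COVID Alert is built on the Google/Apple Exposure Notification framework, whose privacy properties rest on exchanging rotating random identifiers rather than collecting personal information. The sketch below is a much-simplified illustration of that idea, not the actual protocol (which derives rotating identifiers cryptographically from daily keys); all class and method names here are invented for the example.

```python
import secrets

class Device:
    """Simplified phone: broadcasts rotating random tokens, records tokens it hears."""
    def __init__(self):
        self.my_tokens = set()      # tokens this device has broadcast
        self.heard_tokens = set()   # tokens received from nearby devices

    def broadcast(self):
        # A fresh random token; the real framework rotates identifiers
        # frequently, so no identity or location information is transmitted.
        token = secrets.token_hex(16)
        self.my_tokens.add(token)
        return token

    def hear(self, token):
        self.heard_tokens.add(token)

    def check_exposure(self, published_tokens):
        # Matching happens on the device itself; a server only relays tokens
        # voluntarily uploaded by users who test positive.
        return bool(self.heard_tokens & published_tokens)

# Two phones near each other exchange tokens over Bluetooth (simulated here).
alice, bob = Device(), Device()
bob.hear(alice.broadcast())

# Alice tests positive and uploads her tokens; Bob's phone finds a match locally.
print(bob.check_exposure(alice.my_tokens))  # True
```

Because matching is done on each device against anonymous tokens, neither the server nor other users learn who was exposed, which is the design property underpinning the government's view that the app collects no personal information.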

In addition to the work on the COVID Alert application, we were consulted by a number of public sector institutions on initiatives developed in response to the COVID-19 pandemic. This work is ongoing at the time of writing.

This has included providing advice on new activities such as social benefit programs, the federal government’s COVID-19 mandatory isolation order for people entering Canada, and temperature screening of passengers on flights in and out of Canada.

One of the social benefit programs we examined in our work was a one-time $600 payment in recognition of the extraordinary expenses faced by persons with disabilities during the COVID-19 pandemic. Because the initiative is jointly administered by the Canada Revenue Agency, Veterans Affairs Canada and Employment and Social Development Canada, we suggested that the letters of agreement between the departments include clear provisions on purpose limitations and retention.

In reviewing measures to support the mandatory isolation order, which requires individuals entering Canada to isolate for 14 days, we recommended that institutions retain personal information only for as long as is necessary. We also recommended institutions limit the use of personal information to the purpose for which it was collected, namely public health follow-up and compliance verification.

At the time of writing we had yet to complete our review of the temperature screening program in Canadian airports. However, we had provided recommendations to the Canadian Air Transport Security Authority (CATSA) on transparency and on the collection, use, disclosure and retention of personal information.

We also advised Transport Canada and CATSA to keep informed of public health guidance that may be released on temperature screening, and to consider this guidance when assessing the effectiveness of the initiative.

On the private-sector side, a national retailer contacted our office to seek advice as it was considering screening the temperature of customers entering its stores to help mitigate the risk of spreading COVID-19. In particular, the organization indicated that one possible outcome of the temperature screening program it was contemplating was a requirement for individuals to wear masks in its stores.

We were encouraged to see that the retailer’s considerations included a number of key privacy protective measures, such as obtaining express consent from customers before performing a temperature check, and not recording or retaining any personal information obtained through thermal cameras.

We recommended the organization consider the impact of new requirements to wear masks or other face coverings on its stated needs for the program and, overall, whether it could achieve its goals through less privacy invasive means. We provided a number of recommendations to the retailer should they decide to proceed with temperature screening, including that they only use the thermal cameras in “live feed” mode so as to not retain any personal information, ensure they obtain meaningful consent from customers, and that the program be regularly re-evaluated to ensure alignment with any new guidance released by relevant public health authorities.

COVID-19 and law reform objectives

The pandemic has accelerated the digitization of our lives, moving more of our activities online in an effort to remain safe.

While technology offers tremendous benefits, it also raises risks that are not properly mitigated under the current legislative framework.

We have welcomed efforts by certain organizations in both the private and public sectors to design their initiatives in a privacy protective way.

The federal government’s COVID Alert application is a case in point. When the government first consulted us, the design of the app was good, but it did not meet all key privacy principles outlined in our framework. After productive discussions, the app ultimately did comply with these principles. This example shows that when government institutions or companies want to adopt Privacy by Design, they can.

However, the positive examples we see do not take away from the urgent need for law reform. While some organizations try to take the right steps to protect privacy, others do not.

Even in the largely positive story of the COVID Alert app, discussions with the federal government highlighted broader concerns related to law reform.

Despite the fact that such applications are extremely privacy sensitive and the subject of global concern for the future of democratic values, the government asserted that the federal Privacy Act does not apply to the initiative, based on its view that the COVID Alert app does not collect personal information.

And while the app is voluntary as it relates to the government, there is still a possibility that third parties may seek to compel app users to disclose information as to their use of the app, including any exposure notifications. The governments undertook to communicate publicly that individuals should not be required to use the app or to disclose information about their use of the app.

This step will mitigate, but not eliminate, the risk to the voluntary nature of the app. Other countries have legislated to ensure similar apps are completely voluntary. This again highlights an issue that should be examined as part of legislative modernization.

Other advisory work during the pandemic highlighted the need to examine issues related to public-private partnerships.

We noted that several COVID-19 related initiatives involved the collection of personal information by commercial entities, whether through public-private partnerships or other forms of reliance on the private sector.

We saw an instance where agreements related to a product allowed a company to use personal information beyond the purposes specified for its collection. These agreements, where the terms were difficult for individuals to understand, were used as a basis for consent. Following discussions with the government department, the privacy notice upon which user consent is obtained was improved.

At the outset of the government's response to the pandemic, the Treasury Board of Canada Secretariat (TBS) introduced interim measures to relax the privacy impact assessment (PIA) process for initiatives adopted by federal institutions in response to the COVID-19 pandemic.

In our office's review of these measures, we observed that a number of COVID-19-related initiatives involved partnerships with the private sector, but that in cases where the legal authority for an initiative was based on consent obtained by a private-sector organization, there was no policy requirement for government institutions to ensure that this consent was meaningfully obtained.

In response, the TBS asserted that it could not add such a requirement because it was limited to the existing legal framework. According to the TBS, compelling departments to ensure their private-sector partners had obtained meaningful consent would require legislative amendments.

As a result, it remains the case that a public sector institution could deploy a technological solution to the pandemic that allows its private-sector partner to use the personal information collected for commercial purposes unrelated to public health.

This lack of clarity around data collected for a public purpose by a private entity could potentially result in a company releasing an application and using the information for commercial purposes, provided consent is obtained, even if it is done in incomprehensible terms.

Since the beginning of the pandemic, we have also witnessed an increased use of virtual medicine and e-learning platforms, which has been extremely helpful in allowing education and medical consultations to continue, but comes with new sets of risks.

Access to the doctor’s office has been severely limited in the short term. With virtual medicine, there is a risk of losing doctor-patient confidentiality by virtue of private-sector service providers potentially having access to information from medical visits.

Similarly, many students have been required to use e-learning platforms and videoconferencing during the pandemic, which can result in commercial organizations having access to information related to learning difficulties or other behavioural data of students.

In fact, during the pandemic, we have seen accelerated uses of videoconferencing tools, including for the virtual health and learning purposes identified above. Given the sensitive information and vulnerable populations involved, our office joined forces with global counterparts to publish an open letter to video teleconferencing companies reminding them of their obligations to comply with privacy laws and handle people’s personal information responsibly.

We need laws that set explicit limits on permissible uses of data, rather than being left to rely on the goodwill of companies to act responsibly.

The growing role of public-private partnerships creates additional complexity and risk. At a minimum, we need common privacy principles enshrined in our public- and private-sector laws.

We urgently need rights-based privacy laws that allow technologies to produce benefits in the public interest and ensure privacy rights are protected.

As a former advisor to President Obama recently wrote, no one would think that freedom of assembly is at risk, despite the temporary limitations imposed by the current health crisis. That is because freedom of assembly (in Canada, freedom of association) is constitutionally protected.

There is no such certainty for rights in the digital sphere. Privacy is at risk, and modern laws are required to appropriately protect it as a fundamental right.

Even prior to the pandemic, trends such as increased reliance on technology and on public-private partnerships had reached a tipping point where privacy and democratic rights were strained and reform was overdue.

Our previous annual report discussed how the Facebook-Cambridge Analytica scandal, as well as Facebook’s refusal to address privacy deficiencies, spotlighted the need for better protections. That spotlight is now even more intense in the face of the pandemic and its related privacy issues.

As noted previously, our office reviewed interim policy measures introduced by the TBS to provide greater flexibility to federal public sector institutions in responding to the COVID-19 pandemic.

We expressed concern about these measures, which relaxed existing requirements related to PIAs without offering adequate replacements. Therefore, in our view, the measures did not offer a balanced approach in assessing the impact to privacy of urgent COVID-19-related initiatives.

To address this imbalance, we recommended that the TBS include, in the Interim Directive on Privacy Impact Assessment, a policy statement reminding institutions of the government's commitment to privacy as a fundamental, quasi-constitutional human right, even though the usual rules have been suspended.

In response, the TBS took the position that such a statement would be inconsistent with its legislative and policy framework, going beyond targeted changes to provide flexibility in responding to the pandemic. Despite numerous exchanges, the TBS maintained that it could not do so within the existing legislative framework.

Such reluctance on the part of government to assert privacy as a fundamental right absent such direction from Parliament after debate among elected officials is frankly difficult to understand and is further evidence of the imperative need to reform our legislation.

Blueprint for reform

The recommendations for legislative change set out in our 2018-2019 annual report remain extremely relevant for addressing the new challenges raised by the pandemic. We developed them in the context of a crisis of trust that has been building for years.

Data breaches have affected tens of millions of Canadians. Our public opinion research tells us that some 90% of Canadians are concerned about their inability to protect their privacy. Only 38% believe businesses respect their privacy rights, while just 55% believe government respects their privacy.

There can be no trust if rights are not respected. Good privacy laws are key to promoting trust in both government and private sector organizations. This was true before COVID-19, and it has become even more important in the wake of the pandemic. Canadians want the benefits of digital technologies, and the assurance that they may use those technologies without forgoing their rights.

In our blueprint for legislative reform, we said that the starting point should be to give new privacy laws a rights-based foundation. A central purpose of the law should be to protect privacy as a human right in and of itself and as an essential element to the realization and protection of other rights.

We suggested that both PIPEDA and the Privacy Act include preambles and purpose statements that entrench privacy in its proper human rights framework.

These would serve to bridge the gap between data protection and privacy. They would also provide the values, principles and objectives to guide how the data protection principles in both federal acts are interpreted and applied.

We also advocated for a reformed private-sector privacy law that would no longer be drafted as an industry code of conduct, thus putting an end to self-regulation.

To ensure Canadians can enjoy the benefits of digital technologies safely, we proposed the addition of enforcement mechanisms to offer quick and effective remedies for people whose privacy rights have been violated and to encourage compliance with the law.

These mechanisms would include empowering the Privacy Commissioner to make binding orders and impose consequential administrative penalties for non-compliance with the law, as well as proactive privacy inspections by our office to ensure demonstrable accountability. Such enforcement powers are being used successfully by other data protection authorities around the world.

The elements we have called for in our rights-based approach to law reform are already in place in the laws of most of Canada’s trading partners, as Figure 1 shows.

Figure 1 – Privacy Protection: Canada and its trading partners

Privacy Protection: Canada and its trading partners

Jurisdiction | Year privacy law last updated | Defining privacy as a human right | Rule-making authority | Demonstrable accountability | Order-making powers | Administrative monetary penalties | Private right of action
Canada (PIPEDA) | 2015 | no | no | no | no | no | no*
Argentina | 2018 | yes | yes | yes | yes | yes | yes
Brazil | 2018 | yes | yes | yes | yes | yes | yes
European Union | 2018 | yes | yes | yes | yes | yes | yes
United Kingdom | 2018 | yes | yes | yes | yes | yes | yes
Australia | 2012 | yes | yes | yes | yes | yes | yes
Mexico | 2016 | yes | yes | yes | yes | yes | no
South Korea | 2018 | yes | yes | yes | yes | yes | no
New Zealand | 2020 | yes | yes | yes | yes | no | no
Singapore | 2012 | no | yes | yes | yes | yes | yes
Japan | 2015 | no | yes | yes | yes | yes | no
California (California Consumer Privacy Act) | 2019 | no | yes | yes | no | yes | yes

* The Personal Information Protection and Electronic Documents Act (PIPEDA) currently gives individuals the right to bring an organization before the Federal Court to seek remedies such as damages and/or an order requiring the organization to correct its practices, but only following an investigation and report of findings, or a notice of discontinuance, by the Office of the Privacy Commissioner of Canada (OPC).

Defining privacy as a human right: Legislation recognizes privacy as a human right, or adherence to an international agreement that does so (e.g., Convention 108).

Rule-making authority: Data protection authority or other public authority can issue enforceable codes of conduct, standards, guidance and/or regulation.

Demonstrable accountability: Data protection authority can legally compel production of specific records to demonstrate data management practices prior to an investigation, and/or has legal authority to conduct proactive inspections, reviews or audits to verify compliance, absent grounds to suspect or believe that a specific infraction has occurred.

Order-making powers: Data protection authority has the power to back findings with orders for particular remedies.

Private right of action: Legal provisions allowing individuals to directly seek remedies and/or compensation from a court for breaches of privacy laws.

Canada used to be a leader in privacy law, but it has clearly fallen behind other jurisdictions.

Our proposals are realistic and contemporary, and they would also improve the interoperability of our laws with other jurisdictions, providing predictability and potentially reducing the cost of compliance for Canadian businesses.

Our proposals also recognize legitimate interests of organizations and government.

We have said that the best way for Canada to position itself as a digital innovation leader is to demonstrate how we can establish a framework for innovation that also successfully protects Canadian values and rights, as well as our democracy. This will also be fundamental to maintaining public trust in both businesses and government.

Update on the road towards reform

We have heard more talk about law reform over this last year than at any other time in recent memory.

Back in May 2019, the crisis of trust led the federal government to propose a Digital Charter, which includes plans to update PIPEDA. The government has since reiterated its intent to reform both PIPEDA and the Privacy Act.

However, more than a year later, we have yet to see the specific ways in which our legislative framework would be modernized to live up to the challenges of the digital age – and to Canadians’ expectations.

Shortly after our previous annual report was tabled in Parliament in December 2019, the Prime Minister issued mandate letters to his newly appointed cabinet members. Several of these letters included direction related to privacy law reform.

Various cabinet members were asked to collectively advance a number of legislative and policy elements that could lead to stronger privacy protection for Canadians. Several of these elements referred back to Canada’s Digital Charter.

The proposals for PIPEDA reform under the Digital Charter have some positive elements. For example, the Charter refers to a new set of online rights to further protect Canadians’ privacy, such as data portability, an explicit right to delete online information (source takedown), and increased transparency about the use of automated decision-making.

We addressed some of our concerns about elements of the Digital Charter in last year’s blueprint, most notably related to the proposals on providing our office with “circumscribed” enforcement powers, and on exceptions to consent.

“Circumscribed” order-making – rather than broad order-making powers, which are common in most jurisdictions – is not only inefficient, but ineffective. Against the backdrop of a fast-moving digital economy where risks arise and evolve on a daily basis, it would cause delay and encourage limited compliance. In contrast, broad order-making powers would give organizations a clear interest in complying, and would provide Canadians with quick and effective remedies in cases of non-compliance.

We have also called for changes that would enable our office to impose administrative monetary penalties, rather than a framework that only envisions fines imposed by the courts.

In most other jurisdictions, notably the European Union and the United States, laws provide for significant administrative monetary penalties imposed by the regulator. There is no reason things should be different in Canada. Ontario has that power under its Personal Health Information Protection Act. Quebec’s Bill 64 proposes similar powers, as well as a number of provisions similar to those of the General Data Protection Regulation (GDPR). The bill is the most progressive legislative proposal in Canada of late.

We also remain concerned about the concept of an exception to consent for “standard business practices” as proposed under the Digital Charter. As defined, it is much too broad a concept, one that risks becoming a catch-all exception, if not a gaping hole. Businesses should not be allowed to dispense with consent merely because a practice is one they determine to be “standard.”

Our law reform proposals include exceptions to consent that would facilitate innovative uses of technologies where consent is not possible, and our proposed preambles and purpose statements for PIPEDA would serve to recognize the legitimate interests of businesses, but within a rights-based framework.

The Commissioner wrote to several key cabinet members in the weeks that followed the last federal election. He shared with them his advice on how to achieve the objective of protecting Canadians’ rights in a way that fosters trust and innovation, and has subsequently had the opportunity to discuss the issue with them further. We remain committed to working with the government on these issues, and look forward to the opportunity to be engaged in the review of specific legislative proposals.

On the public sector side, the government has plans to advance the Data Strategy Roadmap for the federal public service, which includes an engagement to strengthen data privacy protections as well as to increase government use of data and automated decision-making.

In parallel, the Department of Justice has engaged targeted stakeholders about modernizing the Privacy Act through a series of discussion papers. We responded to the Department’s consultation on Privacy Act modernization in December 2019.

The Government of Canada’s National Data Strategy Roadmap focuses heavily on “unlocking the value” of data. It states that data have the power to enable the government to make better decisions, design better programs and deliver more effective services for the public.

In order to optimize the value of data and create efficiencies there is a focus on increased sharing, both between government departments and between the public and private sectors.

The public and private sectors are increasingly working together in developing standards, in the storage of government data and in delivering digital government programs to the public, among other activities. Given this greater focus on public-private partnerships, it is important that both our federal statutes adopt similar principles. The public sector should not be held to a different standard than the private sector. In fact, it should be a leader in privacy protection.

In our submission in response to the Department of Justice consultation, we recognized that technology can provide better services to Canadians and help achieve important public interest objectives. We provided advice on how the Privacy Act could be modernized to ensure privacy rights and Canadian values are respected, consistent with our blueprint.

These key issues we identified for law reform are proving to be all the more relevant in the context of the COVID-19 crisis, and the move towards digital government in general, which only serves to reinforce the need for modernized federal privacy laws.

Ongoing policy work

Since the publication of our blueprint, our office has continued its policy analysis of both federal privacy laws with a view to providing further advice on legislative reform.

Artificial intelligence presents fundamental challenges to all of PIPEDA’s privacy principles. Responsible innovation involving artificial intelligence systems must take place in a regulatory environment that respects fundamental rights and creates the conditions for trust in the digital economy.

In early 2020, we launched a public consultation to seek input on protecting Canadians’ rights as artificial intelligence expands.

As part of our consultation, we issued a discussion paper containing key proposals for PIPEDA reform that would bolster privacy protection and achieve responsible innovation involving artificial intelligence systems. We sought input on a number of targeted questions in order to elicit feedback from experts in the field as to whether our proposals would be consistent with the responsible development and deployment of artificial intelligence systems.

As we noted in the discussion paper, we are paying specific attention to artificial intelligence systems given their rapid adoption for the purpose of processing and analysing large amounts of personal information. Their use for making predictions and decisions affecting individuals may introduce privacy risks and discrimination.

As with other technologies, artificial intelligence comes with both benefits for the public interest and risks for human rights.

For example, artificial intelligence has great potential in improving public and private services. It has helped spur new advances in the medical and energy sectors. However, the impacts to privacy and human rights will be immense if legislation does not include clear rules that protect these rights against the negative outcomes of artificial intelligence and machine learning processes.

The input we have received from stakeholders through this consultation will serve to refine our thinking and make our law reform proposals more relevant.

Recent investigations and law reform

As with last year, a number of investigations discussed in this annual report highlight gaps and other shortcomings in the current legislative framework.

For example, under The Privacy Act: A year in review, we summarize our investigation into a leak related to a Supreme Court nomination. The complainant requested that we investigate the roles of various institutions, some of which are under the jurisdiction of the Privacy Act, and others that are not. Our investigation was constrained by the jurisdictional limitations of the Act.

The Privacy Act section of this report also describes follow-up work related to an investigation into Statistics Canada’s use of detailed financial information about millions of Canadians. The investigation had found the initiatives did not respect the principles of necessity and proportionality, which are recognized globally as fundamental privacy principles but are not reflected in our federal laws.

Under The Personal Information Protection and Electronic Documents Act: A year in review, we examine issues related to protecting privacy in the context of outsourcing as part of investigations into TD Canada Trust’s and Loblaw Co. Ltd.’s transfers of customers’ information to service providers outside Canada for processing.

Also under The Personal Information Protection and Electronic Documents Act: A year in review, we summarize our investigation into RateMDs.com, a website where patients can rate and review health care professionals. This investigation demonstrates how the existing legislative framework is insufficient in upholding the reputational rights of Canadians in the digital economy.

Declarations in support of reform

Over the last year, our office has welcomed support for law reform to protect human rights from within Canada and beyond.

Privacy commissioners from around the world adopted a resolution on privacy as a human right and precondition for exercising other fundamental rights at their annual meeting last fall in Tirana, Albania.

The resolution from the recently renamed Global Privacy Assembly was an important step in the commitment to privacy as a human right worldwide.

The United Nations declared privacy an inalienable and universal human right in 1948, and in 1966 the International Covenant on Civil and Political Rights reaffirmed the central role that privacy plays in democracy. Since then, over 80 countries worldwide have enshrined privacy rights for individuals in their laws and regulations.

The resolution noted growing support and increased calls from civil society, academia, media organizations, legal professionals and others to assert and protect privacy rights globally.

It called on governments to reaffirm a strong commitment to privacy as a human right and value in itself, and to ensure legal protections. It asked legislators to review and update privacy and data protection laws, and encouraged regulators to apply all relevant laws to activities in the political ecosystem.

Finally, it called upon businesses to show demonstrable accountability across commercial activities; civil society organizations (including media and citizens) to exert their privacy rights; and for all organizations to assess risks to privacy, fairness, and freedom before using artificial intelligence in their activities.

Closer to home, our provincial and territorial counterparts joined us at our 2019 annual meeting in endorsing a resolution on the need for effective privacy legislation in a data driven society.

We noted at the time that many access and privacy laws in Canada have not been fundamentally updated in decades. As a result, the level of privacy protection granted to Canadians has been outpaced by what many other countries provide to their citizens.

Legislative developments in Canada and beyond

Some jurisdictions around the world are seeing positive developments with respect to law reform.

Europe is taking stock of the GDPR after two years of implementation. The GDPR significantly raised the bar for privacy and showed us how a human rights-based law can work in practice. As we look to Europe, we see that companies there are continuing to operate successfully under the GDPR. Rights-based laws are not an impediment to innovation. To the contrary, they help to build the consumer trust necessary to support and drive an efficiently operating digital economy.

In the United States, various jurisdictions are adopting data protection laws – including in digital revolution hubs such as California and Washington State.

Closer to home, Quebec’s National Assembly is examining a bill intended to, in the words of the former Justice Minister, be “very closely modelled on European best practices” and “give more teeth” to the province’s data protection statutes.

With Bill 64, the Quebec government is proposing to grant citizens clear, enforceable rights such as the right to erasure, and to significantly increase the enforcement powers of the Commission d’accès à l’information (CAI), bringing its powers more in line with those of data protection authorities in other jurisdictions, including the European Union.

Conclusion

These are unprecedented times for Canada and countries around the world.

Getting privacy right during the pandemic will serve to build trust in public health institutions and in the public sector at large, as well as in the digital tools that are becoming essential to live safely.

As we said in our joint statement with provincial and territorial colleagues: “The choices that our governments make today about how to achieve both public health protection and respect for our fundamental Canadian values, including the right to privacy, will shape the future of our country.”

Social and economic recovery will not be sustainable unless the interests and rights of individuals are respected.

The Privacy Act: A year in review

Some of the files we worked on in 2019-2020 would have been inconceivable when the Privacy Act came into force more than 35 years ago. For instance, we advised the RCMP on the use of drones and DNA, investigated complaints on mass harvesting of private sector data by Statistics Canada, and discussed the privacy considerations of video recruitment with government departments.

In addition, we have continued to help federal institutions adopt privacy protective measures as they develop new initiatives. We redesigned our guide on PIAs with an eye to making it more practical, and engaged in outreach activities with institutions to help them better understand their obligations.

On the investigations side, we made great strides in reducing a backlog of cases. We achieved these results in part by enhancing our processes and by adopting a new approach for cases where institutions are not responding to the personal information requests they receive in a timely or adequate manner.

The following section highlights key initiatives under the Privacy Act.

Privacy Act enforcement

Operational updates and trends

In 2019-2020, we accepted 761 complaints under the Privacy Act. Although this seems like a significant decrease compared to the previous year, the change is mostly due to an evolution of our counting methodology towards enhancing accuracy and consistency.

We have adjusted how we track and report on complaints and investigation findings. Since April 1, 2019, when an individual’s complaint about a single matter represents potential contraventions of multiple sections of the Privacy Act, or when an individual complains following multiple access requests made to one institution, we track and report these as a single complaint.

In our view, this method more accurately represents the number of individuals raising privacy concerns, and provides a more consistent picture of our work across both the Privacy Act and PIPEDA.

We noted a decrease in complaints at the very end of the fiscal year, as the global COVID-19 pandemic spread to Canada. This was predictable and understandable given the magnitude of the public health crisis.

During 2019-2020, the majority of complaints we accepted were related to access to personal information (28% of accepted complaints) and to institutions failing to respond to access requests within the time limit required under the Act (45% of accepted complaints).

Of significant note this year are important reductions in our office’s response times for files that were not previously backlogged. This is due in part to the success and efficiency of our restructured intake and early resolution functions. These improvements were achieved by implementing a new deemed refusal approach and by resolving a greater number of complaints through summary investigations where appropriate.

We also launched a new online complaint form, which has streamlined and automated our process for receiving and triaging complaints, resulting in greater efficiency and usability.

We have also continued to implement the deemed refusal approach to time limit complaints described in our previous annual report. Partly as a result of this approach, we have observed a sharp increase in the number of well-founded (and not resolved) complaints: in 2019-2020, we have found 177 complaints to be well founded, compared to 49 the previous year. Of these well-founded complaints, 146 were time limit complaints in 2019-2020, compared to 18 the previous year.

Deemed refusal approach

Federal institutions too often fail to meet their obligations to respond to personal information requests made under the Privacy Act within the specified time limits. Year after year, we receive many complaints from individuals alleging that a government institution unjustly denied them timely access to their personal information.

This past year, our office has taken firm steps to both: (i) encourage greater engagement, responsiveness and timeliness on the part of institutions under investigation; and (ii) better empower complainants whose personal information requests have not been answered within the time limits provided in the Act.

We have therefore instituted our deemed refusal approach for time limit complaints, issuing deemed refusal findings to address situations where institutions are not responding in a timely or adequate manner, or are unable to commit to a release date for access to personal information requests.

A deemed refusal finding allows Canadians to exercise their right to apply to the Federal Court in a timely manner if they have faced challenges in attempting to access their personal information.

An important component of the deemed refusal approach includes a process to conditionally resolve complaints where an institution commits to responding to personal information requests within an acceptable period of time. This has resulted in the expedited conclusion of 191 complaints.

Unfortunately, institutions would not provide a commitment date in 146 cases, which resulted in deemed refusal findings. In 2019-2020, we issued deemed refusal letters to 11 government institutions, including 98 letters to Correctional Service Canada (CSC) and 30 to the RCMP. By comparison, in 2018-2019, we issued 31 deemed refusals against three government institutions (CSC, the RCMP, and Health Canada).

Owing to this approach, we were also able to resolve 372 complaints on an expedited basis without needing to apply a commitment date or issue a deemed refusal finding. In such cases, institutions responded promptly to resolve the complaint. The remainder of the complaints were either not well-founded or abandoned by the complainants.

Our deemed refusal approach has significantly reduced the treatment time of time limit investigations where we face resistance or lack of responsiveness from organizations, ensuring the investigation will not exceed a year.

On a government-wide level, it is evident that responding to Canadians' requests for access to their personal information remains a chronic and widespread problem. In our view, this is largely due to sub-optimal funding, prioritization, and resource allocation for this critical function by federal institutions.

Backlog reduction

Over the year, we made considerable progress in reducing our backlog of complaints. We are pleased to report that we reduced the overall backlog of Privacy Act complaints older than 12 months by 56%. Across both Privacy Act and PIPEDA complaints, the reduction was 50%.

We attribute this success to a new operational strategy and an increase in resources. Our strategy involved enhancing procedural efficiency and setting higher expectations with respect to institutions’ engagement, timeliness and responsiveness to our investigations. When combined with the increase to our office’s resources provided in the 2019 federal budget, the strategy allowed us to reach our backlog reduction milestones.

Specifically, we hired new staff and redistributed files, which increased capacity and allowed investigators to reprioritize and focus on aging files and those at risk of becoming backlogged. We also engaged consultants to lend regulatory expertise in sharing the backlog reduction workload.

Looking forward, we expect to reduce the backlog by 90% by the end of 2020-2021. To that end, we continue to adopt and refine enforcement strategies that demand timely and quality stakeholder engagement, which results in heightened protection of Canadians’ privacy rights, and mitigates the risk of similar backlogs in the future.

We have created a work unit that focusses specifically on the early resolution of complaints using various operational and administrative strategies. For example, we improved the early resolution process by shifting towards gathering information by phone in real time, which streamlined our efforts and reduced the delays associated with obtaining information in writing.

The early resolution unit also issues summary investigation reports, which are shortened investigations for typically straightforward matters that result in expedited findings, such as in the deemed refusal approach described above. Combining early resolution and summary investigations, the unit closed 60% of all accepted complaints under the Privacy Act, including time limit investigations.

Of the 1,335 Privacy Act complaints we closed in 2019-2020, 338 were handled through early resolution. The remaining 997 complaint files were closed as a result of regular or summary investigations, of which 751, or 75%, were well-founded. It is important to note that 654 of those well-founded complaints were time limit complaints, some of which were closed as per our new deemed refusal approach.

The high number of time limit complaints is a clear indicator that government institutions continue to face significant challenges in responding to access to personal information requests.

New online complaint form

In September 2019, we introduced a new, optimized and automated online complaint form for both Privacy Act and PIPEDA complaints. This created efficiencies by pre-populating complaint files, which in turn accelerates the triaging process.

The form provides complainants with relevant information as they progress through it. For example, it lays out our jurisdiction under federal laws, which allows complainants to quickly determine whether their complaint should be made to our office or to a provincial privacy authority.

This has contributed to a decrease in the number of complaints ultimately filed with our office, and to a significant reduction in complaints that fall outside our jurisdiction. As a result, complaint treatment times at the intake stage have been reduced, and the review process by early resolution investigators is more efficient.

Additionally, providing information to individuals as they fill out the form makes them more aware of the key supporting documents required. This has reduced the need to follow up with complainants to obtain necessary documents.

In the end, introducing the new form resulted in fewer and more relevant complaints, better access to necessary supplementary materials, and a more expedited handling of complaints by our office.

Post-investigation compliance monitoring

After expanding our compliance monitoring unit’s functions to include investigations under the Privacy Act in addition to those under PIPEDA, we can now more effectively follow how our office’s recommendations have been implemented across the public sector.

Since last year, the unit monitors whether the recommendations we made during key investigations under the Privacy Act are being applied, allowing us to assess whether federal institutions are meeting their commitments to our office and to Canadians.

In 2019-2020, eight complaints were directed to the compliance monitoring unit. This included our earlier investigations of the Canadian Transportation Agency and the Canada Border Services Agency (CBSA), as well as follow-up work related to our 2019 investigation into Statistics Canada, which was summarized in our previous annual report to Parliament.

Statistics Canada

Our investigation focused on Statistics Canada’s use of detailed financial information about millions of Canadians, which the federal agency had acquired or was planning to acquire from private-sector organizations, in the context of two projects, namely the Credit Information Project and the Financial Transactions Project.

Statistics Canada ultimately agreed to follow our recommendations and did not implement the projects as originally designed. Our office is providing a full-time resource to support Statistics Canada in redesigning the projects to bring them to a level that meets our recommendations.

The agency has also been working with our office to develop and implement policies and procedures aimed at incorporating necessity and proportionality more broadly into its statistical methods.

Furthermore, at the invitation of the Chief Statistician of Canada, Commissioner Therrien participated in an international panel at a United Nations Statistical Commission event in March 2020. He presented the results of our investigation and highlighted their potential relevance for every statistical agency faced with similar data challenges and opportunities, and similar responsibilities to respect citizens’ privacy rights.

It should be noted that, due in part to the age and inadequacy of Canada’s laws in dealing with 21st century privacy issues, our investigation did not find any legal violations.

However, we found that the two projects as originally designed did not meet necessity and proportionality requirements. The necessity principle is currently adopted as government policy at the federal level and is a legal principle in many jurisdictions across the world, but it is not included in the Privacy Act.

Our investigation into the agency’s use of administrative data collected or compiled by private-sector organizations also raises the importance of ensuring Canadians are protected through, at minimum, common privacy principles included in laws that govern both government and the private sector.

The growing role of public-private partnerships is becoming more apparent, and these partnerships create additional complexity and risk. Our investigation serves as an appropriate example of how the pursuit of laudable public interest goals can lead to highly privacy invasive results when privacy is not taken into account. Now more than ever, Canadians need legal assurance that their privacy rights are protected.

Key investigations

Leak about Supreme Court candidate highlights need for law reform

In March 2019, media reports claimed that documents from an anonymous source demonstrated a disagreement between the Prime Minister and the Attorney General concerning the Attorney General’s recommendation of a candidate to the Supreme Court. The candidate subsequently provided a public statement stating that he had withdrawn his candidacy due to his wife’s health.

In the wake of these media reports, a Member of Parliament filed a complaint to our office alleging a breach of the Privacy Act. The complainant requested that we investigate the roles of the Privy Council Office (PCO), the Department of Justice, the Office of the Commissioner of Federal Judicial Affairs (CFJA), and the Office of the Prime Minister of Canada (PMO).

Our jurisdiction under the Act does not extend to the information handling practices of either the CFJA or the PMO. We therefore focused our investigation on the PCO and the Department of Justice.

According to the CFJA, following a Supreme Court appointment process led by an independent advisory board, the PMO was provided with a shortlist of candidates to consider. The PMO in turn provided the shortlist to the Attorney General who then consulted with stakeholders and advised the Prime Minister of the individual whom she recommended. Subsequently, the Prime Minister announced the nomination of the individual who was ultimately appointed.

During our investigation, we spoke with officials from the PCO, who confirmed their office does not play a role in identifying or assessing judicial appointment candidates. We gathered documentary evidence and testimony from the Department of Justice. We found no evidence the PCO or the Department of Justice had access to the information about the former Attorney General’s recommendation for appointment to the Supreme Court, and therefore found no evidence the disclosure originated from the PCO or the Department of Justice.

Accordingly, both the complaints against the PCO and the Department of Justice were deemed to be not well-founded.

While we did not find a contravention of the Act by the government institutions that fall under our jurisdiction, it is clear that the candidate’s privacy was compromised by the disclosure of his personal information relating to the Supreme Court application and nomination process. This disclosure injured not only the candidate’s reputation, but also the integrity and confidentiality of the judicial nomination process.

The fact that our investigation was constrained by the jurisdictional limitations of the Act is, in our view, yet another example of the need for legislative reform. Over the past several years, the Privacy Commissioner has requested on numerous occasions that the Act be amended to extend coverage to all government institutions, including Ministers’ Offices and the PMO. We believe this broadened jurisdiction would allow us to fully investigate complaints such as this one.

Privacy in a litigation context

When an individual’s personal information is implicated in legal proceedings, that information is often particularly sensitive for the individual. The three investigations summarized below highlight the intersection between the open court principle and the protections under the Privacy Act.

Public disclosure of medical information during military trial consistent with Privacy Act

A former member of the military filed a complaint against the Department of National Defence (DND) in relation to the disclosure of his personal information during a military trial.

The complainant alleged he was required to publicly disclose medical information during a summary trial as part of his defence in relation to a charge against him. He alleged that since his request to be tried at court martial was declined, he was compelled to wrongfully disclose his medical information at the summary trial.

DND submitted that summary trial proceedings are subject to the open court principle, which generally requires such proceedings and related records before the court to be open and available for public scrutiny, except to the extent the court otherwise orders.

After reviewing the evidence, we concluded that the disclosure of the complainant’s personal information during the summary trial was consistent with the disclosure provisions in the Privacy Act.

The military justice system shares several underlying principles with the civilian justice system. One of these principles is that summary trials are public by default.

The complainant’s personal information was disclosed to the public through testimony given in the summary trial proceedings, which the presiding officer required in order to issue a finding as to whether there had been a violation of the military’s Code of Service Discipline, in accordance with the National Defence Act.

The Privacy Act allows for the disclosure without consent of personal information for the purpose for which the information was obtained or for a consistent use, as well as where disclosure is authorized by law.

For those reasons, we concluded that the complaint was not well-founded.

Nonetheless, we encourage government institutions to consider measures to ensure that participants in public hearings are advised, in advance, that information they disclose will be considered publicly available, and are made aware of any steps that may be available to protect personal information from disclosure.

Disclosure of personal information for litigation purposes permissible under the Privacy Act

A military member filed complaints against the Department of Justice and DND in relation to the disclosure of his personal medical information by DND to the Department of Justice for the purpose of defending against litigation filed by the military member.

The litigation involved a statement of claim filed in the Ontario Superior Court against DND for defamation and military police negligence. The Attorney General of Canada was named as the respondent.

In preparing to defend against the complainant’s allegations, legal counsel for the Department of Justice issued an order requiring DND to collect and produce all documents that might be relevant to the complainant’s lawsuit. The order specified, among other things, that the complainant’s physical and mental health files should be provided. DND disclosed the requested information to the Department of Justice.

The Department of Justice argued that the Privacy Act is not intended to restrict a federal institution’s ability to communicate personal information with its legal counsel in order to determine whether it must ultimately produce such information as being relevant to a civil proceeding.

The complainant took the position that while it is expected that DND would disclose certain information for the defence of his claim, the request for medical, mental health, and other health records was overbroad, overly intrusive, and a contravention of the Act. He also raised concerns related to doctor-patient confidentiality.

In our investigation, we found that, because the complainant had named the Attorney General of Canada in his statement of claim, the disclosure of his personal information was for use in legal proceedings involving the Government of Canada and appears to have been directly relevant to his legal claim. We therefore found that the collection and disclosure of personal information were consistent with the Privacy Act and considered the complaints to be not well-founded.

CBSA’s disclosure of medical information to a third party leads to complaint

The CBSA sent an individual’s personal medical information to his bondsperson. The individual complained to our office that the CBSA contravened the disclosure provisions of the Privacy Act in doing so.

The complainant had become involved with the CBSA in the process of applying for refugee status or permanent residency status. As part of the process, he was a party to numerous actions before the Federal Court. In the Federal Court records, the complainant referred to changes in his health while in CBSA custody. These court files form part of the public record and are accessible to the public at large in accordance with the “open court” principle.

At one point during his ongoing application process, the complainant sought to have the CBSA amend the conditions of his bond. The CBSA sent a letter to the Immigration and Refugee Board, notifying it of the changes, with carbon copies to the complainant and to his bondsperson. The letter included details about the complainant’s allegation that he had suffered health consequences during CBSA detention. However, the complainant’s medical information was unrelated to the purpose of informing the bondsperson of the changes in the bond conditions.

Section 8 of the Privacy Act prevents a government institution from disclosing personal information under its control without consent except under specific circumstances. However, subsection 69(2) of the Privacy Act states that section 8 does not apply to personal information that is publicly available. Our office determined that the complaint was not well-founded because the complainant’s medical information was publicly available through Federal Court records.

The meaning of the term “publicly available” is explored in the Federal Court of Appeal decision of Lukács v. Canada (Transport, Infrastructure and Communities), 2015 FCA 140, where the Court interpreted this term to mean “available to or accessible by the citizenry at large.”

However, it should be noted that if the complainant’s personal information had not already been publicly available in court records, the decision by the CBSA to send a copy of this information to a third party would have been a breach of the disclosure provisions of the Privacy Act. It is solely through the operation of subsection 69(2) of the Act that the complaint is not well-founded.

Surveillance in the workplace

We regularly receive complaints related to workplace surveillance issues – in particular, employees raise concerns with respect to video surveillance, which can represent a particularly privacy-intrusive collection of personal information.

Video recording in the workplace at correctional institutions

We received three complaints alleging that Correctional Service Canada (CSC) was using video footage to monitor employee performance in contravention of the Privacy Act.

The complainants alleged CSC improperly used their personal information when a manager reviewed video of their patrols. According to the complainants, the manager was using the video to monitor employee performance.

According to CSC, its purpose for using video surveillance is to maintain the security of institutions and to investigate incidents such as violence, allegations made against staff and overdoses.

In this case, an investigation into the death of an inmate in custody had revealed deficiencies in patrols, and CSC developed an action plan to address those issues. This plan included reviews of video of random patrols from a limited time period in order to identify systemic deficiencies.

The correctional manager tasked with reviewing the video also provided informal feedback to employees whose patrols were reviewed. In light of the above, we accepted CSC’s position that the purpose of reviewing the video and providing informal feedback in these cases was not to monitor employee performance, but rather to ensure inmate security through the quality of patrols. We therefore concluded that this consistent use was in accordance with paragraph 7(a) of the Privacy Act, and that the complaints were not well-founded.

ESDC use of security camera footage for fact finding regarding employee’s hours worked

As we have publicly noted, given the inherent intrusiveness of video surveillance, organizations should consider the least privacy-invasive means of achieving any given purpose before resorting to video surveillance.

An employee of ESDC filed a complaint alleging the department contravened the Privacy Act when it used building security video to conduct a fact-finding exercise to determine when the employee, who worked in a different location than her manager, was leaving work.

This case serves as an important reminder to government institutions that they must limit the use of the personal information they collect to the purpose for which it was originally collected, or to uses consistent with that initial purpose.

Using the video recordings to verify the complainant's departure times as part of a fact-finding exercise went beyond the scope of the security purposes identified in ESDC’s relevant personal information bank.

Further, ESDC could not demonstrate that it had informed individuals of the purposes for which the video surveillance could be used as required by the Privacy Act.

ESDC accepted our recommendations to: (i) establish a clear internal policy to ensure that video surveillance is not used inappropriately; and (ii) inform individuals who may be captured by surveillance cameras of the purposes for which the video surveillance may be used.

We therefore found the complaint well-founded and conditionally resolved.

Investigations related to travel

CBSA should only retain travelers’ digital device passcodes when necessary

Our office received a complaint that the CBSA inappropriately collected an individual’s cell phone passcode when he returned to Canada.

The complainant asserted this collection was unauthorized because the border services officer who inspected his cell phone could not point to any specific legislation, policy, or procedure that requires it. He also argued the collection was unnecessary because he offered to unlock the cell phone himself, but the officer refused.

A separate investigation by our office summarized in our previous annual report to Parliament concluded the CBSA has the authority under the Customs Act to examine material stored directly on digital devices under certain conditions. In that investigation we nonetheless made a number of recommendations in light of the fact that a search of an electronic device at the border is an extremely privacy-intrusive procedure. Digital device passcodes, examined in this investigation, are similarly sensitive.

Our office considers a passcode to be sensitive personal information when paired with other identifiers or when it is matched with the device it unlocks. A device protected by a passcode may contain information that an individual considers most sensitive. Additionally, the sensitivity of a passcode may be increased if it is reused across multiple accounts or activities.

CBSA asserted that it collects passcodes, rather than allowing an individual to unlock their digital device themselves, in order to prevent travellers from intentionally or inadvertently erasing or altering data before providing the unlocked device to the border service officer for inspection, and to ensure the continuity of evidence should the interaction be implicated in judicial proceedings. In this context, our office accepted that the CBSA does have the authority to require individuals to provide a passcode to unlock a digital device under the Customs Act.

However, when handling such sensitive personal information, government institutions should take care to be aware of and follow their own policies. They should also only retain information when it is necessary to do so.

In this case, the CBSA acknowledged the officer failed to follow policy when she did not take handwritten notes about the interaction. Further, the officer could not recall whether she had informed the complainant, as required by the policy, that his passcode would be retained and that he may change his passcode.

In response to these errors, the CBSA committed to providing more training to officers with respect to collecting personal information from travellers. It also committed to rewriting the policy to provide clearer direction to officers.

Finally, in retaining passcodes even when the search of the device led to no further action, the CBSA was retaining personal information unnecessarily and keeping it alongside other personal identifiers. We questioned whether it was necessary to retain passcodes beyond the examination process when the CBSA had not seized the device.

The CBSA has since revised its policy that covers the examination of digital devices at the border. Border services officers are now required to take the more privacy-sensitive approach of writing passcodes on a separate piece of paper and returning that paper to the traveller unless the traveller is detained further or their device is seized.

CATSA notification of police about travellers with cannabis inconsistent with Privacy Act

An individual filed a complaint accusing the Canadian Air Transport Security Authority (CATSA) of overstepping its authority when it notified local police after finding cannabis in his possession.

In 2017, a CATSA screening officer stopped the complainant while he was travelling from Toronto to Ottawa with medical cannabis. The complainant stated the CATSA officer took out his prescription bottles. The officer recorded information from the bottles, the complainant’s boarding pass and photo identification on a scrap piece of paper. The officer then contacted Peel Regional Police, which sent an officer to verify the complainant’s medical documentation.

During the course of our investigation, we examined the specific circumstances surrounding the complainant’s allegations, as well as CATSA’s practices more broadly as they relate to travellers in possession of cannabis, since the legalization of cannabis in 2018. CATSA confirmed that it continues to collect information to assess cannabis possession that it discovers inadvertently – such as determining if the possession is medical or recreational, in order to notify police if the amount appears to be over the respective legal possession limits, currently 150 g and 30 g.

We noted that cannabis is not listed on Transport Canada’s Prohibited Items List as an item that could pose a threat to aviation security and found that CATSA does not have the authority to collect personal information for general law enforcement purposes (such as making a determination as to whether or not the passenger is carrying a legal quantity of cannabis). Therefore, CATSA’s practices relating to the collection and subsequent disclosure of personal information of passengers who possess cannabis contravene the Privacy Act, and we considered the complaint to be well-founded.

We recommended that CATSA stop such collection and disclosure, and update its policies accordingly. We also recommended that CATSA review its records to ensure that any personal information in its possession that relates to cannabis possession be destroyed.

CATSA accepted our recommendations and advised that it will implement policy changes by November 30, 2020, instructing screening officers not to notify police of the discovery of cannabis unless the amount is “clearly illegal”. We have conveyed to CATSA our expectation that “clearly illegal” would: (i) entail no need for any further fact-finding (such as the review of medical documentation); and (ii) be restricted to consideration of volumes of cannabis that are clearly over 150 g (as per the current regulations).

Additionally, CATSA has already purged personal information related to cannabis possession from its records management system. We find this to be an acceptable implementation of our recommendations and therefore consider the complaint conditionally resolved.

Breaches

Public sector organizations must notify affected individuals, our office and the TBS of breaches involving sensitive personal information where individuals could reasonably expect injury or harm. These reporting requirements have been in effect for the public sector since 2014, when a directive from the President of the Treasury Board made them mandatory.

In 2019-2020, we received 341 breach reports, in comparison to 155 reports a year earlier. The increased number of breach reports is not necessarily indicative of improvements in how federal government institutions overall detect and manage privacy breaches, for the following reasons.

While the number of institutions that reported breaches to our office increased from 29 to 34 this year, this number represents less than 14% of the approximately 250 organizations that are subject to the Privacy Act. One institution alone, Employment and Social Development Canada, also accounted for 211 (62%) of the breach reports received.

Of particular concern, several large institutions have been conspicuously absent from the breach reports we receive. These institutions, which hold significant volumes of personal information, sometimes of a highly sensitive nature, have reported very few breaches in 2019-2020, and in some cases no breaches at all. These institutions include the CBSA, DND, Global Affairs Canada and Veterans Affairs Canada.

We continue to believe that the number of privacy breaches reported to our office represents only the tip of the iceberg. Action is needed to address systemic under-reporting.

Treasury Board of Canada Secretariat breach action plan making slow progress

In our 2017-2018 annual report, we described the results of a review prompted by concerns about underreporting of data breaches in the public sector. The review showed that many federal employees, particularly front-line workers, did not fully grasp their obligations under the Privacy Act, or even what constitutes personal information.

We urged the TBS to strengthen its policy guidance and tools, raise awareness, and improve training in the federal government in order to support better compliance with the Act, prevent breaches and ensure appropriate breach reporting.

At that time, we also reported on a TBS action plan, which sets out specific actions to strengthen the management of privacy breaches across the government.

Disappointingly, the TBS’s action plan has advanced slowly over the two years that have followed.

Promised actions to strengthen policies, guidance and tools are only scheduled for completion in 2021-2022, and in some cases we have not even received projected completion dates.

The TBS’s actions to improve government-wide privacy training are also behind schedule.

Although the TBS has delivered presentations on privacy breach awareness to employees responsible for access to information and privacy, information management, information technology and security, more needs to be done beyond those groups. There needs to be training and clear policies and guidance for all federal employees.

We once again encourage the TBS to address these issues on an urgent basis.

Types of breaches reported in 2019-2020

We have noted some significant differences in breach reports provided to our office by private and public sector organizations.

The vast majority of the breach reports received by our office from federal institutions (85% of all reports) relate to data that has been lost or accidentally disclosed. By contrast, approximately 32% of the reports we received under PIPEDA were related to accidental disclosure and loss.

We also received 18 reports (5% of all reports) attributed to theft. The remaining 32 reports of breaches (9% of all reports) involved unauthorized access, including 24 reports of unauthorized access by employees.

We note with some concern that very few privacy breach reports we receive from federal institutions mention cyber attacks.

It is difficult to reconcile public sector numbers (5 reported cyber security events, representing less than 2% of all reported breaches in 2019-2020) with what we see in the private sector. Under PIPEDA, 42% of incidents reported to us are attributed to malware, ransomware, social engineering, password attacks or other cyber threats.

It is unclear why there is such a significant discrepancy between the numbers.

Given that cyber incidents involve particularly high risks for privacy, we will be reaching out to federal departments to remind them of the need to report privacy breaches involving cyber attacks to our office.

As we discuss later in this report, under International and domestic cooperation, we are hopeful that our upcoming work with the National Security and Intelligence Review Agency (NSIRA) will give us better insight into the impact of cyber attacks on personal information in the public sector.

Breach case summaries

Office of the Correctional Investigator employee lost documents while visiting a correctional institution

The Office of the Correctional Investigator (OCI) reported that an OCI investigator lost a binder containing printouts of inmate files as well as information about an employee at a Correctional Service Canada institution. Despite several searches, the OCI did not recover the binder.

The breach was reported seven months after it had occurred, and the personal information breached was highly sensitive. The types of compromised information potentially included fingerprint system number, date of birth, criminal history, security information and medical information.

During our review, the OCI was not able to provide evidence that it had established the privacy policies and procedures required by the Treasury Board Directive on Privacy Practices.

We recommended that the OCI establish plans and procedures for addressing privacy breaches. We also recommended the OCI undertake a thorough review of the breach to identify and notify those individuals whose personal information was affected.

The OCI accepted our recommendations and submitted a draft policy document on privacy breaches for our review. The OCI revised this policy subsequent to our comments.

To comply with accountability requirements, federal institutions must implement privacy plans and procedures. This case, in which a breach allowed an institution to identify and correct gaps, shows how reporting breaches to our office can lead to productive engagement, improved privacy practices and, ultimately, longer-term benefits for the institution.

Global Affairs Canada sent ship’s manifest to all Canadian passengers

A cruise ship with Canadian citizens on board was denied entry to a number of ports due to the presence of COVID-19 cases among passengers. Given uncertainty related to the processes for disembarkation and repatriation to Canada, Global Affairs Canada (GAC) contacted Canadian citizens on the ship to provide them with information and support.

Due to an administrative error, a manifest including the name, gender, contact information, date of birth, nationality and passport number of each Canadian citizen was inadvertently sent to all 247 Canadian passengers onboard the ship.

GAC reported this breach to our office one week after it occurred, and we also received complaints from affected individuals. In the early stages of our investigation, GAC proactively took steps to mitigate the potential impact of the breach for affected individuals, including credit monitoring and arranging for new passports.

GAC committed to implementing new procedures for mass communications to prevent future occurrences of a similar breach. Given that GAC provided sufficient information to affected individuals and took corrective steps in a timely manner, we were able to quickly conclude our investigation of the complaints received.

Advice and outreach to government departments

Our office’s Government Advisory Directorate provides advice and recommendations to federal public sector institutions during consultations on specific programs and initiatives, and more formally as part of our review of PIAs and information sharing agreements submitted by departments and agencies.

The Directorate also undertakes various general and targeted outreach initiatives with the federal public sector to encourage compliance with the Privacy Act and with the relevant policies and directives issued by the TBS.

One key initiative was to update Expectations: OPC’s Guide to the Privacy Impact Assessment Process. This guide provides direction to federal government institutions on how to comply with the Privacy Act and effectively manage privacy risks as part of the PIA process. It presents key concepts and best practices, and lays out how an institution may assess its programs and activities, including the legal requirements and privacy principles to consider. It also clarifies our office’s role in the PIA process and sets out our expectations of government institutions submitting PIAs for our review.

The Government Advisory Directorate also conducted two outreach sessions with 80 representatives from program areas and access to information and privacy (ATIP) units from 20 different federal institutions.

We facilitated discussions with participants about the types of privacy challenges institutions face when adapting to digital government or when complying with open government directives.

The participants brought up a number of challenges, including:

  • a lack of resources and practical guidance to deliver on privacy, given the pressure to innovate and move to digital quickly;
  • difficulties balancing transparency and modernization with privacy and national security; and
  • a perceived tension between open government and privacy protections.

Participants brainstormed possible solutions to obstacles and were able to openly discuss and share procedures and best practices. Some ideas included:

  • establishing the importance of privacy at the outset and during the project management stage;
  • ensuring all employees are aware of the importance of privacy as it applies to the programs they work on; and
  • sharing information about best practices and insights among organizations, including our office.

Over the year, the Government Advisory Directorate offered advice on a variety of government initiatives related to technology, data collection and sharing, and digital government.

RCMP use of remotely piloted aircraft systems

The Royal Canadian Mounted Police (RCMP) has been using remotely piloted aircraft systems – commonly called drones – since 2010. Our office had consulted with the RCMP on this issue on several occasions before receiving a PIA in 2019.

Remotely piloted aircraft systems can be a privacy-intrusive technology. According to the PIA, the RCMP operates more than 200 drones of different types, carrying still and video cameras, infrared cameras or thermal-imagery sensors.

The RCMP has been using drones to examine crime scenes, to reconstruct collision scenes, to conduct search and rescue operations, to conduct investigations at international borders, to monitor emergency response teams, and to test and research means to prevent potential harm caused by drones.

According to the PIA, the RCMP only deploys drones for surveillance purposes with prior judicial authorization, except in cases where urgent or exigent circumstances make obtaining a search warrant impracticable.

We provided several recommendations in response to the RCMP’s PIA about the drone program, including:

  • assessing necessity and proportionality, and minimal intrusiveness before deploying a drone in a given situation;
  • clarifying who assesses whether a drone has collected personal information, as well as how that determination is made; and
  • assessing the effectiveness of the program as a whole.

At the time of writing, we are awaiting the RCMP’s response to our recommendations.

Drones are an example of technology that raises important privacy questions and concerns. The RCMP had been using drones for nine years prior to completing a PIA. Given the invasive nature of the program, privacy risks should have been thoroughly assessed before the program was deployed.

Our office stresses the need for federal institutions to assess the privacy risks of initiatives and programs early on. PIAs are most effective when conducted before full implementation of a program, as they help address identified privacy risks before personal information is put at risk.

National DNA Data Bank

The National DNA Data Bank (NDDB) was established in 2000 to collect, store and compare DNA profiles. These are obtained from samples collected at crime scenes, and from individuals convicted of certain serious crimes who are required, under court order, to submit samples. The NDDB is managed by the RCMP.

The DNA Identification Act was amended in 2014 to expand the role of the NDDB and allow the use of DNA profiles to investigate missing persons and unidentified human remains. Five additional indices were created. These indices include: DNA profiles submitted voluntarily from the relatives of missing persons; profiles obtained from material collected from the personal effects of missing persons; profiles developed from human remains; profiles obtained from DNA evidence recovered from crime victims; and DNA profiles obtained from samples given by voluntary donors.

In 2014, we received and reviewed a PIA from the RCMP for the creation of the National Centre for Missing Persons and Unidentified Remains. The Centre provides specialized services for all investigations into missing persons and unidentified remains in Canada, and shares data and analysis nationally.

In 2019, the RCMP submitted an addendum to its initial 2014 PIA. The addendum addresses the privacy risks associated with the coming into force of the amendments to the DNA Identification Act in 2018, including the creation of the new indices.

It also addresses the development of the National Missing Persons DNA Business Model. The Business Model outlines how the DNA indices are managed, including the processes for including, authorizing and accepting biological samples and DNA profiles into the database; how investigators are informed about results; and the management of retention periods and destruction processes.

Our office has consulted with the RCMP and provided advice on the privacy requirements of the data bank. We made one recommendation concerning the various consent forms for the provision of biological samples. We suggested they more clearly indicate that under particular circumstances, DNA profiles may not be removed upon request. The RCMP accepted our recommendation.

We also recommended that agreements with foreign jurisdictions contain privacy protection clauses to ensure receiving jurisdictions protect DNA profiles and other sensitive personal information shared by Canada. In response to this recommendation, the RCMP provided more information about what can be shared with foreign policing agencies.

Along with our ongoing collaborative engagement with the RCMP on the NDDB, our office has also continued its involvement as an ex officio member of the NDDB Advisory Committee since it was formed in 2000.

Entry/Exit initiative

The Entry/Exit initiative was introduced jointly by Canada and the United States in 2011 to facilitate the collection of data about individuals entering and exiting Canada. Our office has engaged with the Canada Border Services Agency (CBSA) over the years on the project, through consultations and reviews of PIAs as the initiative moved through its multiple implementation phases.

At the end of 2018, the passage of Bill C-21, An Act to amend the Customs Act, increased the scope of the Entry/Exit initiative, giving the CBSA the authority to collect biographic information on all travellers exiting Canada by land and air. This includes details such as name, date of birth, sex, citizenship or nationality, travel document information, and travel details.

During 2019-2020, our office recommended the CBSA review the purposes for the collection of information about entries and exits, and set retention periods specific to each purpose. We also recommended that the CBSA provide greater transparency about its activities related to entries and exits.

Other federal government departments are involved in the Entry/Exit initiative, namely Immigration, Refugees and Citizenship Canada (IRCC) and Employment and Social Development Canada (ESDC).

IRCC is involved in the Entry/Exit initiative in two ways: it receives entry and exit data to confirm the presence or absence in Canada of individuals subject to an immigration application or investigation, and it provides length-of-stay data to the CBSA for verification of overstays in Canada.

We recommended IRCC track the results of its objectives and reconsider the use of this information where it is not demonstrably effective in achieving established goals. We also made recommendations concerning the accuracy of personal information, data retention, and de-identification standards.

We also conducted two consultations with ESDC on their planned use of Entry/Exit data to assist in administering social programs. We have since received ESDC’s PIA for the use of Entry/Exit data for enforcement of the Employment Insurance program, and are anticipating a PIA for enforcement of the Old Age Security program. ESDC briefed us on the status of the initiatives, and we discussed high-level privacy considerations such as transparency and the legal requirements that govern the use of personal information for administrative purposes.

At the time of writing this report, we are in the process of reviewing CBSA’s information sharing agreements with IRCC, ESDC, the CRA, and the RCMP. The CBSA is currently finalizing its information sharing agreement with the Canadian Security Intelligence Service (CSIS), and has committed to providing our office with a copy.

Guidance on the Security of Canada Information Disclosure Act

The Security of Canada Information Disclosure Act (SCIDA) was adopted in 2019 with the aim of enabling more timely and effective information sharing across government for national security purposes. SCIDA authorizes institutions to disclose information relevant to national security, including personal information, to a select group of federal government institutions with national security mandates.

SCIDA is the successor to the Security of Canada Information Sharing Act (SCISA). In 2017, our office reviewed how SCISA had been put into practice and found significant procedural deficiencies, including with respect to record keeping and internal controls.

SCIDA specifies that a disclosure is only authorized if it will contribute to the national security responsibilities of the recipient institution. It also requires the disclosing institution to maintain records for every disclosure it makes. Public Safety Canada has a leadership role in developing resources, training and guidance on SCIDA to help officials understand and administer the new legislation.

Our office provided comments to Public Safety on their SCIDA Guide to Responsible Information Sharing. Our comments aimed to ensure institutions receive clear guidance on the threshold for disclosure under the Act, the accuracy and reliability of information, and record keeping practices.

Our recommendations were incorporated into the guide. In addition, Public Safety provided us with an overview of the training it offers to disclosing partners to support the guide.

By working with Public Safety on their SCIDA guidance material, we raised awareness among institutions about their obligations under both the SCIDA and the Privacy Act. We plan to continue our engagement with Public Safety and to discuss further opportunities for us to provide information about privacy obligations.

On a related note, the passage of the National Security Act, 2017 allowed our office to collaborate with the National Security and Intelligence Review Agency (NSIRA), notably in the review of disclosures to determine whether they respect the criteria outlined in the Act. We look forward to conducting a review of these disclosures with NSIRA in 2020-2021. We discuss our work with NSIRA more broadly later in this report, under International and domestic cooperation.

Canadian Framework for Collaborative Police Response on Sexual Violence

The Canadian Framework for Collaborative Police Response on Sexual Violence was developed in 2017 by 14 police services across Ontario. It included contributions from the Office of the Information and Privacy Commissioner of Ontario. The Framework aims to provide police services across Canada with leading practices to address sexual violence, including the review of cases not cleared by charges.

In 2019, the Canadian Association of Chiefs of Police (CACP) approached our office seeking advice and support for the initiative, as the CACP was formally going to endorse the framework nationally. We reviewed the framework and made two recommendations, both of which were accepted by the CACP.

As the framework was originally developed in Ontario, it only referenced Ontario legislation. Therefore, our first recommendation was for the framework to specify that police agencies would need to adapt the framework to their own legislative context, including privacy legislation.

Our second recommendation was in relation to the confidentiality agreements that members of the sexual review committees needed to sign, prior to reviewing cases. We recommended that the CACP add a provision that required members to notify the appropriate agency in the event of a breach of personal information.

Following our review, the Privacy Commissioner of Canada sent a letter of support for the Framework, noting that it allows for the implementation of sexual violence case review programs governed by strong privacy protective measures. He added that it will be a useful reference, not only for establishing new programs but also for refining existing ones, such as those run by the RCMP and the Canadian Armed Forces.

The review of sexual violence cases by external members involves the disclosure of investigative files and highly sensitive personal information. However, when institutions include privacy protective measures in the design of their programs, they reduce the risk of data breaches and help build trust with the individuals involved in those cases.

The framework illustrates that with appropriate safeguards in place, privacy is not a barrier to the disclosure of personal information, which is necessary to achieve the important public goals of review programs.

VidCruiter

VidCruiter is a Canadian company offering video recruitment and other hiring solutions, including applicant filtering and screening, interview scheduling, video interviewing and reference checks.

At the time of writing this report, our office has consulted with the Department of Justice, the Canadian Space Agency, Health Canada and ESDC about the use of this staffing platform. We anticipate more widespread use among federal public institutions, and are aware of interest from the Deputy Ministers Task Force on Public Sector Innovation.

Additionally, given the circumstances surrounding the pandemic, interest in remote interview solutions will clearly increase. Our privacy concerns relate to limiting the collection, use, disclosure and retention of personal information, and to safeguarding it.

Our office made several recommendations regarding the use of VidCruiter. In particular, we advised that government institutions wanting to use this service should complete a privacy impact assessment.

We made these recommendations on the basis that the service involves contracting out to a third party and substantial modifications to existing staffing activity, both of which are conditions that trigger a PIA according to the Treasury Board Directive on Privacy Impact Assessment.

Using the service also creates a new collection of personal information in audio and video format, which is currently not accounted for in the standard personal information bank (PIB) on staffing. Our office has discussed the need to update this PIB with the Treasury Board of Canada Secretariat (TBS).

At the time of writing this report, our office is awaiting responses from the above noted institutions concerning the advice we provided. ESDC and Health Canada indicated they are preparing PIAs on the activity, while the Department of Justice and the Canadian Space Agency said they were establishing a working group to potentially create a multi-institutional PIA.

Our Business Advisory Directorate, meanwhile, was engaged in a consultation with VidCruiter and provided a number of recommendations to the company to ensure better privacy protections. This work is described later in this report, under The Personal Information Protection and Electronic Documents Act: A year in review.

Reciprocal information sharing agreement related to immigration sponsors

Immigration, Refugees and Citizenship Canada (IRCC) and the Ontario Ministry of Children, Community and Social Services (MCCSS) have an information sharing agreement for three different information-sharing activities.

First, the IRCC can verify with the MCCSS whether an individual applying to sponsor a foreign national to Canada is a welfare recipient, as this would render them ineligible to sponsor. Second, the MCCSS can obtain information from IRCC about sponsors who owe a debt to the province related to their sponsorship commitments. Third, the MCCSS is able to obtain information from IRCC on a case-by-case basis regarding the status of an individual who is seeking welfare assistance and whose status with IRCC is unclear.

The system to support this information exchange was not working well, as it relied on outdated technology and was inefficient. The parties developed a pilot project to test the viability of matching identities between their respective databases.

IRCC consulted with our office, and later submitted a draft copy of its information sharing agreement with the MCCSS, which we reviewed. Our recommendations included sharing information through encrypted email instead of through a third-party platform, detailing responsibility for personal information at all stages of the pilot, detailing the process to address privacy breaches, and setting clear retention schedules for personal information. We also recommended that IRCC and the MCCSS assess the privacy risks of the initiative.

In its response, IRCC accepted all our recommendations and committed to conducting a PIA for work on this initiative that would follow the pilot project. Our office was satisfied with the steps taken by IRCC to incorporate our recommendations.

The Personal Information Protection and Electronic Documents Act (PIPEDA): A year in review

PIPEDA was adopted to support trust in what was then called “electronic commerce”. Under PIPEDA, consumers could benefit from legal protection against their personal information being collected, used or disclosed without their consent.

Two decades later, Canadians are fully immersed in the digital economy. With databases growing in size and analytics growing in sophistication, consumers face risks far more severe than those of 2000.

Every year, the number and magnitude of data breaches involving personal information grow. Investigations into high profile breaches are an expanding part of the wide range of activities our office conducts to enforce and promote compliance with PIPEDA.

While data breach investigations represent a significant portion of our work, we also invest in a number of other activities aimed at ensuring compliance with PIPEDA and increasing businesses’ knowledge of their privacy-related obligations.

In addition to our investigations into a wide range of issues, we promote compliance by providing advice and guidance to businesses to help them address privacy issues proactively. We also contribute to enforcing and promoting Canada’s Anti-spam Legislation (CASL) with other federal regulators, and support emerging research and knowledge transfer into consumer privacy issues through our Contributions Program.

This section provides a summary of our activities related to PIPEDA in 2019-2020.

Breaches of security safeguards

With breach reporting obligations under PIPEDA coming into force on November 1, 2018, 2019-2020 marked our office’s first full year of mandatory breach reporting. Previously, organizations reported breaches to our office on a voluntary basis.

An organization subject to PIPEDA must report to our office any breach of security safeguards involving personal information under its control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual. The organization must also notify individuals of such breaches involving their personal information.

To determine whether a breach creates a real risk of significant harm, organizations are to take into account the sensitivity of the personal information involved in the breach, and the probability that the personal information has been, is being, or will be misused.

Organizations must keep and maintain records of every breach of security safeguards involving personal information under their control.

Since reporting became mandatory, we have experienced a significant jump in the number of breach reports that we receive. In 2019-2020, we received 678 breach reports affecting an estimated 30 million Canadian accounts — more than double the number of reports we received during the previous year, and six times the number we received the year before breach reporting became mandatory.

Breach response unit

While mandatory breach reporting is a new requirement under PIPEDA, it has been a policy requirement in the public sector since 2014. Our office’s breach response unit receives breach reports, reviews them and follows up with organizations as appropriate.

The unit performs an initial review of breach reports to ensure the information provided to our office and affected individuals is complete and appropriate.

As we continue our review of a given report, we may informally engage with the organization to ensure they have sufficiently addressed the breach, for instance, by mitigating the risk of further harm to affected individuals, or by taking steps to reduce the risk of future breaches.

As an example, we received a number of breach reports from a retail chain regarding the loss of customers’ electronic devices during transit to off-site technical support in some areas. The devices, along with forms that included customers’ personal information and device passwords, were being transported by courier.

After our engagement, the retailer improved its processes. With a better tracking system, a new process to detect lost items sooner and an assurance that personal information, including passwords, was no longer being sent with the devices, the future risk to consumers has been greatly reduced.

In certain cases, we will launch a formal investigation. These investigations generally aim to evaluate the adequacy of the organization’s safeguard measures and its proposed improvements, making further recommendations where gaps persist.

Of the breach reports we assessed in 2019-2020, 87% appear to have met the reporting threshold.

Our office is seeing a rise in reports of large-scale breaches affecting a great number of individuals. Most notably, breaches at large organizations including Desjardins and Capital One have been reported to our office.

Breach reports received from three industry sectors accounted for 50% of all breach reports we received in 2019-2020, with 19% from the financial sector, 17% from telecommunications, and 14% from sales and retail.

Roughly half of reported breaches involved unauthorized access by malicious actors or insider threats, often as a result of employee snooping or social engineering hacks.

Insider threats can involve malicious actors, such as an employee that misuses customer information. We have also observed instances where there was no malicious intent, including situations where employees make errors when emailing or mailing personal information, or fail to follow an authentication process. The failure of staff to properly verify the identity of an individual has led to serious breaches through unauthorized access to customer accounts.

We also continue to see breaches involving disclosure to family members, theft and loss of devices, malware insertion, attacks on network vulnerabilities, credential stuffing, brute force password attacks, and accidental disclosures (such as including lists of email recipients in the c.c. field of an email instead of the b.c.c. field).

Targeted social engineering campaigns involving phishing and impersonation schemes continue to be a leading cause of breaches reported to our office. These campaigns, which sometimes use personal information leaked from previous breaches in an attempt to gain access to a victim’s accounts for financial gain, continue to be particularly troublesome.

For instance, we received an increased number of breach reports from telecommunications companies regarding unauthorized access to customers’ accounts through SIM swaps, in which malicious actors use social engineering to take over a customer’s phone number and gain access to their phone calls and text messages.

Many online accounts are linked to cell phone numbers, and use text messages to validate a customer’s identity. Therefore, SIM swaps are often used to gain access to a person’s bank accounts, social media, email or any other account linked to a cell phone number.

In its 2007 report on the first statutory review of PIPEDA, the Standing Committee on Access to Information, Privacy and Ethics included a section on breach notification. The report stated: “As more stories of major breaches involving the personal information of large numbers of Canadians are covered in our daily newspapers, concern about this issue is growing.”

More than a decade later, the frequency and the magnitude of data breaches that compromise the personal information of Canadians continue to be on the rise.

Breach records inspection

In 2019-2020, our office conducted a breach record inspection exercise within a specific industry. We examined the breach records of seven Canadian telecommunications companies to assess compliance and to get a better sense of the plans, tools and approaches organizations are using to meet their breach responsibilities.

We chose the telecommunications sector because it was among those that reported the most breaches to our office in 2019. In addition, most Canadians have a consumer relationship with one of the country’s telecommunications companies, as their products and services are used by a majority of the population.

We found that, generally, the industry appears to be taking its obligations seriously. That said, we identified key areas for improvement.

For instance, 40% of the records we reviewed did not include sufficient information for our office to adequately understand the organization’s assessment of whether a breach created a real risk of significant harm. This information should be included in the breach record.

Further, we felt that 20% of the records related to non-reported breaches should likely have been reported to our office.

We also noted that only one company out of the seven had a strategy and processes in place to deal with the retention of breach records.

During our examination, we observed practices that helped companies assess real risk of significant harm. For instance, five out of the seven organizations we examined used a checklist, while others used a matrix or a list of questions.

Operational updates and trends

In 2019-2020, our office closed 318 complaints under PIPEDA, the majority relating to access to personal information (34%) and consent (20%). Overall, we closed 30% more access-related complaints than the previous year, indicating individuals may be more aware of their rights, and more interested in understanding what information private organizations are collecting and retaining about them.

We accepted 13 times more complaints related to the health sector than the previous year, largely due to a significant privacy breach that occurred within that sector involving a medical laboratory services company.

We generally receive a low number of complaints against organizations in this sector, as most are governed by provincial privacy legislation; however, this particular breach affected individuals in several jurisdictions. In the end, the 23 complaints we received fell squarely within the jurisdictions of Ontario and British Columbia. The complaints were therefore discontinued and redirected to the proper authorities in those provinces. Their investigations are ongoing at the time of writing this report.

The financial sector continues to be the leading source of complaints, accounting for 21% of accepted complaints, a figure consistent with the previous year, followed by telecommunications (13%) and the service sector (11%).

We achieved significant success in reducing our backlog of complaints older than 12 months during the year, decreasing our backlog by 50% for both Acts combined during the year, and by almost 20% solely for PIPEDA complaints.

Increased demand from a public concerned about deteriorating privacy rights, along with the complexity of the digital environment and other factors, had contributed to the accumulation of this backlog.

As was the case with complaints under the Privacy Act, we succeeded in reducing the backlog of PIPEDA complaints through additional dedicated resources and a strategy that provides enhanced investigative efficiencies.

On the latter point, we undertook concerted human resource actions to onboard both contractors and employees to bring expertise, redistribute files and help reduce the backlog.

In 2019-2020, 221 complaints under PIPEDA were closed through early resolution, which represents 69% of all PIPEDA complaints. Early resolution and summary investigations are an efficient way to resolve straightforward privacy matters. Complainants typically see an outcome in a few months, compared to a much lengthier formal investigation.

A new online complaint form brought efficiencies to the triaging process by automatically entering complaint information into our case management system. As described in The Privacy Act: A year in review, the new online complaint form provides complainants with relevant information as they progress through it.

Since the implementation of the new online complaint form, we have received far fewer complaints that fall outside of the scope of our office’s mandate or jurisdiction. This has reduced the time it takes to assess complaints and has facilitated the review of files.

We have also increased our use of formal investigation powers with respondents, including issuing summonses for appearance and the provision of testimony under oath. We also conducted several site visits, which represent an important mechanism to conduct and validate forensic technological analyses and to carry out employee interviews.

Monitoring compliance

We have recently expanded our focus on ensuring compliance with our recommendations after we issue a report of findings. Our compliance monitoring unit oversees the implementation of recommendations made during certain investigations to facilitate compliance and assess whether organizations meet their commitments to our office and Canadians.

One investigation file currently in compliance monitoring is Equifax. Our previous annual report summarized our office’s investigation into a massive breach at Equifax that affected 19,000 Canadians and some 143 million people worldwide. At the time, we noted that both Equifax Canada and its United States-based parent company fell far short of their privacy obligations to Canadians.

Following the breach, Equifax Canada entered into a six-year binding compliance agreement with our office to submit third-party security audit reports, improve their accountability and data destruction programs, and increase transparency about their privacy practices. Our office’s monitoring of these deliverables ensures that Equifax implements privacy protective enhancements in a thorough and timely manner.

The compliance agreement was revised in January 2020 to amend certain consent requirements around transborder data flows. Our office will continue to monitor Equifax’s adherence to the agreement over the next several years.

Key investigations

Website featuring patient reviews of doctors highlights PIPEDA’s shortcomings

A dentist found a profile of herself on the RateMDs.com website. This profile had been created without her knowledge or consent. When she asked the company to remove the profile, it denied her request, citing patients’ interest in posting and reading reviews about health professionals.

She then complained to our office, and we launched an investigation into the matter. Our investigation examined issues of consent, accuracy and correction, openness, and appropriate purposes.

Consent

The complainant’s primary concern was that her personal information was being collected, used and disclosed without her consent.

The complainant’s profile on RateMDs contained two types of information: business contact information, such as her name and contact details for her practice; and reviews and ratings posted about her by RateMDs users.

With respect to the complainant’s business contact information, we found that it was publicly available within the meaning of the regulations under the Act. Therefore, RateMDs could collect, use and disclose it without her consent.

As for reviews and ratings, they represent the personal information of both the complainant and the individuals who post them.

PIPEDA would seem to require the consent of both parties to authorize RateMDs to publish these reviews and ratings. However, where the interests of individuals conflict, this will rarely be possible. Federal Court jurisprudence holds that, in such circumstances, PIPEDA requires a balancing of interests.

The complainant has a reasonable expectation of privacy as well as an interest in protecting her reputation and livelihood.

At the same time, users have given their express consent to post their reviews. The website’s stated intention is for users to comment in order for others to benefit and make informed choices about which health professionals they visit. We recognized there is a public interest in the publication of reviews that serve the general population by informing their decisions about whether to engage the services of certain health professionals.

Based on a balancing of the interests of the complainant with those of reviewers and the public more generally, we found this aspect of the complaint not well-founded.

Accuracy and correction

We noted that PIPEDA requires that RateMDs ensure the accuracy of information about health professionals on its website.

The operators of RateMDs should therefore provide the complainant and other health professionals a fair and accessible process to challenge the accuracy and completeness of information published about them. Of course, challenging the accuracy of comments may be extremely difficult in the context of reviews that are sometimes posted anonymously.

At the same time, PIPEDA states that “an organization shall not give an individual access to personal information if doing so would likely reveal personal information about a third party.” Therefore, PIPEDA would generally prohibit disclosing to health professionals the identity of users posting negative reviews about them, even though such disclosure may be necessary to ensure fairness and protect personal reputation. As such, applying the law as enacted can result in an unfair process under which it may be difficult for a health professional to demonstrate that the personal information about them is inaccurate.

The complainant in this case did not raise accuracy issues. Therefore, while we noted that RateMDs did have a process in place for reviewing and removing inaccurate information, we did not opine on whether that mechanism was sufficient.

We remain concerned with challenges related to the potential for posting of inaccurate information on RateMDs’ website. We have strongly encouraged RateMDs to explore alternative mechanisms to ensure and facilitate the accuracy of information.

Openness

RateMDs did not make clear to health professionals that they may request a correction or amendment to information about themselves, if they believe that information is inaccurate, incomplete or out of date.

RateMDs agreed to make changes to its terms of use and FAQs to address the lack of transparency with respect to its policy on removal and correction of reviews. Those changes clarify that personal information will be removed or amended where it is demonstrated to be inaccurate or out of date. They also indicated that a health professional can claim their profile to correct or update information about themselves or to post a response to any review.

Our office has reviewed RateMDs’ updated privacy communications. We are satisfied that these are sufficiently clear to meet openness requirements.

Appropriate purposes

Our office determined that a reasonable person would not consider the collection, use and disclosure of review and rating information of health professionals to be generally inappropriate under the circumstances, noting here that no objection was raised as to the accuracy of the information. The interests of patients posting to the website and the public interest of prospective patients suggest that RateMDs’ purposes are generally appropriate.

That being said, our office was concerned with the paid ratings manager service that allowed health professional subscribers to hide up to three negative reviews from their user profile. The negative reviews reappear if a user unsubscribes from the service.

According to our office’s guidance on inappropriate data practices, there are several “no-go zones” where the purpose for the collection, use and disclosure of personal information would be considered inappropriate by a reasonable person. This includes publishing personal information with the intended purpose of charging individuals for its removal.

In this case, RateMDs has created a platform that allows users to post reviews and ratings of health professionals, including negative ones. Having created the conditions for negative reviews to be posted, RateMDs cannot generate revenue from them by charging for their removal. Requiring health professionals to pay to remove reviews, and then requiring continued monthly payments to maintain their suppression, is a clear example of an inappropriate pay-for-takedown practice.

Between August 2019 and August 2020, RateMDs removed this option from its subscription plans.

Towards a rights-based approach for protecting Canadians in the digital age

This investigation is yet another demonstration of how the existing legislative framework is insufficient to uphold the privacy rights of Canadians in the digital economy.

It highlighted the tension among several interests, including online reputation, free speech, online anonymity, commercial interests, patients’ interests and the public interest. Given that these issues are relevant to many rating websites, as well as to other websites more generally, the rights of individuals to correct inaccurate personal information posted about them anonymously, and the fairness of that process, are clearly underdeveloped.

A rights-based framework that defines privacy in its broadest and truest sense would provide greater clarity for privacy protection in these contexts. In our previous Annual Report, we recommended that our laws be updated to include additional protections against harms that result from infringements of human rights in a digital era. Specifically, we recommended that reformed legislation must incorporate rights that are unique to the digital era, including but not limited to the right to be forgotten, data portability, and algorithmic transparency or explanation. We note that jurisdictions elsewhere have taken action to ensure there is a place in the law for rights specific to our new digital reality.

Global data flows: TD Canada Trust and Loblaw found to comply with current PIPEDA requirements

Protecting privacy in the context of outsourcing was a relevant issue in investigations into TD Canada Trust’s and Loblaw Co. Ltd.’s transfers of customers’ information to service providers outside Canada for processing.

The investigations ultimately concluded TD and Loblaw had met their requirements under the current law, in particular, the requirement under Principle 4.1.3 to use contractual or other means to ensure a comparable level of protection while the information is being processed by a third party. Judged against that standard, the measures adopted by the two organizations were good practices.

However, we remain concerned that the current law may not adequately protect the personal information of Canadians when it travels outside our borders.

According to the Office’s 2009 Guidelines for processing personal data across borders, "comparable level of protection" means that the third party processor must provide protection that “can be compared” to the level of protection the personal information would receive if it had not been transferred. “It does not mean that the protections must be the same across the board but it does mean that they should be generally equivalent.”

In September 2019, following a consultation on transfers of personal information for processing, our office concluded that our guidelines for processing personal data across borders would remain unchanged under the current law, and that we would focus our efforts on how a reformed law can best protect Canadians’ privacy rights when their information is transferred or is the subject of cross-border flows.

TD Canada Trust

In the TD investigation, we found that the bank’s technological controls, coupled with the terms of its contract with the service provider and associated monitoring and enforcement of those contractual requirements, provided a level of protection comparable, as defined in the 2009 guidelines, to that which would have been required under PIPEDA had the information been processed by TD itself, in particular, under Principles 4.4, 4.5 and 4.7, which were most relevant in the context of this case.

The TD investigation also highlighted several good practices, under the current law, for other organizations that transfer personal information to third parties for processing. These included:

  • undertaking risk assessments prior to signing a contract to identify and mitigate potential privacy risks, and incorporating these findings into the contract;
  • requiring the service provider to control its work environment to prevent copying or sharing information about the contracting organization’s customers or employees;
  • using contractual means and robust safeguards to strictly limit the service provider’s access to and use of personal information; and
  • proactively monitoring the service provider’s safeguards and practices to ensure they comply with the contract, including through regular audits by an independent auditor who would monitor any issues to ensure they were addressed.

The bank was not required to obtain separate consent for the transfer, since the third party was using the information for a purpose for which TD had originally collected it (managing fraud claims). Nonetheless, we found that it provided appropriate information to customers about transfers of personal information to the third-party service provider, thereby meeting its openness requirements.

Loblaw

Our investigation of Loblaw also considered how a company handled transfers to processors. This investigation was related to the collection and use of personal information by Loblaw as part of a gift card offer, in the wake of a Competition Bureau of Canada review of allegations that customers had been overcharged for certain packaged bread products.

We found that in the circumstances, Loblaw’s detailed contractual requirements were sufficient to ensure a level of protection comparable to that required under PIPEDA.

We found that Loblaw was sufficiently transparent about its cross-border data transfers in its written communications to program registrants.

The company was not required to obtain additional consent for its transfer of name and address information to the service provider for processing, given that it had already obtained consent to use that information for the purpose for which it was to be used by the processor.

Our investigation into Loblaw also examined the issue of over-collection. We found that at least initially, Loblaw had collected more personal information than was necessary to verify the identity of certain customers.

While a name and address were needed to verify an identity, more sensitive information such as driver’s licence numbers, birthdates and digital photos (the latter being a form of biometric data) were not. During the investigation, Loblaw took steps to limit the information it was collecting. Our office was satisfied with its response.

Global data flows and law reform

While our investigations into TD Canada Trust and Loblaw concluded that both organizations had met their requirements under the current PIPEDA, we have ongoing concerns that PIPEDA in its current form does not adequately address risks to privacy posed by global data flows.

Such flows can create significant benefits for consumers and organizations. However, they can also create inherent risks for privacy, which need to be addressed through robust legal protections.

One example of a potential shortcoming is that the Act’s “comparable level of protection” standard would seem to provide a lower level of protection than the standards found in modern statutes, such as those of Europe and Australia.

Australia’s law requires taking reasonable steps to ensure that an overseas recipient “does not breach” Australian privacy principles or, in the alternative, having a reasonable belief that a foreign law or binding scheme provides “substantially similar” protections. The European Union, for its part, requires “essentially equivalent” protections.

As it stands, clause 4.1.3 in Schedule 1 to PIPEDA does not distinguish between domestic and cross-border transfers for processing. As a result, it sets no additional requirements for cross-border transfers, even though the risks are not the same.

PIPEDA also does not provide specific mechanisms to regulate cross-border flows more generally, either ex ante or ex post, other than largely undefined accountability requirements. Mechanisms exist in other jurisdictions, such as standard contractual clauses, codes of conduct or other binding schemes that are used to help ensure the protection of personal information in the context of cross-border flows.

It is likely that existing standards in PIPEDA need to be enhanced to ensure the personal information of Canadians is appropriately protected when it leaves the country.

As our office develops recommendations for modernizing Canada’s federal private sector law, we will take into consideration legislative solutions and mechanisms that exist globally as well as submissions received under the 2019 consultation.

Dell improves security practices following breaches that disclosed customer information

Two Dell customers complained they received calls from fraudsters who knew certain specific facts about them, including information pertaining to their Dell products.

They alleged that Dell’s security safeguards were insufficient, resulting in a breach of its customers’ personal information, and said they were dissatisfied with how Dell had responded to their complaints about the breach.

At the time of the complaints, Dell used a service provider to offer support for its customers in a call centre in India. (We did not examine the issue of transborder data processing in this investigation as this was not raised as a concern by the complainants.) Through the course of our investigation, we determined that two employees inappropriately disclosed Dell customer data lists on two separate occasions, affecting more than 7,800 Canadian Dell customers. The employees had taken and sold the information to a third party.

Our investigation determined that Dell was unaware of what information was disclosed in one of the instances, but confirmed both complainants were affected by the other breach.

Under PIPEDA, Dell remained responsible for the personal information transferred to the service provider and was obligated to ensure the third party protected that information with appropriate security safeguards.

The personal information transferred to the service provider, consisting of customer names, contact information and details about their computers, was sensitive in this particular context and required a high degree of protection. This type of information is valuable to fraudsters engaged in fraudulent tech support and other scams.

We found that certain technical controls and safeguards related to access controls, logging and monitoring were insufficient given the sensitivity of the personal information at issue.

Also worrisome, we found that Dell failed to adequately investigate the circumstances of the breaches and failed to adequately respond to the complainants’ reports about the fraud calls. In fact, had Dell more thoroughly investigated the circumstances of one of the complainants’ concerns, the second breach could have been prevented.

In light of the results of our investigation and in response to our recommendations, Dell improved its security safeguards along with its complaint handling and breach investigation practices.

Canada’s Anti-spam Legislation

Our office shares responsibility for enforcing Canada’s Anti-spam Legislation (CASL) with the Canadian Radio-television and Telecommunications Commission (CRTC) and the Competition Bureau.

Over the past year, our office updated guidance for businesses related to CASL, including guidance for businesses conducting e-marketing. The guidance informs organizations about our office’s mandate related to electronic address harvesting and e-marketing, and helps organizations comply with PIPEDA as it pertains to e-marketing activities.

We also produced an insert on CASL that was mailed through the CRA, with a potential reach of 477,350 businesses in Canada. The insert provided businesses with information on complying with PIPEDA and CASL. We also shared helpful tips and information related to CASL messaging on social media on a regular basis and during Fraud Prevention Month in March.

Our office’s Information Centre received 90 calls related to CASL from individuals and businesses. Most calls from individuals were related to unsolicited messages and possible scams, including fraudulent emails and phishing attempts. Inquiries from businesses concerned the general application of CASL and the most appropriate form of consent required to send out marketing material.

From an enforcement perspective, three investigations related to our office’s responsibilities under CASL were ongoing at the time of writing this report.

Further reading

Advice to businesses

Our office’s Business Advisory Directorate engages with businesses to assist them in assessing the privacy implications of their current practices – and to help them better understand the privacy implications of new technologies and business models before these are deployed in the marketplace.

Addressing privacy issues upfront avoids time-consuming and costly investigations, helps mitigate future privacy risks, offers organizations a measure of regulatory consistency and predictability in their dealings with our office, and allows Canadians to benefit from innovation.

Our office may proactively offer advisory services; however, businesses subject to PIPEDA can voluntarily request an advisory consultation with our office.

Mila COVI App

In March 2020, Montreal-based artificial intelligence research institute Mila approached our office to seek advice on the COVI contact-tracing app, which they had started to develop in response to the COVID-19 pandemic.

Our office’s Business Advisory Directorate, supported by a panel of internal cross-functional experts, started an advisory consultation on the overall design and functioning of the app.

Following our consultation, we were pleased to note that the developers adopted a number of key privacy principles. These included:

  • using personal information for the narrowly defined and limited purpose of alleviating the public health crisis;
  • limiting the use of the app over time, that is until the pandemic recedes; and
  • only sharing aggregated and de-identified data with government authorities.

VidCruiter

VidCruiter is a Canadian-based company operating out of New Brunswick that provides digital recruitment services to public and private-sector organizations. Its digital recruitment platform allows for virtual interviews to be conducted as part of the hiring process. In late spring 2019, VidCruiter independently approached the OPC to request privacy compliance guidance and advice.

We provided a number of recommendations to improve VidCruiter’s compliance with PIPEDA. Our advice focused on accountability, identifying purposes, consent, limiting use, disclosure and retention, safeguards, openness, and individual access.

Analysts from our office’s Business Advisory Directorate formulated advice in consultation with their counterparts in the Government Advisory Directorate, as many of VidCruiter’s clients are federal departments and agencies. (This work is discussed under The Privacy Act: A year in review.)

Following this advisory consultation, we followed up with VidCruiter to determine how many of our non-binding compliance recommendations had been adopted. We were pleased to learn the company had already adopted 12 of the 13 recommendations, with the final recommendation under consideration.

Outreach and stakeholder relations

Through our office’s outreach and stakeholder relations program, the Business Advisory Directorate participated in a dozen exhibits and events, and 36 stakeholder meetings. For example, we hosted the Privacy Commissioner’s annual forum for Chief Privacy Officers, and exhibited at events such as the Reboot Annual Privacy and Security Conference, Toronto Entrepreneurs 2019, and the Franchise Expo in Regina. Our representatives were invited to speak at a half dozen events, including the Ontario Bar Association Privacy Law Summit, the Cyber Risk Summit and Communitech.

Over the past year, the Business Advisory Directorate started holding privacy clinics, which involve one-time, voluntary privacy compliance conversations relating to the initiatives and practices of Canadian businesses subject to PIPEDA. The directorate is using these conversations to expand the reach of the OPC’s advisory services in a flexible and cost-effective manner, and to provide both on-site and virtual promotional engagement opportunities. The ground work done in the latter half of 2019-2020 to design, develop and test such clinics allowed us to effectively continue our promotional engagements in a fully virtual context after the onset of the pandemic.

These events gave our office a chance to highlight new materials developed to help businesses understand their privacy obligations, such as a new video on meaningful consent and a video series designed for businesses subject to PIPEDA.

We promoted our guidance and tips on social media on an ongoing basis, as well as during special events such as Cybersecurity Awareness Month and Privacy Awareness Week.

Contributions Program

Our office funds independent privacy research and related knowledge translation initiatives through its Contributions Program. The goal of the program is to generate new ideas, approaches and knowledge about privacy, which organizations can use to better safeguard personal information, and which individual Canadians can use to make informed decisions.

Each fall, we issue a call for proposals. Academic institutions and non-profit organizations (including industry associations, consumer and voluntary organizations, trade associations and advocacy organizations) are eligible to receive funding. The annual budget for the program is $500,000.

Of note, the Minister of Justice renewed the terms and conditions of the program in 2019-2020 for a new five-year cycle, which means our office can continue to provide funding for innovative research and public education projects during that period.

We received 27 proposals for the 2019-2020 funding cycle. Our office evaluated these proposals based on merit. Ultimately, 11 projects were selected to receive funding. New projects supported in 2019-2020 examined a variety of topics, such as parenting in the digital age, meaningful consent in the context of connected devices, and consent models for health innovators.

Our call for proposals in 2019-2020 expressed a particular interest in funding one or more design jams focused on developing new, cutting-edge consent solutions that incorporate the OPC’s Guidelines for obtaining meaningful consent. Design jams are collaborative brainstorming events that bring together experts from various backgrounds with the aim of generating solutions to a particular problem or challenge, in a creative environment. We funded three design jams as a result of this year’s call.

Of particular note this year was a project led by Citizen Hacks, a youth-run group that encourages the next generation of innovators to “actively participate in building a digital world that works for everyone.” Citizen Hacks organized a design jam that explored the central question: How can we build a digital future that protects everyone’s privacy?

Participants collaborated in teams to design, create and pitch an answer to this challenge over 36 hours. Through workshops, presentations, and panels taking place during the event, participants had numerous opportunities to learn about specific skills and approaches to creating privacy-oriented technology, and the connection between technology and society, no matter their prior level of experience in computer science. The design jam included a diverse group of participants, and privacy experts in attendance at the event were able to share their expertise.

Advice to Parliament

Every year, our office engages with Parliament through various channels. Parliamentarians primarily seek our input at committee, to advise on legislation that could impact Canadians’ privacy and to seek our expertise for committee studies on privacy-related issues.

In order to provide this support to Parliament, we analyze proposed legislation and legislative amendments and we follow committee studies that are exploring privacy related topics. We are pleased to note that Parliament accepted 68% of our recommendations this past year.

In addition to our work before parliamentary committees, we regularly respond to requests for individual briefings from parliamentarians or for presentations before a party caucus. Similarly, we present on privacy matters before interparliamentary associations. This past year, we presented before the Canadian branch of the Assemblée parlementaire de la Francophonie (CAPF). This work helps us fulfill our mandate of protecting and promoting privacy rights in Canada.

Below is a summary of some of the advice our office provided to Parliament in 2019-2020.

Cybersecurity in the financial sector

In April 2019, we appeared before the Standing Committee on Public Safety and National Security (SECU) in the context of the committee’s study on cybersecurity in the financial sector as a national economic security issue.

We reiterated concerns we expressed at the Standing Senate Committee on Banking, Trade and Commerce on its study of Open Banking, namely, that the financial sector must be built upon a foundation that includes respect for privacy and other fundamental rights.

We called for banks and other financial institutions to have robust standards for both cybersecurity and privacy, and explained to SECU members how mandatory breach reporting requirements can be a tool to enable institutions to confront the adequacy of cybersecurity plans and preparations – or lack thereof.

In its final report, SECU emphasized our call for fundamental changes to Canada’s privacy laws, including order-making powers along the lines of those held by our office’s counterpart in the United Kingdom, as well as a rights-based approach.

In conclusion, SECU proposed that in order to be competitive in the digital economy, Canada should “maximiz[e] its ability to enforce the privacy rights and security of its citizens in this domain.” We are encouraged that the Committee supports our call for much-needed reform of our privacy laws.

Children and the no-fly list

In May 2019, we appeared before the Senate Standing Committee on Human Rights (RIDR) as part of their ongoing study to examine and monitor issues relating to human rights and to review the machinery of government dealing with Canada’s international and national human rights obligations.

Our appearance focused on the Passenger Protect Program. The Committee’s particular interest was in the difficulties that families face when their children cannot check in for a flight because someone else with the same name is considered a security threat.

Our statement emphasized that while we recognize the importance of assessing individuals arriving in Canada for potential national security threats, such activities must be conducted without infringing the rights of the travelling public.

Our office has been seized with the Passenger Protect Program since its inception. Over the years, we have offered policy-level recommendations to Transport Canada and to Public Safety Canada on safeguarding information, providing notice to individuals in a privacy sensitive manner, and confirming there is effective recourse for those improperly impacted by the Program.

We note that the passage of Bill C-59, An Act respecting national security matters, put in place a redress system for those falsely identified as risks to travel. That system has been designed to assist in reducing the negative impacts for those falsely identified as being on the list, such as being subject to additional secondary questioning, and the risk of reputational harm, among other problems.

We have received a PIA on the redress program, which is being reviewed at the time of writing this report. We will provide recommendations to support implementing the program in a manner that safeguards the privacy rights of Canadians.

International and domestic cooperation

As the world becomes increasingly digitized, the challenge of protecting privacy as personal information flows across borders remains a common goal of many data protection authorities. Our office has assumed a leadership role in this area, and has long been cooperating with its domestic and international counterparts to leverage resources, share best practices, and more effectively enforce privacy laws, in Canada and abroad.

Working collaboratively with other regulators ultimately helps to better protect Canadians. Stronger privacy rights in other parts of the world, and partnerships with international privacy enforcement authorities, help ensure Canadians’ personal information remains protected when it is sent outside of Canada’s borders for processing.

We achieve this by taking part in working groups, adopting resolutions and issuing joint statements with our counterparts. We collaborate on these projects domestically with the Information Commissioner of Canada and our provincial and territorial counterparts, and internationally with fellow members of bodies such as the Global Privacy Assembly, the Global Privacy Enforcement Network, the Association francophone des autorités de protection des données personnelles and the Organization for Economic Co-operation and Development (OECD), among others.

Our cooperative efforts have expanded to encompass, with increasing frequency, joint investigations and evidence sharing with other privacy enforcement authorities in relation to companies that deal in personal information across our jurisdictions. Most of the tools and services that underpin digital life today are provided by organizations scattered across the globe, from multinational tech giants to small and medium enterprises; this calls for a coordinated response by data protection authorities.

Today, governments are increasingly making use of technology and datasets provided by the private sector, which requires us to work more closely with domestic partners on public-sector issues.

This section outlines some activities our office conducted in collaboration with other regulatory bodies in 2019-2020.

Joint work with the National Security and Intelligence Review Agency

In previous years, our office has reported extensively on developments related to Bill C-59, An Act respecting national security matters. The passage of Bill C-59 in June 2019 brought about significant changes to the review structure of Canada’s federal security and intelligence community, which includes CSIS, the Communications Security Establishment Canada (CSEC) and the RCMP.

Notably, the Act authorized our office and the National Security and Intelligence Review Agency (NSIRA) to coordinate activities and to share information.

NSIRA was created to increase accountability and transparency in matters of national security, notably through a comprehensive review of national security and intelligence activities across the Government of Canada.

Having NSIRA and our office working collaboratively to avoid duplication of efforts will provide for more effective reviews of information handling practices in the national security sphere. Bringing together NSIRA’s expertise on national security and our expertise on data protection and privacy will result in stronger oversight of intelligence and national security activities, and better reports to Parliament and to Canadians. This is especially important given the role of privacy as a precondition to the exercise of other human rights.

Over the past year, our office has been engaging with NSIRA to discuss how best to coordinate our work. We are currently developing a memorandum of understanding, which will set the parameters of our joint activities and provide transparency to parliamentarians, national security agencies and the general public.

Our discussions have been productive and we have identified opportunities for collaboration between our offices. We are planning a joint review in 2020-2021 of disclosures made by federal government departments under the Security of Canada Information Disclosure Act (SCIDA), which we discussed earlier in this report under The Privacy Act: A year in review.

Privacy breaches that present national security concerns are another potential area of focus, particularly those involving cyber incidents targeting federal government systems. As highlighted previously in this report, public-sector breaches reported to our office are rarely attributed to cyber threats, which appears at odds with the breach reports we receive from the private sector. We believe our work with NSIRA will give us greater insight into the impact of cyber attacks on privacy breaches in the public sector.

Federal, provincial and territorial resolution on law reform

Our office works collaboratively with its provincial and territorial counterparts on common public education and policy matters, in the public and private sector. As we are all united in the effort to protect and promote privacy rights, we occasionally issue joint resolutions to highlight consensus on matters of public policy, and outline shared concerns or support on certain issues of concern to Canadians. This alignment among information and privacy commissioners provides a benefit to Canadians by calling for action that will encourage consistent privacy protections for individuals across the country.

At the annual meeting of the federal, provincial and territorial information and privacy commissioners in Charlottetown, Prince Edward Island, in October 2019, commissioners issued a joint resolution calling on governments to modernize access and privacy legislation to better protect Canadians. This resolution is discussed earlier in this report, under Privacy in a pandemic.

Global Privacy Assembly resolution on privacy as a human right

During the International Conference of Data Protection Commissioners held in Tirana, Albania, in October 2019, members agreed to change the Conference’s name to Global Privacy Assembly (GPA). Members felt the new name better reflects their year-round commitment to mutual support and knowledge sharing, and fosters stronger cooperation.

During the 2019 conference, members adopted the Resolution on Privacy as a Fundamental Human Right and Precondition for Exercising other Fundamental Rights, which was sponsored and drafted by our office. It is discussed earlier in this report, under Privacy in a pandemic.

Global Privacy Assembly Policy Strategy Working Group

Building on our role as sponsor of the GPA’s 2019 Resolution on Privacy as a Fundamental Human Right and Precondition for Exercising other Fundamental Rights, our office is chairing a work stream of the GPA’s Policy Strategy Working Group that studies the relationship between privacy and other rights and freedoms, including human rights and democracy.

At both the national and international levels, privacy authorities have been compiling information from a global pool of resources, including jurisprudence, legislation and academic research on privacy and other rights and freedoms.

Our goal is to lead the development of an informed narrative that speaks to privacy and human rights, as well as privacy and democratic or political rights. The aim is to encourage global progress in the recognition of privacy as a fundamental human right and to help GPA members promote the calls for action articulated in the Resolution on Privacy as a Fundamental Human Right and Precondition for Exercising other Fundamental Rights.

Privacy enforcement with provincial and territorial counterparts

The Domestic Enforcement Collaboration Forum (DECF) promotes and facilitates collaborative efforts between the Office of the Information and Privacy Commissioner of Alberta, the Office of the Information and Privacy Commissioner of British Columbia, the Commission d’accès à l’information du Québec and our office, which also acts as chair.

Over the past year, the DECF identified new complaints or incidents of potential interest for collaboration, and provided updates and strategic advice on ongoing joint investigations. The DECF also discussed other enforcement matters of interest to participants and shared general trends, investigation techniques and best practices with respect to complaints and investigations.

Furthermore, the DECF served as a forum to discuss enforcement strategies in the public sector, for instance with respect to access complaints, and to highlight information about key public sector findings.

In the past year, we have entered into more joint investigations with other authorities in Canada than ever before.

For instance, in July 2019, we announced we were investigating a privacy breach at Desjardins along with our counterparts at the Commission d’accès à l’information du Québec.

In February 2020, we launched a joint investigation into the alleged use of facial recognition technology by Clearview AI with the provincial commissioners of Alberta, British Columbia and Quebec, our first joint investigation with all three of the provinces that have private-sector privacy legislation.

In November 2019, the Information and Privacy Commissioner for British Columbia, Michael McEvoy, and Commissioner Therrien released their findings in the matter of AggregateIQ’s use of personal information in providing consulting services to political campaigns in Canada, the United States and the United Kingdom.

During the past year, we also continued an investigation into Cadillac Fairview Corporation Limited, in which we are looking into the alleged use of facial recognition in conjunction with cameras installed in mall directories. We are investigating this matter jointly with Alberta and British Columbia, and sharing information with Quebec.

Privacy enforcement with international counterparts

In 2019, the Global Privacy Enforcement Network (GPEN) continued to foster increased communication and collaboration among members and partner networks. GPEN intensified its collaboration with the International Consumer Protection and Enforcement Network (ICPEN), addressing the need for greater collaboration on matters where there is an intersection between the regulatory spheres of privacy and consumer protection. Most notably, GPEN endorsed an ICPEN letter to app marketplaces calling for improved privacy transparency, representing the first global cross-regulatory collaborative effort of its kind.

In the fall, our office was among 16 data protection authorities who participated in the 2019 GPEN Sweep. This year, the Sweep examined how prepared organizations are to manage and respond to data breaches.

Our office is currently co-chair of the GPA’s new permanent International Enforcement Cooperation Working Group (IEWG). Members of the IEWG are working to advance enforcement cooperation across jurisdictions by establishing practical tools, solutions and approaches that support enforcement cooperation generally and in active investigations, and by engaging with each other on privacy enforcement cases of global interest.

As part of collaborative efforts inspired by this group, our office and five of its international counterparts published an open letter to video teleconferencing companies reminding them of their obligations to comply with the law and handle people’s personal information responsibly. This open letter was directly related to privacy risks heightened during the COVID-19 pandemic.

In partnership with the Office of the Australian Information Commissioner, our office also co-chairs the Digital Citizen and Consumer Working Group, a 13-member international working group studying the intersections between privacy or data protection, consumer protection and antitrust law, with a view to promoting cross-regulatory cooperation between these spheres.

It should be noted that under the current legal framework, we may collaborate with international partners in circumstances where we would not be able to do so with Canadian regulators. For instance, when the Competition Bureau recently examined Facebook’s claims about the privacy of Canadians, our office was conducting its own investigation into Facebook and the privacy concerns flowing from the Cambridge Analytica revelations. Even though we were looking into similar issues, our organizations have limited ability to collaborate outside matters related to Canada’s anti-spam legislation (CASL). We have called for legislative amendments to rectify this situation, and enhance the collective protection of Canadians in the process.

Privacy cases in the courts

Privacy Commissioner of Canada v Facebook, Inc. (T-190-20) (Federal Court) (Facebook 1)
Facebook, Inc. v Privacy Commissioner of Canada (T-473-20) (Federal Court) (Facebook 2)

Facebook 1 is an application brought by the Privacy Commissioner of Canada under paragraph 15(a) of PIPEDA for an order following the issuance of a report of findings dated April 25, 2019, regarding a complaint concerning the personal information handling practices of the respondent, Facebook, Inc.

The application follows a joint investigation last year by the Privacy Commissioner of Canada and the Office of the Information and Privacy Commissioner for British Columbia that found major shortcomings in the social media giant’s privacy practices.

Facebook disputed the findings of the investigation and refused to implement recommendations to address the deficiencies identified.

The Office of the Privacy Commissioner filed its notice of application with the Federal Court on February 6, 2020.

The application is seeking:

  • a declaration that Facebook contravened PIPEDA;
  • an order requiring Facebook to implement effective, specific and easily accessible measures to obtain, and ensure it maintains, meaningful consent from all users;
  • an order requiring Facebook to specify the technical revisions, modifications and amendments to be made to its practices to achieve compliance with PIPEDA;
  • an order that the parties follow up with the Court, as well as an order that the Court retain jurisdiction for the purposes of ongoing monitoring and enforcement;
  • an order prohibiting Facebook from further collecting, using and disclosing any personal information of users in any manner that contravenes PIPEDA; and
  • an order requiring Facebook to publish a public notice of any action taken or proposed to be taken to correct its practices that contravene PIPEDA.

The Federal Court has, among other powers, the authority to impose binding orders requiring an organization to correct or change its practices and comply with the law.

On March 6, 2020, our office served on Facebook our affidavit evidence in support of the application. Facebook has since brought a motion to strike portions of our affidavit.

On April 15, 2020, Facebook also brought an application for judicial review of our Report of Findings under s. 18.1 of the Federal Courts Act (Facebook 2). Facebook seeks judicial review of our decisions to investigate and to continue the investigation, as well as of the investigation process, and asks the Court to quash the resulting report of findings.

In response, our office brought a motion to strike Facebook’s application for judicial review on the basis that Facebook is out of time to bring such a challenge and has an adequate alternative remedy in its legal right to respond to our office’s ongoing application under section 15 of PIPEDA (Facebook 1).

Both of these applications are being specially managed by a Case Management Judge of the Federal Court. The two motions are moving forward on parallel tracks and a joint hearing of both of these interlocutory proceedings is anticipated in late 2020 or early 2021, at the earliest. The main applications will not move ahead on their merits until those motions are determined.

Google Reference (T-1779-18) (Federal Court)

This is an application by the Privacy Commissioner of Canada pursuant to section 18.3 of the Federal Courts Act referring two questions for hearing and determination. These questions are as follows:

  • Does Google LLC in the operation of its search engine service, collect, use or disclose personal information in the course of commercial activities within the meaning of paragraph 4(1)(a) of PIPEDA when it indexes web pages and presents search results in response to searches of an individual's name?
  • Is the operation of Google's search engine service excluded from the application of Part I of PIPEDA by virtue of paragraph 4(2)(c) of PIPEDA because it involves the collection, use or disclosure of personal information for journalistic, artistic or literary purposes and for no other purpose?

The questions arose in the context of a complaint from an individual alleging that Google is contravening PIPEDA by continuing to prominently display links to online news articles concerning him in search results when his name is searched using Google's search engine service. The complainant requested that Google remove the articles in question from results for searches of his name.

In its initial response to the complaint, Google took the position, in part, that PIPEDA does not apply to it in the circumstances. In order to resolve, as a first step, this jurisdictional issue, the Privacy Commissioner referred the above two questions regarding whether PIPEDA applies to Google's operation of its search engine to the Federal Court for determination before continuing with the investigation.

Shortly after the reference was filed, Google brought a motion seeking to have the reference expanded to deal with the issue of whether a potential requirement to remove links from its search results would violate section 2(b) of the Canadian Charter of Rights and Freedoms, or, alternatively, to have the reference struck. On April 16, 2019, the Federal Court dismissed Google’s motion. Google appealed this decision. The appeal was heard on June 26, 2019, and dismissed on July 22, 2019.

The Canadian Broadcasting Corporation (CBC) and a coalition of various other media organizations (Media Coalition) brought motions to be added as parties to the reference or, alternatively, to be given leave to intervene. These motions were dismissed on March 1, 2019, with leave to file another application for leave to intervene following the determination of Google’s motion on scope. The CBC appealed the decision. The appeal was heard on June 26, 2019, and dismissed on July 22, 2019.

Subsequently, both Google and the Complainant filed motions seeking leave to file additional evidence in the reference.

In a decision issued July 24, 2020, the Federal Court granted the Complainant’s motion and Google’s motion in part. The Court also granted leave for the CBC and the Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC) to participate as interveners in the proceeding.

At present, the remaining steps are for the parties and interveners to file their written arguments and for a hearing to be held.

Our office has indicated that it will not finalize its Draft Position Paper on Online Reputation until the conclusion of the reference proceeding.

Canadian Coalition for Genetic Fairness v. Attorney General of Quebec et al (SCC 38478) (Supreme Court of Canada)

This case concerned a reference by the Government of Quebec concerning the constitutionality of the Genetic Non-Discrimination Act, S.C. 2017, c. 3 (GNDA), which prohibits certain harmful practices relating to the collection, use and disclosure of genetic test results.

In particular, the GNDA creates stand-alone prohibitions relating to forced genetic testing and the collection, use and disclosure of genetic test results without consent (sections 1 to 7). It also amended the Canada Labour Code (section 8) and the Canadian Human Rights Act (sections 9 and 10) to protect federally regulated employees in relation to genetic testing and to protect against discrimination based on genetic characteristics.

Shortly after its passage, the Government of Quebec referred the constitutionality of sections 1 to 7 of the GNDA (but not the amendments to the Canada Labour Code or to the Canadian Human Rights Act) to the Quebec Court of Appeal. The reference asked whether sections 1 to 7 of the GNDA exceed Parliament’s authority to make laws in relation to criminal matters under the Constitution Act, 1867.

The provisions of the GNDA that were at issue prohibit the following:

  • Requiring an individual to undergo a genetic test as a condition of providing goods/services or of entering into/maintaining a contract or any of its terms, or refusing to engage in such activities because of a refusal to undergo such testing (section 3).
  • Requiring an individual to disclose the results of a genetic test as a condition of engaging in one of the activities listed above, or refusing to engage in the activities because of the refusal to disclose these results (section 4).
  • The collection, use or disclosure of the results of a genetic test without the written consent of the individual concerned by any person engaged in providing goods or services, or entering into or maintaining contracts with individuals (section 5).

Section 6 exempts health care practitioners and researchers from the application of sections 3 to 5. Section 7 makes it an offence to contravene sections 3 to 5, with the potential for fines and prison time.

The Quebec Court of Appeal found that the provisions at issue were ultra vires Parliament’s power to enact laws in relation to criminal matters. However, on appeal to the Supreme Court of Canada, a majority of the justices (5 to 4) found that sections 1 to 7 of the GNDA were validly enacted under the federal criminal law power. Karakatsanis J. and Moldaver J. each wrote concurring reasons for the majority.

For Karakatsanis J., the pith and substance of the provisions was to ensure individuals’ control over the personal information disclosed by genetic tests in the areas of contracting and the provision of goods and services, in order to address fears that individuals’ genetic test results would be used against them and to prevent discrimination based on that information. This was a valid criminal law purpose because it protected against a risk of harm to autonomy, privacy and equality, as well as to health.

Karakatsanis J. noted in particular that the criminal law power can be used to safeguard autonomy and privacy. In this case, Karakatsanis J. found that forced genetic testing poses a clear threat to autonomy and to an individual’s privacy interest in not finding out their genetic makeup. Forced disclosure of genetic test results and the collection, use or disclosure of genetic test results without written consent threaten autonomy and privacy by compromising an individual’s control over access to their genetic information.

Moldaver J. viewed sections 1 to 7 of the GNDA as being about protecting health “by prohibiting conduct that undermines individuals’ control over the intimate information revealed by genetic testing” and thereby mitigating fears that genetic tests could be used against individuals. In his view, because the law was aimed at protecting against a real threat to health – fear preventing individuals from taking a potentially life-saving genetic test – it had a valid criminal law purpose.

Appendix 1: Definitions

Complaint Types

Access
The institution/organization is alleged to have denied one or more individuals access to their personal information as requested through a formal access request.
Accountability
Under PIPEDA, an organization has failed to exercise responsibility for personal information in its possession or custody, or has failed to identify an individual responsible for overseeing its compliance with the Act.
Accuracy
The institution/organization is alleged to have failed to take all reasonable steps to ensure that personal information that is used is accurate, up-to-date and complete.
Challenging compliance
Under PIPEDA, an organization has failed to put procedures or policies in place that allow an individual to challenge its compliance with the Act, or has failed to follow its own procedures and policies.
Collection
The institution/organization is alleged to have collected personal information that is not necessary, or has collected it by unfair or unlawful means.
Consent
Under PIPEDA, an organization has collected, used or disclosed personal information without valid consent, or has made the provisions of a good or service conditional on individuals consenting to an unreasonable collection, use, or disclosure.
Correction/notation (access)
The institution/organization is alleged to have failed to correct personal information or has not placed a notation on the file in the instances where it disagrees with the requested correction.
Correction/notation (time limit)
Under the Privacy Act, the institution is alleged to have failed to correct personal information or has not placed a notation on the file within 30 days of receipt of a request for correction.
Extension notice
Under the Privacy Act, the institution is alleged to have not provided an appropriate rationale for an extension of the time limit, applied for the extension after the initial 30 days had passed, or applied a due date more than 60 days from the date of receipt.
Fee
The institution/organization is alleged to have inappropriately requested fees in an access to personal information request.
Identifying purposes
Under PIPEDA, an organization has failed to identify the purposes for which personal information is collected at or before the time the information is collected.
Index
Info Source (a federal government directory that describes each institution and the information banks – groups of files on the same subject – held by that particular institution) is alleged to not adequately describe the personal information holdings of an institution.
Language
In a request under the Privacy Act, personal information is alleged to have not been provided in the official language of choice.
Openness
Under PIPEDA, an organization has failed to make readily available to individuals specific information about its policies and practices relating to the management of personal information.
Retention (and disposal)
The institution/organization is alleged to have failed to keep personal information in accordance with the relevant retention period: either destroyed too soon or kept too long.
Safeguards
Under PIPEDA, an organization has failed to protect personal information with appropriate security safeguards.
Time limits
Under the Privacy Act, the institution is alleged to have not responded within the statutory limits.
Use and disclosure
The institution/organization is alleged to have used or disclosed personal information without the consent of the individual or outside permissible uses and disclosures allowed in legislation.

Dispositions

Well-founded
The institution or organization contravened a provision of the Privacy Act or PIPEDA.
Well-founded and resolved
The institution or organization contravened a provision of the Privacy Act or PIPEDA but has since taken corrective measures to resolve the issue to the satisfaction of the OPC.
Well-founded and conditionally resolved
The institution or organization contravened a provision of the Privacy Act or PIPEDA. The institution or organization committed to implementing satisfactory corrective actions as agreed to by the OPC.
Not well-founded
There was no or insufficient evidence to conclude the institution/organization contravened the privacy legislation.
Resolved
Under the Privacy Act, the investigation revealed that the complaint is essentially a result of a miscommunication, misunderstanding, etc., between parties; and/or the institution agreed to take measures to rectify the problem to the satisfaction of the OPC.
Settled
The OPC helped negotiate a solution that satisfied all parties during the course of the investigation, and did not issue a finding.
Discontinued

Under the Privacy Act: The investigation was terminated before all the allegations were fully investigated. A case may be discontinued for various reasons, but not at the OPC’s behest. For example, the complainant may no longer be interested in pursuing the matter or cannot be located to provide additional information critical to reaching a conclusion.

Under PIPEDA: The investigation was discontinued without issuing a finding. An investigation may be discontinued at the Commissioner’s discretion for the reasons set out in subsection 12.2(1) of PIPEDA.

No jurisdiction
It was determined that federal privacy legislation did not apply to the institution/organization, or to the complaint’s subject matter. As a result, no report is issued.
Early resolution (ER)
Applied to situations in which the issue is resolved to the satisfaction of the complainant early in the investigation process and the Office did not issue a finding.
Declined to investigate
Under PIPEDA, the Commissioner declined to commence an investigation in respect of a complaint because the Commissioner was of the view that:
  • the complainant ought first to exhaust grievance or review procedures otherwise reasonably available;
  • the complaint could be more appropriately dealt with by means of another procedure provided for under the laws of Canada or of a province; or,
  • the complaint was not filed within a reasonable period after the day on which the subject matter of the complaint arose, as set out in subsection 12(1) of PIPEDA.
Withdrawn
Under PIPEDA, the complainant voluntarily withdrew the complaint or could no longer be practicably reached. The Commissioner does not issue a report.

Appendix 2: Statistical tables

Statistical tables related to the Privacy Act

Table 1

Privacy Act dispositions* of access and privacy complaints** by institution
Respondent | Discontinued | Early resolution | No jurisdiction | Not well-founded | Resolved | Settled | Well-founded | Well-founded and conditionally resolved | Well-founded and resolved | Total
Administrative Tribunals Support Service of Canada   2               2
Atomic Energy of Canada Limited 2                 2
Bank of Canada   2   1         1 4
Canada Border Services Agency 1 27   15     2 7 10 62
Canada Council for the Arts         1         1
Canada Post Corporation   2 1 3           6
Canada Revenue Agency 8 28   19   1 1   5 62
Canada School of Public Service   1               1
Canadian Broadcasting Corporation   2               2
Canadian Food Inspection Agency   1             1 2
Canadian Institutes of Health Research   1   1           2
Canadian Northern Economic Development Agency   1               1
Canadian Radio-Television and Telecommunications Commission   1   1           2
Canadian Security Intelligence Service 1 6   7 1   1     16
Canadian Space Agency   1               1
Citizenship and Immigration Canada   1               1
Communications Security Establishment Canada   2               2
Correctional Service Canada 9 31   19 2   3   7 71
Courts Administration Tribunal 1                 1
Crown-Indigenous Relations and Northern Affairs Canada   2   1     1     4
Department of Justice Canada   1   5           6
Employment and Social Development Canada 6 13   3         2 24
Environment and Climate Change Canada   2   1           3
Federal Public Service Labour Relations and Employment Board 2                 2
Fisheries and Oceans Canada   1             1 2
Global Affairs Canada   3   4     1   2 10
Health Canada   1   1     1     3
Immigration and Refugee Board of Canada   3   1           4
Immigration, Refugees and Citizenship Canada   24   4           28
Indigenous Services Canada   1               1
Innovation, Science and Economic Development Canada   4               4
Library and Archives Canada   2   1         1 4
Military Police Complaints Commission       2           2
National Defence 1 24   6 2 2 6   4 45
National Research Council Canada   1               1
Natural Resources Canada 6 1               7
Natural Sciences and Engineering Research Council of Canada       5           5
Non-Public Property and Staff of the Non-Public Funds, Canadian Forces   1               1
Office of the Commissioner of Official Languages                 1 1
Office of the Correctional Investigator           1       1
Office of the Public Sector Integrity Commissioner of Canada 1           1   1 3
Parks Canada Agency   1   1           2
Parole Board of Canada   2   2           4
Privy Council Office       1           1
Public Health Agency of Canada 1                 1
Public Prosecution Service of Canada                 3 3
Public Safety Canada       1           1
Public Service Commission of Canada 2 1         2     5
Public Services and Procurement Canada 1 15   1 1       2 20
Royal Canadian Mounted Police   56   30 2   12 2 10 112
Royal Canadian Mounted Police External Review Committee       1           1
Security Intelligence Review Committee                 1 1
Shared Services Canada   2               2
Social Sciences and Humanities Research Council of Canada   1   3           4
Statistics Canada   8   13         2 23
Transport Canada 1 2   1           4
Treasury Board of Canada Secretariat 1               1 2
Veterans Affairs Canada   4   5 1 1     1 12
Women and Gender Equality Canada (formerly Status of Women Canada)       1         1 2
Total 44 285 1 160 10 5 31 9 57 602
* Privacy Act dispositions combining old and new counting methodology.
** Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 144.

Table 2

Privacy Act treatment times – Early resolution cases by complaint type
Complaint type | Count | Average treatment time (months)
Access 158 3.7
  Access 153 4.8
  Correction/notation 5 2.5
  Language – –
Privacy 127 3.5
  Accuracy 3 2.5
  Collection 33 4.2
  Retention and disposal 2 2.5
  Use and disclosure 89 4.7
Time limits 53 0.2
  Correction/notation – –
  Time limits 53 0.2
Total 338 3.9

Table 3

Privacy Act treatment times* – All other investigations** by complaint type***
Complaint type | Count | Average treatment time (months)
Access 170 18.1
  Access 157 18.5
  Correction/notation 13 18.9
  Language – –
Privacy 147 24.6
  Accuracy – –
  Collection 49 24.3
  Retention and disposal 4 11.8
  Use and disclosure 94 26.0
Time limits 680 7.5
  Correction/notation 2 8.9
  Extension notice 14 9.3
  Time limits 664 8.2
Total 997 12.4
* These results are impacted by the reduction of backlogged files, several of which were older than 12 months, thus increasing the overall average treatment times.
** Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 144.
*** Privacy Act dispositions combining old and new counting methodology.

Table 4

Privacy Act treatment times* – All closed** files by disposition***
Disposition | Number | Average treatment time (months)
Early resolved 338 3.9
All other investigations 997 12.4
  Discontinued 59 34.7
  Declined to investigate 1 0.0
  No jurisdiction 1 26.0
  Not well-founded 168 14.2
  Resolved 12 14.3
  Settled 5 35.0
  Well-founded 177 12.3
  Well-founded and conditionally resolved 200 7.3
  Well-founded and resolved 374 10.7
Total 1,335 10.3
* These results are impacted by the reduction of backlogged files, several of which were older than 12 months, thus increasing the overall average treatment times.
** Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 144.
*** Privacy Act dispositions combining old and new counting methodology.

Table 5

Privacy Act breaches by institution
Respondent Count
Canada Post Corporation 1
Canada Revenue Agency 6
Canada School of Public Service 1
Canadian Human Rights Commission 1
Canadian Institutes of Health Research 1
Canadian Security Intelligence Service 1
Canadian Transportation Agency 1
Correctional Service Canada 57
Department of Justice Canada 2
Employment and Social Development Canada 211
Environment and Climate Change Canada 3
Fisheries and Oceans Canada 1
Global Affairs Canada 2
Health Canada 1
Immigration, Refugees and Citizenship Canada 8
Innovation, Science and Economic Development Canada 1
Military Grievances External Review Committee 1
National Defence 1
National Film Board of Canada 1
Non-Public Property and Staff of the Non-Public Funds, Canadian Forces 1
Office of the Correctional Investigator 2
Parks Canada Agency 1
Prince Rupert Port Authority 1
Public Health Agency of Canada 1
Public Prosecution Service of Canada 1
Public Safety Canada 1
Public Sector Pension Investment Board 1
Public Service Commission of Canada 13
Public Services and Procurement Canada 6
Royal Canadian Mounted Police 4
Shared Services Canada 4
Social Sciences and Humanities Research Council of Canada 1
Statistics Canada 2
Veterans Affairs Canada 1
Total 341

Table 6

Privacy Act complaints* and breaches
Category Total
Accepted
Privacy 199
Access 216
Time Limits 346
Total accepted* 761
Closed through early resolution
Access 158
Privacy 127
Time Limits 53
Total 338
Closed through all other investigations**
Access 170
Privacy 147
Time Limits 680
Total 997
Total closed*** 1,335
Breaches received
Accidental disclosure 123
Loss 168
Theft 18
Unauthorized access 32
Total received 341
* Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 3.
** Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 144.
*** Privacy Act dispositions combining old and new counting methodology.

Table 7

Privacy Act complaints accepted by complaint type
Complaint type | Early resolution (Number, Percentage*) | Investigation (Number, Percentage*) | Total number | Total percentage*
Privacy
  Accuracy 2 1% – 0% 2 0%
  Collection 21 6% 11 3% 32 4%
  Retention and disposal 1 0% 5 1% 6 1%
  Use and disclosure 119 34% 40 10% 159 21%
Access
  Access 150 43% 64 15% 214 28%
  Correction/notation 1 0% 1 0% 2 0%
  Language – 0% – 0% – 0%
Time limits
  Correction/notation – 0% – 0% 0 0%
  Extension notice – 0% 6 1% 6 1%
  Time limits 51 15% 289 69% 340 45%
Total 345 100% 416 100% 761 100%
* Figures may not sum to total due to rounding.

Table 8

Privacy Act top 10 institutions by complaints accepted
Respondent Privacy (early resolution, investigation) Access (early resolution, investigation) Time limits (early resolution, investigation) Total
Canada Border Services Agency 7 3 16 6 4 6 42
Canada Revenue Agency 7 5 15 6 4 26 63
Canadian Security Intelligence Service     6 8 1   15
Correctional Service Canada 10 2 23 5 2 113 155
Employment and Social Development Canada 4 4 7 6 1 3 25
Global Affairs Canada 1 3 3 1 4 7 19
Immigration, Refugees and Citizenship Canada 9 2 8 1 16 8 44
National Defence 2 4 6 1 3 17 33
Public Services and Procurement Canada 59 1 6 3   1 70
Royal Canadian Mounted Police 16 16 34 10 6 94 176
Total 115 40 124 47 41 275 642

Table 9

Privacy Act complaints accepted by institution
Respondent Early resolution Investigation Total
Administrative Tribunals Support Service of Canada 1 4 5
Agriculture and Agri-food Canada 1 1 2
Auditor General of Canada, Office of the   1 1
Bank of Canada 2 1 3
Canada Border Services Agency 27 15 42
Canada Industrial Relations Board   2 2
Canada Mortgage and Housing Corporation 1 1 2
Canada Post Corporation 1 3 4
Canada Revenue Agency 26 37 63
Canada School of Public Service 1 3 4
Canadian Broadcasting Corporation 1   1
Canadian Food Inspection Agency 2 1 3
Canadian Heritage 1   1
Canadian Human Rights Commission   2 2
Canadian Institutes of Health Research 1   1
Canadian Museum of Nature   1 1
Canadian Northern Economic Development Agency 1   1
Canadian Radio-Television and Telecommunications Commission 1   1
Canadian Security Intelligence Service 7 8 15
Canadian Space Agency 1   1
Canadian Transportation Agency   1 1
Civilian Review and Complaints Commission for the Royal Canadian Mounted Police 1   1
Communications Security Establishment Canada 1 1 2
Correctional Service Canada 35 120 155
Crown-Indigenous Relations and Northern Affairs Canada   2 2
Department of Justice Canada 3 2 5
Employment and Social Development Canada 12 13 25
Environment and Climate Change Canada 2 1 3
Federal Economic Development Agency for Southern Ontario 1   1
Financial Transaction and Reports Analysis Centre of Canada   2 2
Fisheries and Oceans Canada 2 1 3
Global Affairs Canada 8 11 19
Health Canada 4 1 5
Immigration and Refugee Board of Canada 1 1 2
Immigration, Refugees and Citizenship Canada 33 11 44
Indigenous Services Canada 3   3
Innovation, Science and Economic Development Canada 4 2 6
Library and Archives Canada 2 2 4
National Defence 11 22 33
National Energy Board 1   1
National Research Council Canada 1   1
Natural Resources Canada 1   1
Natural Sciences and Engineering Research Council of Canada   1 1
Office of the Public Sector Integrity Commissioner of Canada 1   1
Parks Canada Agency 1   1
Privy Council Office   1 1
Public Prosecution Service of Canada   1 1
Public Safety Canada   2 2
Public Service Commission of Canada 1 1 2
Public Services and Procurement Canada 65 5 70
Royal Canadian Mounted Police 56 120 176
Service Canada   1 1
Shared Services Canada 1   1
Social Sciences and Humanities Research Council of Canada 1 1 2
Statistics Canada 5 3 8
Transport Canada 3 2 5
Treasury Board of Canada Secretariat 2 1 3
Veterans Affairs Canada 8 4 12
Total 345 416 761

Table 10

Privacy Act complaints accepted by province, territory or other
Province, territory or other Early resolution (number, percentage*) Investigation (number, percentage*) Total (number, percentage*)
Ontario 144 42% 145 35% 289 38%
Quebec 62 18% 76 18% 138 18%
Nova Scotia 16 5% 19 5% 35 5%
New Brunswick 9 3% 9 2% 18 2%
Manitoba 7 2% 12 3% 19 2%
British Columbia 63 18% 104 25% 167 22%
Prince Edward Island 2 1% 1 0% 3 0%
Saskatchewan 6 2% 4 1% 10 1%
Alberta 25 7% 26 6% 51 7%
Newfoundland and Labrador 1 0% 4 1% 5 1%
Northwest Territories 1 0%   0% 1 0%
Yukon   0% 1 0% 1 0%
Nunavut 1 0%   0% 1 0%
US 3 1% 1 0% 4 1%
Other (non US) 2 1% 1 0% 3 0%
Not specified 3 1% 13 3% 16 2%
Total 345 100% 416 100% 761 100%
* Figures may not sum to total due to rounding.

Table 11

Privacy Act dispositions* by complaint type**
Complaint type Declined to investigate Discontinued No jurisdiction Not well-founded Resolved Settled Well-founded Well-founded and conditionally resolved Well-founded and resolved Early resolved Total
Access
Access   15   79 3 2 3 3 52 153 310
Correction/notation   1   9 1 2       5 18
Language                     0
Privacy
Accuracy                   3 3
Collection   13   29     1 6   33 82
Retention and disposal       3         1 2 6
Use and disclosure   15 1 40 6 1 27   4 89 183
Time limits
Correction/notation                 2   2
Extension notice   1   1     3   9   14
Time limits 1 14   7 2   143 191 306 53 717
Total 1 59 1 168 12 5 177 200 374 338 1,335
* Privacy Act dispositions combining old and new counting methodology.
** Includes one representative complaint for each of several series of related complaints and complaints submitted by a small number of individual complainants; excluded complaints total 144.

Table 12

Privacy Act dispositions* of time limits by institution
Respondent Declined to investigate Discontinued Early resolved Not well-founded Resolved Well-founded Well-founded and conditionally resolved Well-founded and resolved Total
Canada Border Services Agency     6 2   2 1 14 25
Canada Mortgage and Housing Corporation               1 1
Canada Revenue Agency     4     1 7 26 38
Canada School of Public Service       2   1   1 4
Canadian Food Inspection Agency   1 1         1 3
Canadian Human Rights Commission               1 1
Canadian Museum of Nature             1   1
Canadian Northern Economic Development Agency               1 1
Canadian Security Intelligence Service     1 2       1 4
Communications Security Establishment Canada               1 1
Correctional Service Canada   2 2   1 98 157 57 317
Crown-Indigenous Relations and Northern Affairs Canada           2     2
Department of Justice Canada     1       2 1 4
Employment and Social Development Canada     1     1   2 4
Environment and Climate Change Canada     1       1   2
Federal Economic Development Agency for Southern Ontario     1           1
Global Affairs Canada     4     6 1 6 17
Health Canada     1           1
Immigration, Refugees and Citizenship Canada     16         8 24
Indigenous Services Canada               1 1
Innovation, Science and Economic Development Canada               3 3
Library and Archives Canada     1         1 2
National Defence     4     2 5 28 39
Parks Canada Agency     1           1
Public Prosecution Service of Canada           2 1   3
Public Services and Procurement Canada               1 1
Royal Canadian Mounted Police 1 11 5 2 1 30 14 158 222
Transport Canada     1       1   2
Treasury Board of Canada Secretariat   1 1     1     3
Veterans Affairs Canada     1         3 4
VIA Rail Canada               1 1
Total 1 15 53 8 2 146 191 317 733
* Privacy Act dispositions combining old and new counting methodology.

Statistical tables related to PIPEDA

Table 1

PIPEDA complaints accepted* by industry sector
Industry sector Number Proportion of all complaints accepted**
Accommodations 13 4%
Construction 1 0%
Entertainment 1 0%
Financial 74 26%
Food and beverage 1 0%
Government 1 0%
Health 27 9%
Insurance 16 6%
Internet 22 8%
Manufacturing 2 1%
Professionals 19 7%
Publishing (except internet) 3 1%
Rental 3 1%
Sales/Retail 21 7%
Services 32 11%
Telecommunications 37 13%
Transportation 13 4%
Utilities 3 1%
Total 289 100%
* PIPEDA complaints accepted based on count of one for each series of complaints dealing with related issue; excluded complaints total 22.
** Figures may not sum to total due to rounding.

Table 2

PIPEDA complaints accepted* by complaint type
Complaint type Number Percentage of all complaints accepted**
Access 97 34%
Accountability 2 1%
Accuracy 5 2%
Appropriate purposes 1 0%
Challenging compliance 1 0%
Collection 6 2%
Consent 83 29%
Correction/notation 0 0%
Identifying purposes 0 0%
Other 1 0%
Retention 2 1%
Safeguards 42 15%
Use and disclosure 49 17%
Total 289 100%
* PIPEDA complaints accepted based on count of one for each series of complaints dealing with related issue; excluded complaints total 22.
** Figures may not sum to total due to rounding.

Table 3

PIPEDA investigations closed* by industry sector and disposition**
Sector category Early resolved Declined Discontinued (under 12.2) No jurisdiction Not well-founded Settled Well-founded Well-founded and conditionally resolved Well-founded and resolved Withdrawn Total
Accommodations 14   1           2 1 18
Construction 1     1         3   5
Entertainment 1   1               2
Financial 45   4   4   2 2 8 1 66
Food and beverage 2                   2
Government                     0
Health 26 1     1           28
Individual 1                   1
Insurance 11     1 1     1 3 5 22
Internet 9   5       3   1   18
Manufacturing 3             1     4
Non-profit organizations                     0
Professionals 4   1     1       2 8
Publishing (except Internet) 3                   3
Rental 1             1     2
Sales/Retail 23       1       3 6 33
Services 26       2     3 2 1 34
Telecommunications 34   3   1   1   2 1 42
Transportation 14   1   2   1 1 2 6 27
Utilities 3                   3
Not specified                     0
Total 221 1 16 2 12 1 7 9 26 23 318
* PIPEDA complaints accepted based on count of one for each series of complaints dealing with a related issue; excluded complaints total 10.
** PIPEDA dispositions combining old and new counting methodology.

Table 4

PIPEDA investigations closed by complaint type* and disposition**
Complaint type Early resolved Declined to investigate Discontinued (under 12.2) No jurisdiction Not well-founded Settled Well-founded Well-founded and conditionally resolved Well-founded and resolved Withdrawn Total
Access 68   5   7   3 3 13 10 109
Accountability 2   1     1         4
Accuracy 5                   5
Appropriate purposes         1           1
Challenging compliance 1                   1
Collection 25   1   1   1 1 3   32
Consent 46   7 1 1   1 2 1 4 63
Correction/notation                     0
Identifying purposes                     0
Retention 3             1   1 5
Safeguards 45 1 1       1 2 4 6 60
Use and disclosure 26   1 1 2   1   5 2 38
Other                     0
Total 221 1 16 2 12 1 7 9 26 23 318
* PIPEDA complaints accepted based on count of one for each series of complaints dealing with a related issue; excluded complaints total 10.
** PIPEDA dispositions combining old and new counting methodology.

Table 5

PIPEDA investigations* – Average treatment time** by disposition***
Disposition Count Average treatment time in months
Early resolved 221 5.7
Declined to investigate 1 6.1
Discontinued (under 12.2) 16 11.7
No jurisdiction 2 28.4
Not well-founded 12 20.9
Settled 1 8.2
Well-founded 7 16.3
Well-founded and conditionally resolved 9 18.7
Well-founded and resolved 26 19.6
Withdrawn 23 13.9
Total 318  
Overall weighted average   9.0
* PIPEDA investigations based on count of one for each series of complaints dealing with a related issue; excluded complaints total 10.
** These results are impacted by the reduction of backlogged files, several of which were older than 12 months, thus increasing the overall average treatment times.
*** PIPEDA dispositions combining old and new counting methodology.

Table 6

PIPEDA investigations* – Average treatment times** by complaint and disposition*** types
Complaint type Early resolved (number, average treatment time in months) Dispositions not early resolved (number, average treatment time in months) All dispositions (number, average treatment time in months)
Access 68 5.3 41 17.9 109 10.0
Accountability 2 6.4 2 9.5 4 8.0
Accuracy 5 3.5     5 3.5
Appropriate purposes     1 13.2 1 13.2
Challenging compliance 1 11.8     1 11.8
Collection 25 7.7 7 22.2 32 10.9
Consent 46 6.2 17 16.1 63 8.9
Correction/notation            
Identifying purposes            
Retention 3 6.4 2 13.2 5 9.1
Safeguards 45 3.2 15 14.1 60 6.0
Use and disclosure 26 7.9 12 15.8 38 10.4
Other            
Total 221 5.7 97 16.7 318 9.0
* PIPEDA investigations based on count of one for each series of complaints dealing with a related issue; excluded complaints total 10.
** These results are impacted by the reduction of backlogged files, several of which were older than 12 months, thus increasing the overall average treatment times.
*** PIPEDA dispositions combining old and new counting methodology.

Table 7

PIPEDA breach notifications by industry sector and incident type
Sector Incident type Total incidents per sector Percentage of total incidents*
Accidental disclosure Loss Theft Unauthorized access
Accommodation 4   1 6 11 2%
Agriculture, Forestry, Fishing and Hunting 2   1 4 7 1%
Construction 1   1 7 9 1%
Entertainment 2     10 12 2%
Financial 27 11 17 73 128 19%
Food and beverage 2 1   7 10 1%
Government 2     5 7 1%
Health 17   5 10 32 5%
Insurance 21 23 12 16 72 11%
Internet 1     10 11 2%
Manufacturing   4 1 13 18 3%
Mining and oil and gas extraction       3 3 0%
Non-profit organizations   5 3 16 24 4%
Professionals 5 1 4 14 24 4%
Publishing 4     11 15 2%
Rental       1 1 0%
Sales/retail 16 23 2 51 92 14%
Services 13 4 7 38 62 9%
Telecommunications 18   5 93 116 17%
Transportation 2     8 10 1%
Utilities           0%
Not specified 7   1 6 14 2%
Total 144 72 60 402 678 100%
* Figures may not sum to total due to rounding.


Table 8

Number of Canadian accounts affected by incident type
Incident type Number of accounts affected
Accidental disclosure 152,225
Loss 6,581
Theft 29,573
Unauthorized access 30,155,138
Total 30,343,517


Appendix 3 – Investigation processes

Privacy Act investigation process

Figure 2: Privacy Act investigation process: see text version.

Figure 3: Privacy Act investigation process: see text version.

Intake

Individuals make written submissions to our Office about alleged violations of the Privacy Act. Our Intake Unit reviews the matter to determine whether it constitutes a complaint, i.e., whether the allegations could constitute a contravention of the Act, and the most efficient manner in which to resolve it.

An individual may complain about any matter specified in section 29 of the Privacy Act, for example:

  • denial of access or unacceptable delay in providing access to his or her personal information held by an institution;
  • improper collection, use or disclosure of personal information; or
  • inaccuracies in personal information used or disclosed by an institution.

It is sometimes possible to immediately address issues, eliminating the need for our Office to pursue the matter as a standard investigation. In these cases, we simply resolve the matter through early resolution. The Privacy Commissioner may also initiate a complaint if satisfied there are reasonable grounds to investigate a matter.

  • Complaint
    • No
      The individual is advised, for example, that the matter is not in our jurisdiction.
    • Yes
      An investigator is assigned to the case.
      • Early resolution
        A complaint may be resolved before a standard investigation is undertaken if, for example, the issue has already been fully dealt with in another investigation and the institution has ceased the practice or the practice does not contravene the Act.
      • Standard investigation
        The investigation provides the factual basis for the Commissioner to determine whether the individual’s rights under the Privacy Act have been contravened.

        The investigator writes to the institution, outlining the substance of the complaint. The investigator gathers the facts related to the complaint through representations from both parties and through independent inquiry, interviews of witnesses, and review of documentation.

        Through the Commissioner or his delegate, the investigator has the authority to receive evidence, enter premises where appropriate, and examine or obtain copies of records found on any premises.
        • Discontinued
          A complaint may be discontinued if, for example, a complainant decides not to pursue it, or a complainant cannot be located.
        • Settled
          The OPC seeks to resolve complaints and to prevent contraventions from recurring. The Commissioner encourages resolution through negotiation and persuasion. The investigator assists in this process.
      • Analysis
        The investigator analyzes the facts and prepares recommendations to the Commissioner or his delegate. The investigator will contact the parties as necessary and review the facts gathered during the course of the investigation. The investigator may also tell the parties what he or she will be recommending, based on the facts, to the Commissioner or his delegate. At this point, the parties may make further representations.

        Analysis will include internal consultations with various directorates, for example, Legal Services, Policy, Research and Parliamentary Affairs, and Technology Analysis, as appropriate.
        • Findings
          The Commissioner or his delegate reviews the file and assesses the report. The Commissioner or his delegate, not the investigator, decides what the appropriate outcome should be and whether recommendations to the institution are warranted.

          The Commissioner or his delegate sends letters of findings to the parties. The letters outline the basis of the complaint, the relevant findings of fact, the analysis, and any recommendations to the institution. The Commissioner or his delegate may ask the institution to respond in writing, within a particular timeframe, outlining its plans for implementing any recommendations.

          The possible findings are:
          • Not well-founded: The evidence, on balance, does not lead the Commissioner or his delegate to conclude that the complainant’s rights under the Act have been contravened.
          • Well-founded: The institution failed to respect a provision of the Act.
          • Well-founded, resolved: The investigation substantiated the allegations and the institution has agreed to take corrective measures to rectify the problem.
          • Resolved: The evidence gathered in the investigation supports the allegations raised in the complaint, but the institution has agreed to take corrective measures to rectify the problem, to the satisfaction of this Office. The finding is used for those complaints in which “well-founded” would be too harsh to fit what essentially is a miscommunication or misunderstanding.

            In the letter of findings, the Commissioner or his delegate informs the complainant of his or her rights of recourse to the Federal Court on matters of denial of access to personal information.
            • Where recommendations have been made to an institution, OPC staff will follow up to verify that they have been implemented.
            • The complainant or the Commissioner may choose to apply to the Federal Court for a hearing of the denial of access. The Federal Court has the power to review the matter and determine whether the institution must provide the information to the requester.

PIPEDA investigation process

Figure 4: PIPEDA investigation process: see text version.

Figure 5: PIPEDA investigation process: see text version.

Intake

Individuals make written complaints to the OPC about violations of the Act. Our Intake Unit reviews these complaints, and, if necessary, follows up with complainants to seek clarification and gather additional information.

If complainants have not raised their concerns directly with the organization, we will ask them to do so in order to try to resolve the issue and then to come back to us if they are unsuccessful.

The Intake Unit is also sometimes able to immediately address issues. For example, if we have previously investigated the type of issue being raised and have determined that the activities are compliant with PIPEDA, an intake officer will explain this to the individual. Or, if we have previously determined that we do not have jurisdiction over the organization or type of activity, an intake officer will explain this and, where appropriate, refer the individual to other resources or sources of assistance.

In cases where the Intake Unit is not able to immediately address issues (and once the necessary information is gathered), the matter is accepted by our Office as a formal complaint. The Privacy Commissioner may also initiate a complaint if satisfied there are reasonable grounds to investigate a matter.

  • Complaint declined
    The Commissioner may decide to decline to investigate a complaint if certain conditions under subsection 12(1) of the Act are met. The complainant may request that the Commissioner reconsider this decision.
  • Sent to investigation
    Complaints of a serious, systemic or otherwise complex nature, for example, uncertain jurisdictional matters, multiple allegations or complex technical issues, are assigned to an investigator.
    • Investigation
      Investigations provide the factual basis for the Commissioner to determine whether the individual’s rights have been contravened under PIPEDA.

      The investigator writes to the organization, outlining the substance of the complaint. The investigator gathers the facts related to the complaint through representations from both parties and through independent inquiry, interviews of witnesses, and review of documentation. Through the Commissioner or his delegate, the investigator has the authority to receive evidence, enter premises where appropriate, and examine or obtain copies of records found on any premises.
      • Analysis
        The investigator analyses the facts and prepares recommendations to the Commissioner or his delegate.

        The investigator will contact the parties and review the facts gathered during the course of the investigation. The investigator will also advise the parties of his or her recommendations, based on the facts, to the Commissioner or his delegate. At this point, the parties may make further representations.

        Analysis will include internal consultations with various directorates, for example, Legal Services, Policy, Research and Parliamentary Affairs, and Technology Analysis, as appropriate.
        • No jurisdiction
          The OPC determines that PIPEDA does not apply to the organization or activities being complained about.
        • Findings
          The Commissioner or his delegate reviews the file and assesses the report. The Commissioner or his delegate (not the investigator) decides what the appropriate outcome should be and whether recommendations to the organization are warranted.
        • Preliminary report
          If the results of the investigation indicate that there likely has been a contravention of PIPEDA, the Commissioner or his delegate recommends to the organization how to remedy the matter, and asks the organization to indicate within a set time period how it will implement the recommendation.
        • Final report and letters of findings
          The Commissioner or his delegate sends letters of findings to the parties. The letters outline the basis of the complaint, the relevant findings of fact, the analysis, and the response of the organization to any recommendations made in the preliminary report.

          (The possible findings are described in Appendix 1 – Definitions.)

          In the letter of findings, the Commissioner or his delegate informs the complainant of his or her rights of recourse to the Federal Court.
          • Where recommendations have been made to an organization but have not yet been implemented, the OPC will ask the organization to keep us informed, on a predetermined schedule after the investigation, so that we can assess whether corrective action has been taken.
          • The complainant or the Commissioner may choose to apply to the Federal Court for a hearing of the matter. The Federal Court has the power to order the organization to correct its practices. The Court can award damages to a complainant, including damages for humiliation. There is no ceiling on the amount of damages.
        • Settled
          The OPC seeks to resolve complaints and to prevent contraventions from recurring. The OPC helps negotiate a solution that satisfies all involved parties during the course of the investigation. The investigator assists in this process.
        • Discontinued
          A complaint may be discontinued if, for example, a complainant decides not to pursue it or cannot be located, or if certain conditions, described in section 12.2 of the Act, are met.
  • Sent to early resolution officer
    Complaints which we believe could potentially be resolved quickly are sent to an early resolution officer. These complaints include matters where our Office has already made findings on the issues; where the organization has already dealt with the allegations to our satisfaction; or where it seems possible that allegations can be easily remedied.
    • Transferred to investigation
      If early resolution is unsuccessful, the case is transferred to an investigator.
    • Early resolution
      Early resolution officers encourage resolutions through mediation, negotiation and persuasion.

Appendix 4: Substantially similar legislation

Subsection 25(1) of PIPEDA requires our Office to report annually to Parliament on the “extent to which the provinces have enacted legislation that is substantially similar” to the Act.

Under paragraph 26(2)(b) of PIPEDA, the Governor in Council may issue an Order exempting an organization, a class of organizations, an activity or a class of activities from the application of PIPEDA with respect to the collection, use or disclosure of personal information that occurs within a province that has passed legislation that is “substantially similar” to PIPEDA.

On August 3, 2002, Industry Canada (now known as Innovation, Science and Economic Development Canada) published the Process for the Determination of ‘Substantially Similar’ Provincial Legislation by the Governor in Council, outlining the policy and criteria used to determine whether provincial legislation will be considered substantially similar. Under the policy, laws that are substantially similar:

  • provide privacy protection that is consistent with and equivalent to that in PIPEDA;
  • incorporate the 10 principles in Schedule 1 of PIPEDA;
  • provide for an independent and effective oversight and redress mechanism with powers to investigate; and
  • restrict the collection, use and disclosure of personal information to purposes that are appropriate or legitimate.

Organizations that are subject to provincial legislation deemed substantially similar are exempt from PIPEDA with respect to the collection, use or disclosure of personal information occurring within the respective province. However, PIPEDA continues to apply to the collection, use or disclosure of personal information in connection with the operations of a federal work, undertaking or business in the respective province, as well as to the collection, use or disclosure of personal information outside the province.

The following provincial laws have been declared substantially similar to PIPEDA:

  • Quebec’s An Act Respecting the Protection of Personal Information in the Private Sector;
  • British Columbia’s Personal Information Protection Act;
  • Alberta’s Personal Information Protection Act;
  • Ontario’s Personal Health Information Protection Act, with respect to health information custodians;
  • New Brunswick's Personal Health Information Privacy and Access Act, with respect to health information custodians;
  • Newfoundland and Labrador's Personal Health Information Act, with respect to health information custodians; and
  • Nova Scotia’s Personal Health Information Act, with respect to health information custodians.

Appendix 5: Report of the Privacy Commissioner, Ad Hoc

The Office of the Privacy Commissioner (OPC) is subject to the Privacy Act. Because it cannot investigate complaints against itself related to its handling of requests for access to personal information, a Privacy Commissioner Ad Hoc is appointed to conduct such investigations.

As Privacy Commissioner Ad Hoc, I have in the past received correspondence from individuals who were disappointed in the outcome of privacy complaint investigations and who call upon me to act. I received such a case again during the current reporting period, April 1, 2019 to March 31, 2020. The matter related to a complaint made against a federal institution that the OPC had been called upon to investigate. Although reviewing such matters is not part of my role, I take the necessary time to provide written explanations as to why I cannot act: I have no authority to review a decision of the OPC at the conclusion of its privacy complaint investigation. Understandably, the OPC does not provide advice on whether an individual should challenge its own findings; however, there is merit in the OPC informing complainants that other options are at their disposal, namely recourse before the Federal Court. This information could be further communicated on the OPC website.

For the period ending March 31, 2020, I received two complaints regarding the OPC’s processing of requests to access personal information. The first resulted in a finding that no recommendation was needed, as the response provided by the OPC was lawful. The second complaint remains under investigation.

Anne E. Bertrand, Q.C.
