

Appearance before the Standing Committee on Access to Information, Privacy and Ethics on its study on the federal government’s use of technological tools capable of extracting personal data from mobile devices and computers

February 1, 2024

Ottawa, Ontario

Opening statement by Philippe Dufresne
Privacy Commissioner of Canada

(Check against delivery)

Thank you, Mr. Chair and Members of the Committee, for the invitation to contribute to your study on the federal government’s use of technological tools capable of extracting personal data from mobile devices and computers.

Last fall, Radio Canada/CBC reported that 13 federal institutions had acquired such tools. The media reports raised questions about the reasons for their use and whether these organizations were respecting their privacy obligations in using these tools.

Initial reports referred to these tools as covert surveillance, or “spyware.” Since then, it has been clarified that the tools are digital forensics tools, which are distinct from spyware.

Digital forensics tools are used to extract and examine large numbers of files from laptops, hard drives, or mobile devices. They are typically used in investigations or technical analysis, and often with the knowledge of the device owner.

They can be used to analyze the metadata of a file, or to create a timeline of events, such as when an account was used, when websites were accessed, or to see when an operating system was changed. These tools can also be used to recover deleted data or to ensure that data has been properly wiped from a device before it is discarded or repurposed. This makes them useful investigative tools that can help to preserve the integrity of an evidence chain.

Digital forensics tools are distinct from spyware in that spyware is typically installed remotely on a person’s device without their knowledge. It can then covertly collect personal information, such as keystrokes and web-browsing history. One example would be on-device investigative tools, or ODITs, which are used by law enforcement to obtain data covertly and remotely from targeted devices. Importantly, in the context of law enforcement, judicial authorization is required prior to their use.

In August 2022, I testified before this Committee as part of your study about the use of ODITs by the RCMP. You will recall that in that case, the RCMP had advised the House that it had been using ODITs in recent years to obtain data covertly and remotely from targeted devices, but had not completed a Privacy Impact Assessment (PIA) and had not advised my Office.

In my appearance at the time, I noted that PIAs were required under Treasury Board policy but were not a legally binding requirement under privacy legislation. I recommended that the preparation of PIAs should be made a legal obligation for the government under the Privacy Act.

In its November 2022 report, the Committee endorsed this recommendation and also called for an amendment to the preamble of the Privacy Act to indicate that privacy is a fundamental right, and for the Act to be amended to include the concept of privacy by design and explicit transparency obligations for government institutions. I welcomed and supported these recommendations, and the Committee may wish to reiterate them as they remain outstanding and relevant.

With technology increasingly changing the manner in which personal information is collected, used, and disclosed, it continues to be important that government institutions carefully consider and assess the privacy implications of their activities to determine if and when PIAs are required.

My vision for privacy is one where privacy is treated as a fundamental right, where privacy supports the public interest and innovation, and where Canadians trust that their institutions are protecting their personal information.

Conducting a PIA and consulting my Office before a privacy-impactful new technology is used would strengthen privacy, support the public interest, and generate trust. This is why it should be a legal obligation for government institutions under the Privacy Act.

Currently, the Treasury Board Secretariat’s Directive on Privacy Impact Assessment requires that institutions conduct PIAs:

  • When personal information may be used as part of a decision-making process that directly affects an individual;
  • When there are major changes to existing programs or activities where personal information may be used for an administrative purpose;
  • When there are major changes to existing programs or activities as a result of contracting out or transferring programs or activities to another level of government or to the private sector; and
  • When new or substantially modified programs or activities will have an impact on overall privacy, even where no decisions are made about individuals.

In our advisory discussions with federal institutions, we promote the use of PIAs as an effective risk-management process. PIAs ensure that potential privacy risks are identified and mitigated – ideally at the front-end – across programs and services that collect and use personal information.

That said, the use of a new tool does not always trigger the need for a PIA. This will depend on how the tool is being used and what is being done with the information that it collects.

The OPC has used digital forensics tools, for example, in the context of certain breach investigations, to determine the nature, scale, and scope of the incident, including how a breach occurred and what types of personal information, if any, may have been compromised.

Digital forensics tools, however, can be used in ways that do raise important risks for privacy that would merit a full PIA.

For example, this would be the case when an internal investigation into an employee’s conduct will lead to a decision that directly affects that individual, or when the tool is used as part of an inquiry into alleged criminal activity.

In those types of cases, a PIA would be required – addressing not only the specific tool being used to collect personal information, but the broader program under which the tool is being used.

It is incumbent on all federal institutions to review their programs and activities accordingly.

Where digital forensics tools are used in the context of employee monitoring, institutions must take steps to ensure respect for the fundamental right to privacy and foster transparency and trust in the workplace. There should be clear rules about when and how monitoring technologies are to be used. My Office updated its guidance on privacy in the workplace in May 2023, and my provincial and territorial colleagues and I issued a joint resolution on employee privacy in October 2023.

In the present case, following the Radio Canada/CBC reports regarding the use of digital forensics tools in the federal government, my Office followed up with the institutions listed in those reports and in this Committee’s motion to proceed with this study.

To summarize what we learned:

  • Three organizations indicated that they had completed and submitted a PIA on the relevant program;
  • One organization indicated that it had procured the tool but never used it;
  • Another organization indicated that a PIA was not required; and
  • The remaining eight organizations indicated that they had either started work on a new PIA, or were considering whether to conduct a new PIA or to update an existing one in light of their use of the tools.

We will continue to follow up with institutions to insist that PIAs be completed in cases where they are required under the Treasury Board policy. But without a requirement in the Privacy Act, there are limits to what we can do to ensure compliance.

Privacy Impact Assessments, in appropriate cases, are good for privacy, good for the public interest, and they generate trust. In this increasingly digital world, they should be a requirement under privacy law.

I am happy to take your questions.

Thank you.
