Consolidated issue sheets for a general information session (ETHI)
Organizational Background
In this section
- OPC Role, mandate and powers of Commissioner
- Strategic Priorities (2024-2027)
- How the OPC Prioritizes its Work
- OPC Structure and Transformation
- OPC Budget
- AI Adoption at the OPC
- OPC Staff Expertise and Training
- Technology Analysis Lab
- Funding Models for Agents of Parliament
- Investigation Timelines
- Litigation Costs
- Federal, Provincial and Territorial (FPT) Collaboration
- International Collaboration
- Canadian Digital Regulators Forum – Year Two
- Public Education and Outreach
- OPC Departmental Results
- OPC Services to Canadians
OPC Role, mandate and powers of Commissioner
Speaking Points
- The mission of my Office is to protect and promote the privacy rights of individuals.
- As an agent of Parliament, I operate independently to oversee compliance with the Privacy Act, which governs federal institutions’ handling of personal information, and the Personal Information Protection and Electronic Documents Act (PIPEDA), our federal private-sector privacy law.
- In addition to overseeing compliance with the Privacy Act and PIPEDA, my office provides advice to Parliament, collaborates with international and domestic partners, funds privacy-related research, creates guidance for Canadians and organizations, and raises public awareness through communication and outreach activities.
- At present, my investigative recommendations are non-binding. Unlike the Information Commissioner and some provincial privacy Commissioners, I do not have the power to issue binding orders.
Background
- The Privacy Commissioner works independently from any other part of the government to investigate complaints from individuals (under s.29 of the Privacy Act, and s.11 of PIPEDA), and may also initiate an investigation if satisfied that there are reasonable grounds to investigate a matter (subsection 29(3) Privacy Act and subsection 11(2) of PIPEDA) and publicly report on the findings.
- PIPEDA does not apply in provinces that have enacted substantially similar legislation, except in relation to federal works, undertakings and businesses, or interprovincial or international transfers of personal information.
- Currently, the Commissioner must resolve complaints through negotiation and persuasion. However, if voluntary co-operation is not forthcoming, the Commissioner has the power to summon witnesses, administer oaths and compel the production of evidence.
- In certain circumstances, where matters remain unresolved, the Commissioner may pursue legal action before the Federal Court (under s.14 and 15 of PIPEDA, and s.41 of the Privacy Act – for denial of access only).
Lead: PRPA
Strategic Priorities (2024-2027)
Speaking Points
- My strategic plan and the three privacy priorities on which it is based offer a roadmap for maintaining trust and promoting innovation while protecting the fundamental right to privacy in the digital age.
- My three strategic priorities are 1) protecting and promoting privacy with maximum impact, 2) addressing and advocating for privacy in this time of technological change, and 3) championing children’s privacy rights.
- The priorities address areas where I believe that my Office can have the greatest impact, and where the greatest risks lie if they are not addressed. These will guide the OPC’s work through to 2027.
Background
- Priority one is focused on ensuring that the OPC’s activities are effective, efficient and impactful.
- e.g., the Transformation Plan (launched in January 2025 and implemented in April 2025) restructured the OPC’s functions to ensure we respond rapidly and effectively to emerging issues and conduct compliance activities more strategically.
- Priority two focuses on bolstering our ability to address the privacy impacts of the fast-moving pace of technological advancements, especially in the world of artificial intelligence (AI) and generative AI.
- e.g., Collaborating with the G7 Data Protection and Privacy Authorities Roundtable to release statements on the role of data protection and privacy authorities in fostering trustworthy AI, and on child-appropriate AI.
- Priority three is about promoting and protecting the privacy rights of children, understanding and recognizing their unique sensitivities so that young people can benefit from technology without compromising their privacy and well-being.
- e.g., Conducting focused research and outreach with young people, launching a youth advisory council, and applying this lens to enforcement activities (e.g., TikTok and PowerSchool).
Lead: PRPA
How the OPC Prioritizes its Work
Speaking Points
- The privacy issues and risks that we collectively face as a society, in both the public and private sectors, are vast and complex. With current resourcing levels, my Office simply cannot engage with them all. Given this, I must make choices about where to focus our efforts.
- Last year, I released a strategic plan for my Office that outlined three strategic priorities that will guide the OPC’s work through to 2027. These priorities are where we believe that we can have the greatest impact and where the greatest risks lie if they are not addressed.
- As part of my Office’s recent transformation, I have also centralized a business intelligence function, so that the OPC can use business intelligence and a robust risk management framework to make key decisions and strategic choices.
- Officials from my Office also meet regularly to discuss emerging issues so that we are positioned to respond more quickly and be more proactive in our work. When determining potential courses of action, we consider factors including alignment with the OPC’s strategic priorities, the most effective and efficient use of OPC resources, risks to Canadians’ privacy rights, and delivering optimal results for Canadians.
Background
- OPC’s three strategic priorities: 1) Protecting and promoting privacy with maximum impact; 2) Addressing and advocating for privacy in this time of technological change; and 3) Championing children’s privacy rights.
- The centralized BI function is focused on harnessing information and data to inform both the OPC’s promotion and enforcement activities, whether it’s to produce focused guidance, outreach, or to address compliance issues in the public or private sectors.
- Senior officials from across the OPC meet on a monthly basis to discuss emerging issues and make recommendations to the Commissioner on potential courses of action.
Lead: PRPA
OPC Structure and Transformation
Speaking Points
- Over the last few months, the OPC has implemented a major reorganization and transformation that aims to streamline our activities and focus on achieving more efficient outcomes for Canadians.
- The new structure will allow us to:
- place an even greater focus on our strategic priorities;
- respond more rapidly and effectively to emerging issues through more proactive engagement with organizations;
- evolve our approach to investigations to promote compliance; and
- bring stronger alignment to our work and to all our activities.
- The new structure, which took effect April 1, 2025, ensures that the OPC’s work, including our compliance activities, is effective, efficient and impactful in responding to the many new challenges facing privacy today.
Background
- The OPC Transformation builds greater collaboration and cohesion across the Office and streamlines our processes, ultimately supporting more integrated, agile and strategic approaches to maximize our impact for Canadians.
- The Transformation is aligned with the OPC’s first strategic priority “protecting and promoting privacy with maximum impact.”
- The new structure reframes the compliance function as a continuum, allowing the OPC to put greater emphasis on engaging with organizations both proactively and reactively to promote compliance, softening structural lines between the two federal privacy acts, and purposefully undertaking certain activities and interventions.
- The goal is to resolve as many cases as possible through early engagement and resolution, and focus our in-depth investigations on key priority issues or incidents.
Lead: Corporate Services
OPC Budget
Speaking Points
- The total proposed funding for my office in the 2025-26 Main Estimates is $38.4 million.
- This represents an increase of $4.4 million over the previous year, attributable to additional funding resulting from the renewal of collective bargaining agreements and adjustments to the employee benefits plan ($1.1 million), and to the temporary funding received as part of Budget 2023 ($3.3 million).
- This temporary funding has enabled my office to reduce the complaints backlog and to undertake more in-depth investigations of privacy breaches, but we will need a more permanent solution if we are to address the full volume and complexity of privacy issues in the current environment.
- That is why I have recommended that, at a minimum, the temporary breach and backlog funding be made permanent.
- We use this funding to protect and promote privacy rights, including by investigating complaints, assessing compliance, providing advice and recommendations, and working with stakeholders in other jurisdictions.
Background
- The office’s 2025-26 Main Estimates of $38.4M break down as follows (2024-25 is provided for comparison):
| Budgetary | 2024-25 ($M) | 2024-25 (%) | 2025-26 ($M) | 2025-26 (%) |
|---|---|---|---|---|
| Personnel expenditures (including EBP) | 28.3 | 83 | 31.0 | 81 |
| Operating expenditures | 5.2 | 15 | 6.9 | 18 |
| Contributions program | 0.5 | 2 | 0.5 | 1 |
| Total reference levels | 34.0 | 100 | 38.4 | 100 |
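As a sanity check (our own arithmetic, not part of the briefing source), the reference levels reconcile with the $4.4-million increase cited in the speaking points; a short Python sketch with the amounts copied from the table:

```python
# Reconcile the Main Estimates breakdown (all amounts in $M, copied from the table above)
estimates = {
    "2024-25": {"Personnel (incl. EBP)": 28.3, "Operating": 5.2, "Contributions": 0.5},
    "2025-26": {"Personnel (incl. EBP)": 31.0, "Operating": 6.9, "Contributions": 0.5},
}

totals = {fy: round(sum(items.values()), 1) for fy, items in estimates.items()}
increase = round(totals["2025-26"] - totals["2024-25"], 1)

print(totals)    # {'2024-25': 34.0, '2025-26': 38.4}
print(increase)  # 4.4 -- matching the $1.1M compensation/EBP plus $3.3M Budget 2023 amounts
```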
Lead: Corporate Services
AI Adoption at the OPC
Speaking Points
- In October 2024, the OPC launched its internal AI strategy to demonstrate a privacy-first AI implementation within the Government of Canada, build practical AI expertise across the Office, and help staff improve efficiency through responsible AI use.
- My Office has invested in secure, high-performance internal AI servers. We aim to share our experience with other government departments to promote privacy-by-design principles and to meet policy and legal obligations.
- The first version of our internal AI service, focused on low-risk use cases such as summarization, will be rolled out to our staff this year, with plans to expand to additional use cases at later stages.
- This initiative supports OPC’s second strategic priority, which addresses the privacy impacts of rapid technological advancements, especially in AI.
Background
- OPC employees are instructed not to use ChatGPT for work-related tasks or on OPC devices, except when evaluating it as part of an investigation.
- Access to DeepSeek’s online services has been blocked for OPC devices, in alignment with GC CIO guidance, and we continue to monitor and adjust for new cyber threats as they emerge.
- Other services are permitted, though their use must align with our acceptable use policies and AI-assisted technologies guidelines.
- OPC’s first version of internal AI is not trained on internal data, does not collect any personal information, does not provide research capabilities or automated decision-making, and will not be used for any external services to Canadians.
- OPC has applied Privacy by Design Principles in the design of this solution, including completing a privacy impact assessment to ensure risks are identified and adequately managed.
Lead: Corporate Services
OPC Staff Expertise and Training
Speaking Points
- The scope of OPC’s operational environment is vast, requiring knowledge of fields such as privacy, IT, finance, national security, and law.
- We strive to recruit employees from diverse backgrounds with the knowledge and skills to help us achieve the ambitious goals we have set ourselves in our strategic plan, and we prioritize employee training in key areas such as children’s privacy, emerging technologies and IT Security.
- This also includes strengthening our technology-analysis function by hiring staff with expertise in AI and generative AI, an area where our needs for training and expertise are growing rapidly.
- As part of the OPC Transformation, we are committed to implementing a development program for employees within the PM occupational group, our largest occupational group. The program’s ultimate goal is to recruit, develop, and retain privacy experts who will play a vital role in advancing our strategic objectives. The development of this program will be a key focus this year.
Background
- Given that privacy expertise is in high demand amid growing labour-market pressures, we are continually finding ways to improve our recruitment and training of new talent and to develop internal talent to retain skills and organizational knowledge.
- We are increasing access to technology-related training content to ensure we are keeping up with and staying ahead of technological advancements and their impact on privacy, particularly with respect to AI and generative AI.
- One of the recommendations from the OPC’s Transformation Plan was the creation and implementation of a PM development program to be managed by the Compliance Promotion and Enforcement Sector. A contract is being put in place to bring an expert consultant in to assist with the development of this program. There will be consultations with key stakeholders including management and human resources. This project is expected to be completed by summer 2026.
Lead: Corporate Services
Technology Analysis Lab
Speaking Points
- Addressing the privacy impacts of technological advancements is one of my office’s strategic priorities.
- In support of this, my office studies different technologies to assess their potential privacy implications.
- OPC has a team of IT analysts who use their extensive technological expertise to examine malware, hardware components, mobile applications, enterprise systems and Internet-of-things devices with a view to promoting privacy through the safe and secure use of digital technologies by Canadians.
- OPC’s technology analysis lab also supports compliance investigations and research related to emerging technologies, including artificial intelligence, biometrics, as well as privacy-enhancing technologies.
- The Technology Analysis Division plays a key role in strengthening the OPC’s impact by building strategic partnerships, supporting international and FPT data protection authorities in developing technological expertise, and embedding technology foresight into organizational decision making.
Background
- TAD currently employs 9 Full Time Equivalents, with most employees having an educational and professional background related to computer science.
- TAD employees have expertise related to digital forensics, incident response, penetration testing, software and hardware reverse engineering, and Artificial Intelligence to name a few.
- Over the last Fiscal Year (2024-2025), TAD supported 37 compliance investigations, in addition to 37 promotion engagements.
- In addition to supporting activities related to the core mandate of the OPC, TAD continues to support various internal IM/IT functions related to cyber security, such as conducting ad-hoc security assessments of various OPC systems.
Lead: Corporate Services
Funding Models for Agents of Parliament
Speaking Points
- We have advocated for a long-term stable funding mechanism that reflects the independent role played by Agents of Parliament, and also ensures their offices are properly funded.
- Currently there is an inherent conflict of interest where the OPC scrutinizes government compliance with privacy laws and relies on that same government for funding.
- A funding mechanism that ensures stable and adequate funding to address emerging issues rapidly would be preferable to the current process.
Background
- On January 31, 2019, the Agents of Parliament sent a letter to the Parliamentary Budget Officer (PBO) seeking an alternative to the existing funding mechanism.
- Not all Agents of Parliament have the same funding mechanism. The PBO, for example, has the ability to request funds directly from the Speakers of the House and Senate.
- In 2005, an Advisory Panel pilot project was launched to test a proposed new funding and oversight model for Agents of Parliament.
- This panel had been convened in response to concerns that independence from government may be compromised by the fact that Treasury Board determines the amount of funding available to the Agents of Parliament.
- The 2008 Corbett report concluded that the pilot project was a success and should be made permanent, given it achieved the key objective of reducing the perception of conflict of interest that was inherent in the pre-existing process.
Lead: Corporate Services
Investigation Timelines
Speaking Points
- More than half of complaints closed by my Office last fiscal year were resolved through early resolution, a streamlined investigation process focused on mediating a complaint with the parties in a timely manner.
- When a complaint requires more in-depth investigative work, my office aims to complete these within a year, whenever possible. However, some factors can cause investigations to take more time, such as the responsiveness of organizations or when investigations involve several respondents.
- While the joint investigations that I undertake with my provincial, territorial and international colleagues can be highly beneficial, these can also take more time.
- By the end of the last fiscal year, my office had significantly reduced its backlog of active investigations, even exceeding its target. Such results had not been reached in several years.
- My office is continually trying to innovate to improve timelines; with this objective in mind, last May I launched a transformation plan that brings together compliance activities in an integrated sector, across a single continuum. This renewed service model aims to improve efficiencies and achieve better results and outcomes for Canadians.
Background
- Last fiscal year, the OPC closed with a backlog representing only 9% of all active investigations (those older than 12 months), a level that had not been reached since 2021. The target is 10%.
- The average completion time for the 438 PIPEDA complaints closed last fiscal year was 8.8 months.
- The average completion time for the 1,317 Privacy Act complaints closed last fiscal year was 4.6 months.
- PIPEDA complaints take more time to process because they are generally more complex and most often require our office to determine whether the OPC has jurisdiction.
Lead: CPE
Litigation Costs
Speaking Points
- My Office resolves many complaints in early resolution or through its investigative findings and recommendations. Nevertheless, in the absence of order-making power, litigation is sometimes the only means of achieving compliance with my recommendations.
- Initiating a court application or responding to an application for judicial review can be very costly, despite best efforts to be judicious in the use of resources.
- While my Office’s litigation expenditures have generally varied between $100,000 and $300,000 annually over the past six years, in 2023-2024, they more than doubled from the previous year to surpass $700,000.
- This was due to unique circumstances, with expenses related to two cases in Federal Court and three at the Federal Court of Appeal.
Background
- OPC litigation expenditures for retainers with external counsel by fiscal year:
| FY | 2019-20 | 2020-21 | 2021-22 | 2022-23 | 2023-24 | 2024-25 |
|---|---|---|---|---|---|---|
| Expenditures | $130,597.50 | $114,930.49 | $212,329.02 | $284,277.14 | $771,381.86 | $137,606.15 |
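The jump described in the speaking points follows directly from the table; a minimal Python check (amounts copied from the table above, arithmetic our own):

```python
# Compare 2023-24 litigation expenditures against the prior year (figures from the table above)
spend = {
    "2019-20": 130_597.50, "2020-21": 114_930.49, "2021-22": 212_329.02,
    "2022-23": 284_277.14, "2023-24": 771_381.86, "2024-25": 137_606.15,
}

ratio = spend["2023-24"] / spend["2022-23"]
print(round(ratio, 1))  # 2.7 -- more than double the previous year, surpassing $700,000
```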
Lead: Legal
Federal, Provincial and Territorial (FPT) Collaboration
Speaking Points
- Collaboration among regulators across Canada is essential to address modern privacy challenges and maximize our impact for Canadians and Canadian organizations.
- Interoperability of privacy laws offers consistency for organizations, and common protections for individuals’ data everywhere in Canada.
- I meet regularly with my provincial and territorial counterparts to discuss issues of mutual interest, share updates, and collaborate.
- We also issue joint resolutions and statements to express consensus on public policy matters of common interest or concern to us, such as joint principles for responsible, trustworthy and privacy-protective generative AI technologies, which were developed last year.
- My Office works especially closely with our counterparts in Quebec, British Columbia, and Alberta, as they have provincial laws deemed substantially similar to the Personal Information Protection and Electronic Documents Act, on activities such as joint investigations and guidance for organizations.
Background
- In May 2022, the OPC and the Quebec, BC, and Alberta offices entered into a Memorandum of Understanding to allow us to share information, consult on enforcement matters, discuss areas of mutual policy interest and develop public education and compliance resources.
- OPC recently updated its MOU with Ontario, to account for statutory changes in Ontario and to add provisions regarding joint investigations and adjudications.
- In November 2024, the FPT table adopted 2 privacy resolutions: Identifying and mitigating harms from privacy-related deceptive design patterns, and Responsible information-sharing in situations involving intimate partner violence.
- The OPC is currently conducting several joint investigations with our provincial counterparts, including on TikTok, OpenAI, and Certn.
Lead: CSR
International Collaboration
Speaking Points
- In this digital, data-driven age, protecting privacy requires global coordination. As personal data moves around the world at lightning speed and scale, international collaboration is essential to address the growing complexity and global nature of data protection.
- This is a pivotal moment for privacy. Collaborating with international counterparts allows the OPC to bring a Canadian voice to, and stay at the forefront of, global efforts to advance privacy and data protection.
- This includes modern approaches and common standards to provide greater consistency for businesses operating across jurisdictions and better protections for individuals.
- One such highlight is my recent election to Chair the Global Privacy Assembly, a leading forum for privacy protection globally. My vision includes maximizing our collective efforts on addressing the privacy implications of technology, youth privacy, and international data flows, to shape a future where innovation can flourish, privacy rights are respected, and trust is reinforced.
Background
- G7 DPA Roundtable (June 2025) – adopted joint statement on ‘Promoting Responsible Innovation and Protecting Children by Prioritizing Privacy’.
- Global Privacy Assembly – Commissioner elected Chair (September 2025). The OPC chairs three working groups (Data Protection and Other Rights and Freedoms; International Enforcement; Digital Citizens and Consumers) and is a member of 8 others, including Ethics in AI and Digital Education.
- OPC Membership: Global Privacy Enforcement Network; Asia Pacific Privacy Authorities; l’Association francophone des autorités de protection des données; and engages with Global Cross Border Privacy Rules Forum and OECD Working Party on Data Governance and Privacy in the Digital Economy.
- OPC has 10 bilateral and 3 multilateral MOUs and participates in APEC Cross-Border Privacy Enforcement Arrangement; GPA Cross-Border Enforcement Arrangement, Global Cooperation Arrangement for Privacy Enforcement.
- General Data Protection Regulation (GDPR) adequacy (January 2024).
Lead: CSR
Canadian Digital Regulators Forum – Year Two
Speaking Points
- The Canadian Digital Regulators Forum (CDRF) is a partnership between my Office, the CRTC, the Competition Bureau, and the Copyright Board.
- The Forum was established in June 2023 to strengthen collaboration on matters relating to digital markets and platforms.
- I served as chair for our second year, which focused on capacity building and cooperation. Our major deliverable was a paper we recently published on synthetic media, which refers to artificially generated images, video, text, or audio content, typically produced using AI technologies.
- This year, the Forum is being led by CRTC Chairperson Vicky Eatrides.
- We will continue to strengthen collaboration this year by engaging with external stakeholders, holding a workshop, and publishing a series of articles on developments in digital markets.
Background
- Other accomplishments from Year Two include:
- admitting the Copyright Board of Canada into the Forum as a permanent member;
- a panel discussion highlighting the work the Forum had done in its first year at the IAPP Canadian Privacy Symposium in Toronto; and,
- discussing the impacts of AI and Members’ roles in its regulation at Canada’s Competition Summit.
- As chair, the Commissioner represented the CDRF at a joint event of the International Network for Digital Regulation Cooperation and the Organisation for Economic Co-operation and Development on the interplay between digital regulatory frameworks. The panel addressed synthetic media and deceptive design patterns, and the Forum signed a joint statement related to the event.
Lead: PRPA
Public Education and Outreach
Speaking Points
- An important function of my Office is to inform individuals about their rights and the OPC’s work on matters that affect their privacy, and to inform businesses and federal institutions about privacy obligations.
- Prioritizing privacy helps create conditions for a resilient Canadian economy and a more secure and enriching digital society. Fostering a culture of privacy, adopting privacy-by-design principles, and establishing privacy standards enables responsible innovation.
- To that end, OPC experts and I frequently engage with a variety of stakeholders, including students, businesses, privacy professionals, federal institutions, and other regulators; create resources for individuals and organizations to promote greater awareness and understanding of privacy issues; and deploy proactive communications.
- As part of my transformation plan for the OPC launched this past January, I am putting a greater emphasis on strategic partnerships, engagement and collaboration to help maximize our impact.
Background
- Resources for individuals include tips for online privacy, AI, and raising privacy concerns with a business; resources for teachers include graphic novels, discussion guides, videos, posters, lesson plans and presentations.
- Resources for organizations include issue-specific guidance, outreach events and videos for businesses, as well as Privacy Act Bulletins, lessons learned, and privacy news, trends and information for federal government institutions.
- Activities undertaken in 2024-2025 include:
- Delivered 100 speeches to various audiences; exhibited at an event for educators.
- Issued 48 news releases; responded to 137 media requests; logged almost 3 million website visits.
- Coordinated 2024 GPEN Sweep, developed products about deceptive design.
- Tips and radio campaign on identity theft; 2 email campaigns to teachers.
- Social media content and campaigns (e.g., Privacy Awareness Week, Cybersecurity, Small Business Week, Media Literacy, and Data Privacy Week).
Lead: CSR
OPC Departmental Results
Speaking Points
- In 2023-24, we continued preparing our Office for anticipated changes to our mandate while working towards our departmental plan targets.
- Our efforts and the infusion of temporary resources have allowed us to make modest progress as we continue our work to meet all targets.
- Recognizing the operational challenges that we face, we recently examined our internal processes and structures in order to optimize our programs and services to better respond to the needs of Canadians.
Background
- The latest available Departmental Results Report is for fiscal year 2023-2024.
- Targets met: 2
  - Percentage of private sector organizations that have good or excellent knowledge of their privacy obligations: 88% (target: at least 85%)
  - Percentage of federal and private sector organizations that find OPC’s advice and guidance to be useful in reaching compliance: 72% (target: at least 70%)
- Targets missed: 4
  - Percentage of complaints responded to within service standards: 50% (target: at least 75%)
  - Percentage of formal OPC recommendations implemented by departments and organizations: 82% (target: at least 85%)
  - Percentage of Canadians who read OPC information and find it useful: 68% (target: at least 70%)
  - Percentage of OPC recommendations on privacy-relevant bills and studies that have been adopted: 50% (target: at least 60%)
- Indicators with no target: 2
- The 2 indicators that measure our guidance to businesses and information to Canadians on key privacy issues had no target, considering the possibility of a transformed legal framework and the fact that our guidance is grounded in legislation and could quickly become outdated following such reform.
- At the program level, the OPC met 3 of its targets.
- Outcome-level and program-level results are published on GC Infobase.
Lead: Corporate Services
OPC Services to Canadians
Speaking Points
- The OPC is implementing the Government of Canada Policy on Service and Digital to better protect and promote privacy by making its programs and services more modern, easier to use, and more responsive to the needs of Canadians.
- OPC has launched a digital service optimization project to improve the usability and effectiveness of our online information request and complaints intake services.
- In addition to improving the client experience, this project is expected to deliver internal process efficiencies and strengthen our ability to improve our other services to Canadians.
Background
- The 11 services to Canadians the OPC reports upon include: Respond to inquiries; provide business advisory services; manage media relations; respond to parliamentarians’ information requests; oversee the contributions program; review privacy impact assessments (PIAs); offer government advisory and consultation; receive and review breach reports under the Privacy Act; investigate Privacy Act complaints; receive and review breach reports under PIPEDA; investigate PIPEDA complaints.
- We recently added the following 3 services which will be reported upon in future years: provide self-service information via the OPC website; offer a tool to assess real risk of harm from breaches; accept privacy codes of practice from Canadian organizations.
- Our initial research shows that 43% of complaints submitted through the current online complaint form fall outside our jurisdiction or cannot be investigated, and that the form is difficult for persons with disabilities to use.
- We have launched a project to optimize this critical service, making it easier to use and increasing the likelihood that the complaints we receive fall within our jurisdiction.
- We will use the lessons learned from this project to build our capacity to apply modern digital techniques to improve all our services.
Lead: Corporate Services
Trends and Statistics
In this section
- Trends and Statistics: Breaches
- Trends and Statistics: Complaints and Investigations
Trends and Statistics: Breaches
Speaking Points
- Breaches are consistently on the rise in both the public and private sectors, year after year. These include cyber incidents but also document losses and employee snooping.
- Last fiscal year, my office received breach reports affecting close to 21 million Canadian accounts.
- The vast majority of public sector breaches and over half of private sector breaches reported to my office last year created a real risk of significant harm to those whose personal information was captured.
- My Office has observed an increase in the number of cyber incidents affecting companies that oversee critical infrastructure, such as financial and telecommunications companies.
- Last fiscal year, the financial sector reported the largest percentage of breaches (30%) to my office.
- Recently, threat actors targeted the airline and fashion sectors.
Background
- Year to date in 2025-2026 compared to the same period last year, the OPC has received an equivalent number of breach reports. When comparing 2023-2024 to 2024-2025, we note close to 4% more breach reports.
Breaches reported to the OPC
FY       | PIPEDA | PA    | Total | % RROSH | Reported critical infrastructure cyber incidents / all PIPEDA cyber incidents
2025-26* | 166    | 144   | 310   | 76%     | 23 / 82 (28%)
2024-25  | 686    | 613   | 1,299 | 72%     | 61 / 429 (14%)
2023-24  | 693    | 561   | 1,254 | 65%     | 34 / 321 (11%)
Total    | 1,545  | 1,318 | 2,863 | 69%     | 118 / 842 (14%)
*As of June 30, 2025
- While we cannot confirm whether an increase in breach reports relates to more incidents or more reporting, we do observe that it is often the same institutions submitting reports, especially in the public sector.
Lead: CPE
Trends and Statistics: Complaints and Investigations
Speaking Points
- A core function of my office is to receive and investigate complaints about the personal information-handling practices of federal government institutions and private sector businesses.
- In 2024-25, we received 20% more complaints than the previous year and so far this year these numbers are growing significantly compared to the same period last year.
- In 2024-25, we concluded close to 1,800 investigations under both acts, with almost three quarters (73%) of these being in the public sector. These numbers are up from the previous year.
Background
- Complaints received and accepted over the past two years:
FY      | Privacy Act         | PIPEDA              | Total
        | Received | Accepted | Received | Accepted | Received | Accepted
2024/25 | 1,950    | 1,279    | 1,467    | 446      | 3,417    | 1,725
2023/24 | 1,749    | 1,113    | 1,108    | 446      | 2,857    | 1,559
- We accept significantly fewer PIPEDA than PA complaints. A complaint may not be accepted for various reasons, including when it is outside OPC jurisdiction. We also often have to redirect complainants to contact the organization’s privacy officer first.
- Under the Privacy Act, the OPC has greater discretion whether to investigate or not; many complaints received fall within provincial jurisdiction (e.g., health data).
- We are seeing a significant increase in complaints this year, especially under PIPEDA with an 80% increase YTD compared to last year.
Lead: CPE
Trends and Statistics: Government Advisory (Privacy Act)
Speaking Points
- My Office has a dedicated team that provides advice and recommendations to government institutions subject to the Privacy Act on their programs and activities. We do this through the review of Privacy Impact Assessments (PIAs), as well as by conducting outreach activities.
- While PIAs are a requirement pursuant to a Treasury Board Secretariat Directive, I continue to recommend that PIAs be made a legislated requirement under the Privacy Act.
- Generally speaking, the OPC saw an increase in PIAs submitted for review last fiscal. We also received more requests for consultations.
- We have held numerous outreach sessions for the public sector, which are in demand and well attended. This has included successful virtual events held jointly with TBS, dealing with privacy in government contracting and in the workplace.
Background
- Volume of work: During fiscal year 2024-2025:
- The OPC has received 138 PIA submissions and 108 requests for consultation from federal institutions, an increase of 16 per cent over the previous fiscal year.
- We also received 658 notifications of disclosure of personal information under paragraph 8(2)(m) of the Privacy Act, a similar increase of about 16 per cent.
- In 2025-26 year to date, we have received 91 PIAs and consultation requests, and 267 notifications under paragraph 8(2)(m). While these numbers are tracking lower than last fiscal year, historically we have seen an increase in activity in Q3 and Q4 (as projects and programs advance).
- Advice to TBS: We provided advice on central government guidance, including in relation to Generative AI, a new guide to the Privacy Act, and new directions on how PIAs are to be undertaken.
- Outreach: We held 18 outreach events last fiscal year, with 8 sessions completed or planned for Q1-Q2 of this year. A further 12 events are anticipated for Q3-Q4, including increased collaboration with TBS to maximize impact and reach.
Lead: CPE
Trends and Statistics: Business Advisory (PIPEDA)
Speaking Points
- A key function of my office is to provide advice to businesses to help them meet their privacy obligations under PIPEDA.
- As the privacy landscape continues to evolve, my office will continue to engage with businesses to help support technological innovation while also protecting privacy as a fundamental right.
- As part of its business advisory function, the OPC provides advice to the private sector under four program lines: (1) advisory consultations; (2) outreach; (3) breach engagements; and (4) review and approval of Codes of Practice (new function in 2025).
- Last fiscal, more than half of our advisory consultations touched on the issue of Artificial Intelligence, one of my Office’s strategic priority areas.
- Almost three quarters of our engagements involved the provision of foundational privacy support to small and medium-sized entities, which play a key role in economic growth and job creation in Canada.
- This year, in line with my Office’s first strategic priority to Maximize Impact, we are increasing our focus on the provision of advice to businesses through proactive engagements to encourage voluntary compliance expeditiously, without the need for lengthy, resource-intensive investigations.
Background
- In 2024-25, the OPC undertook 15 advisory consultations and 92 promotional activities (privacy clinics, exhibits, presentations, stakeholder meetings, targeted promotion sessions, etc.) in the private sector; 60% of those advisory consultations involved AI.
- The office also leveraged partnerships by conducting presentations and privacy clinics through various innovation hubs, business accelerators and chambers of commerce, for example, working with 12 partners to reach over 260 businesses in Atlantic Canada, Manitoba, and Alberta.
Lead: CPE
Legislation
In this section
Privacy Law Reform is Good for Canada
Speaking Points
- Data is one of the most important resources of the 21st century—it can support and fuel innovation, and how it is managed shapes Canada’s ability to lead and thrive in the digital economy. Yet, our federal privacy laws pre-date the modern digital economy, while technologies continue to evolve rapidly.
- For this new Parliament, adopting legislation to modernize our federal privacy laws so they fully meet the challenges of today’s data driven world would be a milestone towards prioritizing privacy.
- Prioritizing privacy is good for Canada because:
- It will position Canadian organizations to succeed in the digital economy, innovate responsibly, and earn public trust;
- It will enable Canadians to confidently reap the benefits of a digital society while having confidence that their data is protected and used responsibly; and
- It reinforces democratic values by protecting freedom, trust, dignity and autonomy: the very things that make democratic and economic participation possible and that unite us as Canadians.
Background
- Personal information of Canadians is being collected, used, and shared at an unparalleled pace and volume on a global scale.
- Continued advancement in the adoption of artificial intelligence (AI) and generative AI, the risk of significant harms caused by data breaches, and the increasingly complex nature of global data flows have put data protection at the forefront of the public interest.
- According to the OPC’s latest survey of Canadians (2024-2025): 91% of Canadians are concerned about their personal information being used to steal their identity; 88% of Canadians are concerned about their personal information being used to train AI systems; 87% of Canadians are concerned about privacy when using social media.
Lead: PRPA
Privacy Act Reform Recommendations
Speaking Points
- Over the years, my Office has made many recommendations to modernize the critically outdated Privacy Act.
- I have identified seven priority recommendations that would be the most impactful in enhancing privacy protections for Canadians:
- Collection Threshold: Create an explicit necessity and proportionality requirement for the collection of personal information.
- PIA Requirement: Require departments to conduct privacy impact assessments (PIA) in high-risk situations.
- Orders: Provide the Privacy Commissioner with the power to issue binding orders.
- Discretion to decline: Provide the Privacy Commissioner with discretion to discontinue or decline complaints.
- Safeguards: Adopt an explicit legal requirement to safeguard personal information.
- Breach Reporting: Create a legal requirement for reporting privacy breaches.
- Discretion to Report: Provide more discretion to the Privacy Commissioner to publicly report.
Background
- There have been engagements between the OPC and government officials on Privacy Act reform dating back to 2016. Of note are submissions the OPC made to the Standing Committee on Access to Information, Privacy and Ethics (ETHI) in 2009 and 2016, and involvement in Justice Canada consultations (2019-2021).
- In the last session of Parliament, we also appeared before several committees that were examining facets of the Privacy Act and related issues.
- Beyond the seven priority recommendations, we have distilled our other previous recommendations for reform into a list of 11 additional technical amendments.
- We have shared these recommendations with Treasury Board Secretariat and Justice Canada and welcome the opportunity to work with them to update the Privacy Act.
Lead: PRPA
PIPEDA Reform Priority Recommendations
Speaking Points
- I have identified seven priority recommendations for PIPEDA reform based on their potential impact on enhancing privacy protections and rights in Canada:
- Enforcement Powers: Provide the Privacy Commissioner with the power to issue binding orders, impose administrative monetary penalties and to conduct proactive audits.
- Fundamental Right to Privacy: Recognize privacy as a fundamental right in the purpose clause and embedded preamble.
- Children’s Privacy: Enhance children’s privacy rights by explicitly recognizing the best interests of the child and mandating the OPC to develop a code of practice for children’s privacy.
- De-identification: Promote innovation by including a framework for de-identification and anonymization.
- Right to Deletion and De-listing: Ensure individuals maintain control over their personal information by including a clear and explicit right to de-list and delete personal information.
- Privacy by Design and Privacy Impact Assessments (PIAs): Enhance accountability by requiring organizations to implement privacy by design, and conduct PIAs for high-risk activities.
- Trans-Border Data Flows: Institute rules and requirements to protect personal information moving outside of the country.
Background
- These priority recommendations would strengthen key regulatory powers and address systemic issues that we have observed, including emerging risks in the digital economy, and would bring PIPEDA more in line with other jurisdictions.
- Beyond the seven (7) priority recommendations, we have identified twelve (12) additional amendments that are intended to improve administrative timelines, processes, and powers that will allow the OPC to more effectively conduct investigations, address automated decision-making, and collaborate domestically.
- We have shared these recommendations with the Department of Industry and welcome the opportunity to work with them to update PIPEDA.
Lead: PRPA
Privacy Law Comparisons
Speaking Points
- Canadians need and expect modernized privacy laws, such as those we have seen enacted in other jurisdictions. Interoperability helps Canadians know their information is protected when it crosses borders and reduces compliance costs for organizations.
- Modernized private sector privacy laws typically include enhanced enforcement powers for the regulator, such as order-making power, proactive audits, and administrative monetary penalties.
- We have also seen other jurisdictions, such as the UK and EU, include provisions on trans-border data flows and special protections for children.
- The Privacy Act was introduced in 1983 and has not had substantive updates, whereas the UK, Australia, and New Zealand have recently updated their public sector privacy laws to include new provisions such as safeguards on automated decision making.
Background
- Some examples of protections in foreign laws that have informed our private sector law reform recommendations include:
- Proactive compliance audits (GDPR, Australia, UK, Ireland)
- Special privacy protections for children (GDPR, UK, Australia)
- Requiring PIAs for high-risk activities (GDPR, UK)
- Specific provisions for trans-border data flows (GDPR, UK, Australia, New Zealand)
- While PIPEDA was deemed to provide an “adequate” level of protection for transfers outside the EU under the GDPR in January 2024, the European Commission’s report suggested that reform could be an opportunity to enshrine stronger protections and requirements in legislation.
- The OPC has long called for Privacy Act reforms for the digital age, and to better align with international privacy developments. The UK updated its Data Protection Act 2018 in 2025, Australia updated its Privacy Act 1988 in 2024, and New Zealand reformed its Privacy Act 1993 in 2020.
Lead: PRPA
Bill C-2 (Strong Borders Act)
Speaking Points
- Bill C-2 is a highly complex piece of legislation that would amend over a dozen laws in addition to enacting a new statute.
- Given the nature and scope of the bill, my Office has been studying it closely, including in light of the OPC’s past work on other lawful-access proposals and stakeholder concerns about associated privacy impacts.
- I look forward to offering my views to Parliament once the bill has been referred to committee.
Background
- The Minister of Public Safety introduced Bill C-2 (the Strong Borders Act) on June 3, 2025.
- Of the bill’s 16 parts, those that present the most significant privacy risks include:
- amendments to the Criminal Code and CSIS Act that would create or modify a range of search powers, including a new warrantless information demand and, in the Code, an expansive new production order for subscriber data (Part 14);
- amendments to the Criminal Code and other statutes that would facilitate the cross-border sharing of information between law enforcement (Part 14);
- a new Supporting Authorized Access to Information Act that would compel electronic service providers to make technical modifications in order to provide authorized persons with access to private data (Part 15);
- amendments to the Canada Post Corporation Act that would allow the search and seizure of mail under any act of Parliament and authorize Canada Post to open letter mail in certain circumstances (Part 4); and
- amendments to the PCMLTFA that would allow banks and other reporting entities under that Act to receive personal information from the RCMP or other law-enforcement agencies for specified purposes (Part 16).
- Second reading of C-2 resumed on September 16, 2025. The Conservatives have criticized the bill for omitting measures related to fentanyl sentencing, gun crime, and bail reform and have also cited concerns related to privacy, civil liberties, and government overreach. The Bloc have raised similar concerns but are inclined to move forward to an in-depth committee study. The NDP have characterized C-2 as an “undemocratic power grab” and called on the Government to withdraw it.
Lead: PRPA
Bill C-4 (Political Parties)
Speaking Points
- Bill C-4 – the Making Life More Affordable for Canadians Act – amends the required elements for federal political parties’ existing privacy policy obligations.
- In my submission to the House of Commons Standing Committee on Finance (FINA), I proposed amendments to better protect electors’ personal information. I also reaffirmed the position that political parties should be subject to widely held privacy standards and that my Office should play a role to ensure the protection of privacy rights in this context.
- I have repeatedly called for political parties to be subject to privacy rules substantially similar to requirements set out for the public and private sectors in the Privacy Act and PIPEDA, while at the same time being adapted to the unique role played by political parties in the democratic process. This would help to ensure that voter participation can be maximized while at the same time protecting Canadians’ fundamental right to privacy.
Background
- Elements of political parties’ privacy policies are laid out in s. 385(2)(k) of the Canada Elections Act (CEA); Exemption for parties from provincial/territorial privacy law is at s. 446.4 of Bill C-4.
- The June 16, 2025, submission to FINA recommended that there should be:
- Requirements for political parties to identify the purposes for which personal information is collected, to seek consent (subject to express authority in the legislation), to limit collection, use and disclosure, and to provide a mechanism for access and correction to personal information under their control.
- Privacy breach notification provisions, and that breaches be reported both to affected individuals as well as to a relevant, independent body such as the Privacy Commissioner of Canada, Elections Canada and/or the Commissioner of Canada Elections.
- Provisions in the CEA for formal collaboration between the OPC, the Commissioner of Canada Elections and Elections Canada.
Lead: PRPA
Quebec’s Law 25
Speaking Points
- Law 25, formerly Bill 64, updated Quebec’s public and private sector privacy laws and has set the bar high for privacy protection in Canada.
- Quebec’s private sector privacy law (Act respecting the protection of personal information in the private sector, CQLR c. P-39.1) now includes provisions that protect the right to reputation and provide individuals with the right to contest automated decisions, and it applies to provincial political parties in Quebec.
- It also provides my Quebec counterpart with the ability to proactively verify an organization’s compliance and to issue monetary penalties for a wide range of violations, including those connected to the collection, use or disclosure of personal information.
- Ideally, I would like to see comparable requirements brought in federally. This would maintain interoperability between the federal law and Quebec’s substantially similar statute.
Background
- Law 25 was enacted on September 21, 2021, and entered into force in three phases. As of September 22, 2024, it is fully in force, with data portability rights implemented as the final phase.
- Many of the OPC’s recommendations on former Bill C-27 were informed by Law 25, and by the laws of other domestic and international jurisdictions.
- Unlike PIPEDA, Law 25 introduced provisions relating to privacy by default, PIAs, transborder data flows, proactive audits, and application to political parties.
- Law 25 also introduced provisions allowing for disclosure of personal information for research purposes without consent, but also provides for safeguards, such as completion of a PIA by the organization before disclosure.
- Law 25 includes a requirement that anonymization be done in accordance with “generally accepted best practices”. Regulations for Law 25 published on May 15, 2024, detail a three-stage process requiring organizations to define purposes, implement anonymization techniques based on best practices and risk of re-identification, and periodically reassess anonymized data.
Lead: Legal
Protecting Youth Online
Speaking Points
- Ensuring that children’s privacy is protected and that young people understand and are able to exercise their privacy rights is one of my key strategic priorities.
- Children and minors may be impacted by technologies differently than adults, be at greater risk of being affected by privacy-related issues, and therefore require special protections. It is essential that our privacy laws in Canada be modernized to protect children and the best interests of the child.
- This year, my office worked to better understand children and youth perspectives when it comes to their online privacy.
- We conducted focus groups and research into young people’s views on privacy, which found that they did not have good knowledge of their privacy rights or how to exercise them. We also conducted an online survey of parents and teachers, which found that the vast majority of parents worry about their children’s online privacy.
- I have also announced my plans to convene a youth advisory council, which will help inform my office’s work and outreach to young people.
Background
- OPC has advanced much work related to children’s privacy in 2024 and 2025, including:
- Hosting an international symposium on youth privacy in a digital age.
- Establishing a youth advisory council. Interviews have concluded and we expect the members to be announced in the Fall.
- Launching a consultation on the development of a children’s privacy code.
- Holding a consultation on age assurance, which can help to protect young people online. We are now preparing draft guidance on this topic, informed by our public consultation.
- Working closely with our international partners on children’s privacy, including as part of the Global Privacy Assembly’s Digital Education Working Group and on an international age-assurance working group.
Lead: PRPA
Regulating Artificial Intelligence (AI)
Speaking Points
- While AI systems may pose novel risks to privacy and raise new questions and concerns about the collection, use and disclosure of personal information, they are captured by existing laws.
- As the development and use of AI systems is fueled by collection of massive amounts of data, including Canadians’ personal information, privacy legislation will be central to their governance.
- Regardless of whether the government opts to introduce standalone AI legislation, it is important to ensure that privacy legislation remains effective in this time of technological change and that the OPC is able to effectively collaborate with other relevant regulators.
- We are recommending, for instance, that Parliament consider amending PIPEDA to require that organizations undertake PIAs for high-impact AI systems.
Background
- Recommendation 6, 2025 PIPEDA Priority Recommendations: “Enhance accountability by requiring organizations to implement privacy by design and conduct privacy impact assessments (PIAs) for high-risk activities.”
- In 2020, the OPC published “A Regulatory Framework for AI: Recommendations for PIPEDA Reform” following a public consultation. While these recommendations largely remain relevant, they pre-date the rise of generative AI and should thus be read alongside later OPC work.
- On the role of DPAs and regulatory collaboration, the 2024 G7 Statement on the Role of Data Protection Authorities in Fostering Trustworthy AI stated, “We believe that a cooperative approach, in which DPAs are at the forefront in working closely with other authorities and competent bodies, ensures a holistic governance framework that can effectively manage the risks and harness the benefits of trustworthy AI technologies whilst safeguarding fundamental rights.”
Lead: PRPA
Compliance and Enforcement
In this section
Investigations under the Privacy Act
Speaking Points
- While section 63 of the Privacy Act prevents me from discussing or disclosing details of investigations, I can confirm that some of my ongoing public sector investigations include:
- The contracting practices related to ArriveCAN, and more specifically the measures that were in place to protect personal information during the development of the app;
- A privacy breach resulting from unauthorized access to Global Affairs Canada’s virtual private network;
- A cyberattack which resulted in a breach of the personal information of federal government personnel who used government-contracted relocation services over the past 24 years; and
- Cyberattacks at the Canada Revenue Agency that led to more than 30,000 privacy breaches dating back to 2020.
- Some of my key completed investigations and reviews include:
- The loss of an unencrypted Universal Serial Bus (USB) storage device by the RCMP.
- Biennial review of the measures taken by FINTRAC to protect personal information.
Background
- Pursuant to subsection 29(1) of the Privacy Act, we receive and investigate complaints from individuals who may have been denied the right to access and correct their personal information, or who allege that personal information has been collected, used, retained, or disclosed in contravention of the Act.
- The Privacy Commissioner of Canada can also choose to initiate a complaint, under subsection 29(3), when he is satisfied that there are reasonable grounds to investigate. He can also decide, at his discretion, to carry out investigations under subsection 37(1) in respect of a federal institution subject to the Act.
Lead: CPE
Investigations under PIPEDA
Speaking Points
- Some of my key completed investigations include:
- MindGeek: In February 2024, I issued findings following my Office’s investigation into a complaint against Aylo, the operator of Pornhub and other pornographic websites.
- 23andMe: In June 2025, I issued, along with my UK counterpart, findings into a breach at 23andMe, a direct-to-consumer genetic testing website.
- Google: On September 9, 2025, I issued findings following my Office’s investigation of a complaint against Google related to de-listing.
- TikTok: On September 23, 2025, I issued findings following a joint investigation, with my counterparts in QC, AB and BC, into TikTok’s privacy practices, in particular as they relate to children in Canada.
- Ongoing ones include:
- ChatGPT: In May 2023, I commenced a joint investigation with my counterparts from QC, AB and BC into OpenAI, the company behind the AI-powered chatbot, ChatGPT.
- Certn: In May 2024, I commenced an investigation with my counterparts in BC and AB, into Certn, a tenant screening service that performs background checks.
- Loblaws: In July 2024, I commenced an investigation after receiving several complaints in which individuals alleged that they were unable to delete their PC Optimum accounts.
Background
- Pursuant to s. 12(1) of PIPEDA, we investigate complaints against organizations engaged in commercial activity. If there are reasonable grounds to investigate a matter under the Act, we can also initiate a complaint under s. 11(2).
- Where investigations are ongoing, due to confidentiality obligations, the OPC cannot provide further details. However, in each instance (other than where we already issued findings), we intend to issue our findings in the coming months.
Lead: CPE
PCMLTFA Codes of Practice
Speaking Points
- As of March 4, 2025, reporting entities under the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (PCMLTFA) can develop codes of practice governing the sharing of personal information without consent among themselves and submit them to me for review and approval. I am fully supportive of this initiative.
- However, I have received no additional funding for this new activity. The volume of codes submitted to date is significantly higher than what was estimated in the Regulatory Impact Analysis Statement.
- My Office is committed to ensuring the success of this initiative. To that end, we have reallocated internal resources to undertake this work.
- I have appreciated FINTRAC’s close collaboration on this important file. My staff has regular meetings with them to support this initiative.
Background
- Section 11.01 of the PCMLTFA allows for the disclosure, collection, and use of personal information without consent, provided the disclosure is made in accordance with the regulations. As per the regulations, reporting entities can establish and implement a code of practice for this purpose and submit it to the Commissioner for approval. The code of practice must, among other things, provide substantially the same or greater protection than PIPEDA.
- Participants in a code are protected from criminal and civil proceedings if they disclose, collect, or use personal information pursuant to a code in good faith.
- The OPC has 120 days (with a potential 15-day extension) to approve a code of practice. If there is no decision in that period, a code is deemed approved. Codes are to be reviewed every 5 years or if there is a significant change.
- Codes must also be submitted to FINTRAC, who may provide comments to the OPC for OPC’s consideration in its review.
- To date, the OPC has received 6 codes and understands that other entities remain interested. The Regulatory Impact Analysis Statement published along with the amended regulations estimated that three codes would be submitted over a 10-year period.
Lead: CPE
Public and private-sector breach-reporting obligations
Speaking Points
- Private-sector organizations subject to PIPEDA are required to report privacy breaches to the OPC when there is a real risk of significant harm to an individual.
- Conversely, federal institutions are required to report such breaches, but only pursuant to Treasury Board policy and not the Privacy Act. I have recommended that breach-reporting obligations be given the force of law under a modernized Privacy Act.
- Increasingly, third-party organizations that provide services to both the public and private sectors (e.g., IT solutions/platforms) are being targeted in cyberattacks that can have a cascading impact on multiple client organizations. This was the case, for example, with the Ticketmaster and PowerSchool breaches.
- I am concerned that, under PIPEDA, service providers have often not reported privacy breaches directly to my office and that impacted individuals may not have been informed. This is why I hope more explicit reporting obligations will be considered in future iterations of modernized private-sector privacy legislation.
Background
- Under section 4.2.12 of the TBS Policy on Privacy Protection, federal organizations subject to the Privacy Act must report material privacy breaches to the OPC within 7 days of determining that a breach is material.
- Under section 10.1(1) of PIPEDA, organizations must report breaches to the OPC as soon as feasible after determining that there is a real risk of significant harm (RROSH) to an individual. RROSH is determined based on the sensitivity of the personal information and the probability that the personal information has or will be misused.
- The OPC continues to see a significant gap between the public and private sectors when it comes to the reporting of privacy breaches involving cyber incidents.
- In 2024-2025, the OPC received 429 breach reports involving cyber incidents from the private sector but only 55 from federal institutions. Together, this represents a 35% increase in reported cyber incidents under both Acts compared to the previous year.
Lead: CPE
Key Investigations
In this section
Key Investigation: TikTok
Speaking Points
- My Office, along with my counterparts in Quebec, British Columbia and Alberta, published the joint report of findings from our investigation into TikTok on September 23, 2025.
- Our investigation found serious deficiencies in TikTok’s age assurance mechanisms, which allowed hundreds of thousands of Canadian children under the age of 13 to access TikTok each year, contrary to its own terms.
- We also found that TikTok failed to obtain meaningful consent from adults and teens for its collection and use of user data, including sensitive data of younger users and biometric data.
- I am pleased that TikTok committed to improving its age assurance measures to keep children off its platform and to enhance its privacy communications to ensure meaningful consent. Our Offices will be working with TikTok over the coming months to ensure recommendations have been addressed.
Background
- In February 2023, the OPC, Quebec, British Columbia and Alberta launched a joint investigation into TikTok.
- The investigation was launched in the wake of now-settled class-action lawsuits in the United States and Canada, and numerous media reports related to TikTok’s collection, use and disclosure of user data.
- The investigation examined whether TikTok obtained valid consent for the collection and use of users’ information for the purposes of targeting ads and customizing content, as well as whether it was collecting children’s personal information for an appropriate purpose.
- The entity under investigation was TikTok Pte Ltd. (Singapore-based company) as it is the business entity responsible for Canadians’ personal information and TikTok’s privacy practices.
- The investigation took approximately 30 months to complete given the level of complexity of the issues and technology under investigation, as well as the need to confer amongst provincial partners throughout the process in coming to our conclusions.
Lead: CPE
Key Investigation: OpenAI (ChatGPT)
Speaking Points
- In May 2023, my Office commenced an investigation into the practices of OpenAI in relation to its ChatGPT service. This has been a joint endeavour with counterparts in Alberta, British Columbia, and Quebec.
- Among the issues under examination are consent, openness, access, accuracy, and accountability. We have also been looking into whether OpenAI collects, uses and discloses personal information for appropriate purposes, and whether this collection is limited to information that is necessary for these purposes.
- We aim to release our findings by the end of this year.
- It is important that AI and other related emerging technologies be developed and deployed in a responsible and privacy-protective manner. This is why my Office has made it a strategic priority to address and advocate for privacy in this time of technological change.
Background
- ChatGPT is a natural language processing tool (or chatbot) driven by AI technology. The language model can answer questions and assist users with a range of tasks, such as composing emails and essays.
- In April 2023, the OPC launched an investigation into ChatGPT after receiving a complaint alleging that the company collected (“scraped”), used, and disclosed the complainant’s personal information for the purpose of its commercial text-generation service without first obtaining their consent. We closed this investigation in May 2023 to pursue a broader, joint, commissioner-initiated complaint.
- DPAs around the world, including many in Europe, initiated investigations into ChatGPT. The European Data Protection Board has launched a dedicated task force to “exchange information on possible enforcement actions” and issued a report in May 2024, sharing a preliminary assessment of OpenAI’s practices against the GDPR.
- As a member of the Global Privacy Assembly’s working groups on AI and International Enforcement Cooperation, we are exchanging information and learning from the experiences of our counterparts.
Lead: CPE
Key Investigation: 23andMe
Speaking Points
- In June, along with my counterpart Commissioner Edwards from the UK Information Commissioner’s Office (ICO), I released the joint Report of Findings from our investigation into a global data breach at 23andMe.
- Our investigation found that 23andMe did not implement appropriate controls to protect the highly sensitive personal data in its control and did not adequately notify affected customers and regulators, including my Office, after the breach as legally required.
- While the ICO issued a monetary penalty to the organization, my powers are limited to issuing non-binding recommendations.
- Canada’s privacy laws should be modernized to grant me powers similar to those of my international counterparts so that Canadians’ privacy can be adequately protected.
Background
- Between April and September 2023, a hacker carried out a credential-stuffing attack on 23andMe, exploiting reused login credentials that were stolen in previous unrelated data breaches.
- The incident affected the personal information of 7 million customers, including approximately 320,000 Canadians, and involved some of the most sensitive information about individuals.
- The compromised data in the 23andMe breach included highly sensitive information related to health, race and ethnicity, as well as information about relatives, date of birth, sex at birth and gender.
- Following the breach, 23andMe filed for Chapter 11 bankruptcy in the United States. The Bankruptcy Court approved the sale of 23andMe to TTAM Research Institute, a non-profit led by 23andMe’s co-founder and long-time CEO.
Lead: CPE
Key Investigation: Certn
Speaking Points
- In May 2024, I launched a joint investigation with my counterpart in British Columbia into Certn Canada, a company that offers background check services, including tenant-screening services to landlords. Our colleagues in Alberta joined the investigation a few weeks later.
- In recent years my Office has received several privacy-related complaints from tenants against landlords, property managers, and third-party property management service providers.
- Requiring prospective tenants to consent to extensive background checks could have profound implications on the ability of Canadians to secure housing, particularly in a challenging rental market.
- Accordingly, we are examining whether Certn’s collection, use, and disclosure of personal information is for an appropriate purpose, whether consent is valid and meaningful, and whether the information obtained is accurate.
Background
- Certn operates across Canada and internationally. It promotes itself as a tech company “innovating every part of the background screening process.”
- Certn claims to collect, use, and disclose vast amounts of personal information – which may be sensitive – by way of over 100,000 databases from over 200 countries and territories (according to a previous version of Certn’s website). It states that many of these sources are “publicly available.”
- Certn’s services include criminal records checks, credit checks, education and employment verification, international background checks, social media scans, and what it calls “Softcheck,” or real-time searches of publicly available datasets that it promotes as suitable for tenant screening.
Lead: CPE
Key Investigation: X
Speaking Points
- In February 2025, following the receipt of a complaint, my Office opened an investigation into social media platform X regarding the alleged use of Canadians’ personal information to train its artificial intelligence models.
- My investigation is being conducted under s.12 of the Personal Information Protection and Electronic Documents Act. My office is examining whether X’s collection, use, and disclosure of Canadians’ personal information to train these models complies with the Act.
- As the investigation is ongoing, I cannot provide further information at this time.
Background
- The investigation pertains to the training of X’s AI models, including Grok.
- The company xAI, which developed the large language model-based chatbot Grok, acquired X in an all-stock deal, as was reported in March 2025.
- The complaint was filed by (now former) MP Brian Masse, who alleged that X may be “using Canadians’ data to train artificial intelligence to influence their political decisions.”
- In a statement responding to the announcement of the investigation, the NDP stressed the importance of transparency in algorithms to ensure accountability and counter misinformation.
Lead: CPE
Key Investigation: LinkedIn
Speaking Points
- My Office has engaged with LinkedIn in recent months to discuss privacy issues related to its use of Canadian members’ personal information to train its generative AI models.
- This engagement was prompted by media reports indicating that LinkedIn had started doing so without prior notice.
- The company proactively decided to pause the practice in Canada while it engaged with my Office.
- LinkedIn has announced it will resume the practice in early November, at which time it will also begin sharing members’ personal information with its parent company Microsoft to train its own Generative AI models.
- Following discussions with my office, LinkedIn informed its members of this initiative via an in-product banner and an email, which included a link to an opt-out mechanism.
- I should note that even when it is publicly accessible, personal information remains subject to privacy laws and must be adequately protected.
- Engaging with organizations to promote the responsible development and use of trustworthy and privacy-friendly technologies, like AI, is part of my Office’s strategic priority of advocating for privacy amid technological change.
- Our engagement with LinkedIn is ongoing.
Background
- LinkedIn’s engagement with our Office has been voluntary; the OPC did not launch a formal investigation into LinkedIn’s use of personal information to train its generative AI.
- On December 10, 2024, our Office published a statement welcoming LinkedIn’s commitment to pause the training of AI models using the personal information from Canadian member accounts, as did other data protection authorities, including the ICO and Hong Kong’s PCPD.
Lead: CPE
Key Investigation: PowerSchool
Speaking Points
- On February 11, 2025, I announced the launch of an investigation into a breach of security safeguards at PowerSchool following a cyberattack that impacted millions of Canadian students, parents, and educators.
- Following engagements with my Office, PowerSchool agreed to implement specific measures in July 2025, in addition to those it had already implemented, to ensure that the company’s security measures are appropriately strengthened.
- These were formalized in a Letter of Commitment which was made public on my Office’s website.
- In light of PowerSchool’s commitments, I decided to conclude the investigation.
- I took this approach to ensure that the matter was resolved in an expedited and efficient manner, and that Canadians’ personal information was adequately protected.
- My Office is actively monitoring PowerSchool’s commitments to ensure they are fully met. If I am not satisfied with the measures implemented, I will consider further actions, such as a commissioner-initiated complaint.
Background
- In December 2024, a hacker used compromised credentials to obtain data such as names, contact information, dates of birth and, in some cases, medical information and Social Insurance Numbers of millions of individuals across Canada.
- PowerSchool has already taken or committed to take measures, such as:
- Requiring the use of the company’s secure remote access solution (requires single sign-on and MFA) to access their customer support platform environment.
- Restricting access to, and tightening password and access controls for, the affected customer support portal.
Lead: CPE
Key Investigation: Nova Scotia Power
Speaking Points
- On April 25, 2025, Nova Scotia Power detected a cyber incident on their network and initiated their incident response with the help of external cybersecurity experts.
- My Office began collaborating with Nova Scotia Power immediately after being made aware of the breach to ensure that the organization implemented measures expeditiously to adequately mitigate the risk of harm to affected individuals and the impact on Canadians.
- We subsequently received complaints about the matter and on May 28, 2025, launched an investigation into the breach. Our engagement will aim to ensure that the company protects its systems against the risk of a subsequent breach.
- My Office was advised that affected individuals were notified, and that the company is offering a five-year subscription for credit monitoring.
- As this matter is the subject of an open investigation, I cannot share any further details at this time.
Background
- Nova Scotia Power determined that on or around March 19, 2025, a threat actor (TA) started gaining access to its networks, and client personal information (PI) stored on its systems was exfiltrated. The company also determined that breached data had been shared by the TA on the dark web.
- Breached PI of customers (current and former) includes name, phone number, email address, mailing addresses, date of birth, customer account history (including customer payment/billing/credit history/bank account numbers), driver’s license number, and social insurance number.
- The company identified and notified over 280,000 affected individuals. To date the OPC has received 77 complaints.
- The Nova Scotia Energy Board (NSEB) is also investigating the breach. We have had discussions with NSEB regarding our respective processes, within confidentiality limitations under PIPEDA.
Lead: CPE
Key Investigation: ArriveCAN
Speaking Points
- On March 19, 2024, my Office launched an investigation under the Privacy Act against the Canada Border Services Agency (CBSA), following receipt of a complaint related to the development of the ArriveCAN mobile app.
- The investigation has been focused on the review of the contracting practices related to ArriveCAN and, more specifically, the measures that were in place during the development of the app to protect personal information.
- As part of the investigation, my office has also taken into consideration privacy issues raised in the motion tabled in the previous Parliamentary session by the House of Commons Standing Committee on Government Operations and Estimates (OGGO) in relation to the contractors that worked on the development of the ArriveCAN app.
Background
- On March 7, 2024, the OPC received a complaint from an MP regarding the CBSA’s contracting practices, and specifically, about the fact that some contracted resources involved in the app development may not have had the appropriate/required security clearances.
- On March 14, 2024, OGGO adopted a motion calling upon the OPC to investigate the ArriveCAN app, including the work of all contractors and subcontractors, to determine whether the privacy and personal information of Canadians was adequately protected.
- We acknowledged OGGO’s motion on May 16, 2024, and confirmed that our investigation of the CBSA would take into consideration the issues raised therein.
- The OPC had previously investigated the ArriveCAN app as part of a special pandemic report, following a complaint it had received alleging that the app had generated inaccurate information that resulted in over 10,000 travellers receiving erroneous notifications to quarantine under the emergency measures.
Lead: CPE
Key Investigation: CRA Breaches
Speaking Points
- In February 2024, I released a special report about credential stuffing attacks that took place in 2020, in which bad actors gained access to certain Canada Revenue Agency (CRA) accounts. During that investigation, my office learned that the attackers accessed and modified personal information held by the CRA for financial gain.
- During the final stages of this investigation, the CRA advised my office of unreported breaches.
- Per our standard practice, my office was actively engaged with the CRA regarding these breaches. In May 2024, we began receiving retroactive quarterly reports on these breaches, some of which dated back to 2020.
- On October 29, 2024, I received a complaint related to these breaches and subsequently launched an investigation.
- This investigation remains ongoing; therefore, I am limited in what I can share at this time.
Background
- The CRA has been cooperative with the OPC. However, there have been some delays in receiving complete representations during this investigation.
- Since May 2024, the CRA has been reporting to our office on these breaches on a quarterly basis. The most recent quarterly report was submitted in August 2025.
- On December 5, 2024, the Privacy Commissioner appeared before ETHI to discuss the privacy breaches at the CRA.
- The current investigation concerns over 30,000 breaches.
Lead: CPE
Key Investigation: WADA
Speaking Points
- In November 2024, I launched an investigation into the World Anti-Doping Agency (WADA) after receiving a complaint about its handling of biological samples collected from athletes.
- The complaint alleges that WADA disclosed personal information to international sporting federations to assess athletes’ sex-based eligibility without their knowledge or consent, and for a purpose that would not be considered appropriate under PIPEDA.
- As this is an ongoing investigation, I am limited as to what I can share at this time.
Background
- Based in Montreal, WADA is responsible for monitoring and fighting the use of drugs in sports.
- It became subject to PIPEDA in 2015 following international pressure for Canada to ensure that WADA’s vast holdings of sensitive personal information are subject to proper oversight.
- The OPC had launched an investigation into WADA in 2016 following a breach of its Anti-Doping Administration and Management System (ADAMS) which resulted in the public disclosure of athletes’ personal information, including their health information.
- Following that investigation, the OPC entered into a compliance agreement with WADA to implement remedies to address the identified deficiencies in its safeguards. These included:
- developing a comprehensive Information Security framework;
- implementing additional safeguards related to access controls;
- employing encryption at rest for ADAMS data in their custody; and
- ensuring that its application security and intrusion detection is properly configured and that systems and logs are adequately/actively monitored.
Lead: CPE
Key Investigation: Ticketmaster
Speaking Points
- In May 2024, after learning from media reports that Ticketmaster had suffered a breach, my Office immediately engaged with the company.
- We learned that a malicious threat actor breached a third-party cloud-based data storage platform that Ticketmaster used, gaining access to the personal information of millions of individuals, including those of Canadians.
- Following the receipt of a complaint, in July 2024, I launched an investigation.
- My investigation aims to determine whether Ticketmaster Canada had adequate safeguards in place to protect the personal information under its control. It also seeks to determine whether it notified the impacted individuals as soon as feasible where it was reasonable to believe that the breach had created a real risk of significant harm.
- As this is an ongoing investigation, I am limited as to what I can share at this time.
Background
- Between April 2 and May 18, 2024, Ticketmaster Canada experienced a cybersecurity incident.
- The personal information that was breached includes information that can be considered sensitive. For a subset of individuals, date of birth and passport number may have also been accessed.
- The breach was linked to a third-party service provider (Snowflake) that Ticketmaster uses to store client information. As part of its investigation, the OPC contacted Snowflake to get information about the incident.
- Under PIPEDA, Ticketmaster is deemed the data controller and is therefore the entity under investigation.
Lead: CPE
Policy and Guidance
In this section
Biometrics Guidance
Speaking Points
- Biometric technology is being used more widely and in new ways, and these uses can involve highly sensitive personal information that is uniquely identifying, stable over time, and difficult to change (e.g. faceprints, voiceprints, DNA, fingerprints, etc.).
- My Office issued guidance to federal institutions and private sector organizations in August 2025 outlining key requirements and considerations for processing biometric information.
- The guidance is meant to help institutions and organizations ensure that their use of biometric technology is done in a way that reduces the risks involved, complies with the law, and respects the right to privacy.
- OPC consulted broadly on the guidance before release and incorporated extensive feedback from industry, public institutions, civil society, and other stakeholders.
Background
- Recent applications of biometric technology with which our office is familiar include age verification, fraud prevention, and location access and security.
- OPC launched a public consultation on a draft version of the guidance in 2023. We received 34 written submissions and met with 31 organizations from varied stakeholder groups.
- Key feedback from the consultation centered on alignment of the guidance with legal requirements, precision in technical and policy explanations, variation in risk profiles and sensitivity across biometric initiatives, and additional best practices for privacy protection.
- The private sector guidance emphasizes the need to obtain valid consent, using an appropriate form of consent, and ensuring that the purpose for collecting, using, and disclosing biometric information is appropriate.
- The public sector guidance emphasizes the need to ensure that there is legal authority for the collection, use, and disclosure of biometric information, and the importance of assessing the necessity and proportionality of proposed biometrics initiatives.
Lead: PRPA
Age Assurance
Speaking Points
- Age assurance can be a reasonable and effective way to create safer online experiences for youth. We are seeing it being introduced in many global jurisdictions, in support of a variety of policy goals.
- However, unless carefully developed and used, age assurance could also lead to over-collection of sensitive personal information, tracking of online activities, or other significant privacy impacts for both young people and adults.
- Prioritizing privacy in age assurance is both necessary and possible. My office is currently developing guidance on both when age assurance should be used and how to design it to be privacy protective, informed by a public consultation we ran last year.
Background
- Age assurance is an umbrella term which encompasses age verification (e.g. ID checking), age estimation, and age declaration.
- Prominent age assurance-related legislation includes the UK Online Safety Act, the EU Digital Services Act, the Australian Online Safety Act, and multiple US state-level laws.
- Canada’s Bill S-209 (“Protecting Young Persons from Exposure to Pornography Act”) is currently before the Standing Senate Committee on Legal and Constitutional Affairs.
- Key policy goals for age assurance legislation include limiting access to sexually explicit material, limiting exposure to other harmful material (such as that related to self-harm), or enforcing age minimums for social media access.
- The Australian Age Assurance Technology Trial, which concluded in August 2025, found that “age assurance can be done … privately, efficiently and effectively” and that there are “no substantial technological limitations preventing its implementation to meet policy goals.”
- OPC ran an exploratory consultation on age assurance from June to September 2024 and received 40 responses from a wide range of stakeholders. We published a “What We Heard” report in March 2025.
Lead: PRPA
Children’s Code
Speaking Points
- My Office intends to develop a Canadian children’s privacy code that will clarify obligations under PIPEDA and set out the OPC’s expectations regarding organizations’ handling of children’s personal information.
- Over the summer, the OPC ran an exploratory consultation to help us better understand stakeholder views on children’s privacy and explore how a code can be developed to best protect their privacy rights. We are currently analyzing submissions, and plan to publish a “What We Heard” report in early 2026 followed by a draft code in the Spring.
- Many jurisdictions around the world, such as the UK, Ireland and California, have benefited from the release of guidance and/or the adoption of legislation that requires the design of products and services to address the specific needs and best interests of children.
- Once implemented, a code of practice can empower children to exercise their privacy rights and protect themselves against potential harms. Modernized privacy legislation with special protections for children can provide greater assurance that children are protected.
Background
- Children’s Privacy Codes in other jurisdictions include:
- The California Age-Appropriate Design Code Act, passed September 15, 2022, was drafted by legislators and introduced directly into law.
- The UK’s Age-Appropriate Design Code was developed after legislation was passed requiring the UK Information Commissioner’s Office to develop a legally binding code and took effect September 2, 2020.
- The Irish Data Protection Commission released a code in the form of guidance titled Fundamentals for a Child-Oriented Approach to Data Processing in December 2021.
- The OPC consultation received over 30 responses from various stakeholder groups that will inform the development of a code and advance the OPC’s 2024-27 Strategic Plan.
Lead: PRPA
Youth Advisory Council
Speaking Points
- In June, I announced the establishment of a Youth Advisory Council to help support my Office’s policy development, consultation and public education functions.
- The aim is to create an avenue for young people to share their insights, experiences, and ideas on the privacy issues that matter the most to them. Their voices will play an important role in deepening our understanding of how these issues impact young people, which will help to inform our efforts where they can have the most impact.
- It is important that all organizations (both businesses and government) actively involve youth when assessing the privacy impacts of projects involving youth’s personal information.
Background
- OPC’s Youth Advisory Council (YAC) will be composed of seven individuals between the ages of 13 and 17 from across Canada, named for a two-year term. This age group is highly active online and starting to navigate digital space more independently, yet they remain vulnerable to privacy risks.
- We are aiming to create a diverse group that reflects the broad range of experiences and perspectives of young people across Canada and expect the members to be announced in the Fall/Winter timeframe.
- We received 61 applications and conducted 18 interviews. PRPA is developing a list of final recommended candidates and hopes to inform successful candidates in the next few weeks, following Commissioner approval.
- The OPC conducted a PIA for the YAC, given that personal information would be used to make an administrative decision (to select members) and in special recognition of the sensitivity of children’s personal information.
- Other data protection authorities with YACs include Ontario’s Office of the Information and Privacy Commissioner and Australia’s eSafety Commissioner.
Lead: PRPA
Contributions Program
Speaking Points
- My Office provides up to $500,000 a year for research and public education initiatives on a range of privacy issues related to PIPEDA through our Contributions Program.
- These independent projects generate new information, expertise, and understanding that can help organizations strengthen privacy protections and assist Canadians in exercising their privacy rights in their interactions with the commercial sector.
- Each year’s call for applications focuses on a particular theme that aligns with the priorities of the office. This year we solicited projects that increase knowledge and awareness with respect to smart devices, including associated data flows and technical, policy or legislative steps that can be taken to ensure these devices have privacy built into them.
Background
- Established in 2004, the Contributions Program has provided nearly $10.5 million in funding to cultivate expertise and understanding on a broad range of privacy issues related to the private sector.
- The program has funded a wide diversity of projects, including from the First Nations Information Governance Centre (on data sovereignty and PIPEDA), Vancouver Island University (on young people and AI) and the University of Western Ontario (on dark patterns).
- The Office uses the resulting knowledge and insights to build a strong foundation for advising Parliament, developing policy positions, conducting investigations and promoting public awareness of privacy issues for Canadians.
- All projects must relate to PIPEDA since the program exists under that Act. Proposals are evaluated based on merit by OPC subject-matter experts and, where necessary or appropriate, external peer reviewers. In most years, successful applicants each receive approximately $50,000, up to a maximum of $100,000 per project.
- This year, the program’s Terms and Conditions were renewed for five years by the Minister of Justice (until March 31, 2030).
- The full list of funded projects is published annually on the OPC website, along with summaries of completed projects from previous years.
Lead: PRPA
Transborder Data Flows / Data Sovereignty
Speaking Points
- Interoperability can provide businesses with regulatory certainty and help to reduce compliance costs, while maintaining high privacy standards.
- To that end, I work closely with my international counterparts to advance the concept of Data Free Flow with Trust by identifying commonalities between existing regulatory approaches.
- My Office has advocated for amending PIPEDA to include explicit and separate provisions addressing transborder data flows to ensure that personal information does not leave Canada unless comparable protections are in place. This would align our law with modern privacy laws such as those of Australia and New Zealand, and with the GDPR.
- PIPEDA does not prohibit organizations in Canada from transferring personal information to organizations in another jurisdiction, nor does it distinguish between domestic and international transfers.
- It requires that organizations be transparent about their data practices, and clarifies that organizations remain responsible for personal information transferred to a third party for processing and must ensure that a “comparable level of protection” is provided.
Background
- Principle 4.1.3. of PIPEDA requires organizations to use contractual or other means to provide a “comparable level of protection” when transferring information to a third party for processing.
- The privacy laws of Australia, New Zealand, and the EU provide for specific transfer mechanisms, such as adequacy rulings, standard contractual clauses, codes of conduct or other schemes such as binding corporate rules.
- The OPC’s submissions on the former Bill C-11 and Bill C-27 recommended that a TBDF framework be implemented to consider, among other things: 1) to whom the obligations apply; 2) enhanced accountability measures; 3) criteria to be met prior to a transfer taking place; and 4) jurisdictional assessments of protections.
- In 2024, the GPA’s Global Frameworks and Standards Working Group produced, and OPC co-sponsored, a resolution on DFFT. The G7 DPA Roundtable also maintains DFFT as a pillar of the group’s work and is looking at data transfer tools to foster interoperability (i.e. comparisons of certification mechanisms).
Lead: PRPA
Privacy implications of Artificial Intelligence
Speaking Points
- Artificial intelligence (AI) has privacy implications which arise in both the development and use phases.
- Development of AI systems requires massive amounts of data, including personal information; even where individuals are aware of this collection, it is not always clear that they have consented to it or are able to exercise their privacy rights.
- Use of AI systems can be associated with bias or discrimination, non-transparent decisions, or a lack of accountability. AI systems can also process large amounts of data, leading to more sophisticated tracking or surveillance of individuals.
- However, none of these are inherent problems; designing privacy into the development and use of AI systems can lead to responsible innovation.
Background
- OPC Survey of Canadians: 34% of Canadians are “extremely concerned” about their privacy when using AI, the highest among surveyed topics.
- Recommendation 6, 2025 PIPEDA Priority Recommendations: “Enhance accountability by requiring organizations to implement privacy by design and conduct privacy impact assessments (PIAs) for high-risk activities.”
- G7 DPA Roundtable statement on responsible innovation: Common considerations that support prioritizing privacy in practice include:
- Determining whether the processing of personal data is necessary;
- Conducting an assessment of privacy risks that may be created or exacerbated by the technology, and making appropriate design, development and deployment decisions to mitigate identified risks;
- Designing technologies in a way that supports the exercise of privacy rights; and,
- Monitoring and regularly re-assessing the effectiveness of risk mitigations.
Lead: PRPA
Privacy Implications of Foreign Ownership
Speaking Points
- When foreign-owned companies operate in Canada, Canadians’ personal information may be sent to another jurisdiction where it could be accessed by that jurisdiction’s courts, law enforcement and national security authorities.
- Although PIPEDA does not prohibit such transfers, companies must be transparent and advise customers that their information may be sent to another jurisdiction where it could be accessed by foreign authorities.
- PIPEDA permits the disclosure of personal information without knowledge or consent in certain law enforcement and national security contexts where a government institution has lawful authority to request the information.
- Should access by a foreign government raise national security concerns, the government may initiate a national security review under the Investment Canada Act (ICA). Although my Office has not been involved in any specific reviews under the ICA, we are available to be consulted during such a review, and we have provided general advice to government departments regarding privacy and ICA reviews.
Background
- PIPEDA permits the disclosure of personal information without knowledge or consent to a requesting government institution, or part thereof, that has identified its lawful authority to obtain the information and indicated that: it suspects the information relates to national security (s. 7(3)(c.1)(i)); or the request is for the enforcement of, or carrying out an investigation relating to the enforcement of any law of a foreign jurisdiction (s. 7(3)(c.1)(ii)).
- PIPEDA also permits disclosure on an organization’s own initiative if it has reasonable grounds to believe the information relates to a contravention of foreign laws, or if it suspects the information relates to national security (s. 7(3)(d)).
- On August 26, 2019, the OPC wrote to ISED offering advice regarding how to incorporate privacy considerations in ICA reviews. In October 2020, we provided advice to the RCMP on the interplay of privacy and the ICA.
Lead: Legal
Litigation
In this section
- Litigation: Facebook
- Litigation: Aylo
- Litigation: Google
Litigation: Facebook
Speaking Points
- In 2024, the Federal Court of Appeal issued an important decision about Facebook’s data practices, acknowledging that international data giants, whose business models rely on users’ personal information, must respect Canadian privacy law and protect individuals’ fundamental right to privacy.
- As my Office had done in its 2019 investigation, the Federal Court of Appeal concluded that the social media platform had breached the requirement to obtain meaningful consent from users and had failed to appropriately safeguard users’ personal information.
- The Supreme Court of Canada has recently agreed to hear Facebook’s appeal of the Federal Court of Appeal’s decision.
Background
- In March 2018, the OPC received a complaint about Facebook arising from media reports that Cambridge Analytica had accessed the personal information of Facebook users without their consent via a third-party application (TYDL App).
- The OPC and the Office of the Information and Privacy Commissioner for British Columbia jointly investigated and found that Facebook had not obtained meaningful consent from its users before disclosing their personal information and that it had not implemented adequate safeguards.
- The OPC filed an application with the Federal Court under s. 15 of PIPEDA seeking, in particular, an order requiring Facebook to correct its practices to comply with PIPEDA, as Facebook did not agree to implement the OPC’s recommendations.
- On April 13, 2023, the Federal Court dismissed the Commissioner’s s. 15 application and the OPC appealed this decision to the Federal Court of Appeal.
- On September 9, 2024, the Federal Court of Appeal allowed the OPC’s appeal with costs and declared that Facebook’s practices between 2013 and 2015 breached PIPEDA.
- On June 12, 2025, the Supreme Court of Canada granted Facebook’s application seeking leave to appeal the judgment of the Federal Court of Appeal. Both parties must file their respective memoranda of fact and law in the fall of 2025.
Lead: Legal
Litigation: Aylo
Speaking Points
- In February 2025, my Office filed an application with the Federal Court pursuant to section 15 of PIPEDA to seek an order requiring Aylo (formerly MindGeek), the operator of Pornhub and other popular pornographic websites, to comply with Canadian privacy law.
- The application follows an investigation by my Office that found significant problems with Aylo’s privacy practices, which allowed highly sensitive intimate content to be posted online without consent. The Report of Findings was issued in 2024.
- My Office is seeking an order that would require Aylo to implement clear and specific measures to ensure that meaningful consent is obtained directly from all individuals who appear in intimate images and videos that are uploaded to its websites.
- While Aylo changed some of its privacy practices and consent verification mechanisms during and after the investigation, my Office has stated in the application that the company’s practices continue to fail to ensure that meaningful consent is obtained from everyone who appears in the videos.
Background
- In April 2020, the OPC received a complaint against Aylo stemming from its alleged failure to obtain consent from everyone depicted in intimate content posted on its various websites.
- The OPC investigation found that Aylo contravened PIPEDA by enabling intimate content to be shared on its websites without the direct knowledge or consent of everyone depicted. We recommended that Aylo immediately stop the collection, use and disclosure of user-generated intimate images and videos until it had implemented measures to ensure compliance with PIPEDA.
- The OPC was previously in litigation with Aylo from 2023-2024. Aylo had sought an injunction from the Court to prevent the OPC from releasing the report of findings of the investigation while its application for judicial review was ongoing. The Federal Court dismissed Aylo’s injunction request, and the Federal Court of Appeal unanimously dismissed Aylo’s appeal.
Lead: Legal
Litigation: Google
Speaking Points
- In August, my Office released the Report of Findings into a complaint from an individual alleging that Google had contravened PIPEDA by displaying links to news articles when the individual’s name was searched. The complainant alleged that the articles were outdated and inaccurate, and that they disclosed sensitive information that caused significant harm.
- My Office found that in limited circumstances, where there is a serious risk of harm to an individual that outweighs the public interest in freedom of expression, PIPEDA provides individuals the right to have certain information about them de-listed from search engine results when their name is searched online.
- In this case, my Office concluded that Google should de-list the articles in question. Google has declined to implement this recommendation. I am considering all available options to secure compliance with the Act, including seeking an order in Federal Court requiring Google to comply with the recommendation.
Background
- In 2018, the OPC filed a reference with the Federal Court seeking clarity on whether it had jurisdiction over the complaint, as Google had argued that it did not.
- In 2021, the Federal Court ruled that PIPEDA applies since Google’s search engine collects, uses, and discloses personal information in the course of commercial activities, and that Google is not exempt from PIPEDA by virtue of the journalistic-purposes exemption (as it had argued). Google appealed, and in 2023 the Federal Court of Appeal upheld the Federal Court’s decision.
- After a Report of Findings (RF) has been issued, section 14 gives complainants the right to apply to the Court for a hearing in respect of any matter referred to in the report that relates to certain provisions of PIPEDA, including subsection 5(3). The application must be brought within one year from issuance of the RF.
- Under section 15(a), with the consent of the complainant, the Commissioner may bring the application. Applications under sections 14 and 15 are de novo proceedings. Although the RF can be entered into evidence, it is not binding on the Court. Pursuant to section 16 of PIPEDA, the applicant can ask the Court to “order an organization to correct its practices in order to comply with sections 5 to 10.”
Lead: Legal