Contributions Program projects underway
On June 20, 2022, the Office of the Privacy Commissioner of Canada (OPC) announced funding for a new round of independent research and knowledge translation projects under its Contributions Program. These projects will be completed by March 31, 2023. The OPC will post a summary of each project, along with links to its outcomes, once the projects are completed and reviewed.
2022-23 Contributions Program funding recipients
Organization: York University
Project title: Privacy Evaluation of Virtual Classrooms
Amount awarded: $49,679.00
Project leader: Yan Shvartzshnaider
The COVID-19 pandemic has forced universities to transition to online platforms, which has exposed them to greater privacy challenges and threats. What makes the situation more complex is that the information handling practices of these platforms often go beyond the educational context. This project seeks to understand the privacy implications of using online platforms for educational purposes.
Specifically, the project will measure to what extent the functionalities and information-handling practices of platforms align with privacy regulations, users' expectations, and ethical concerns. The project will also explore how the pandemic has changed established norms as remote learning becomes more pervasive, focusing on technological practices such as facial recognition or location-based tracking services used to mark student attendance or in-class attention. At the end of the project, the research team hopes to provide informative guidance to stakeholders in designing effective privacy-preserving online systems.
Organization: University of Guelph
Project title: Securing Privacy: Examining the Tension Between Push and Pull of Cybersecurity Adoption
Amount awarded: $49,450.00
Project leader: Davar Rezania
This research project seeks to examine the privacy protection practices of small and medium-sized enterprises (SMEs). Specifically, it looks at how the interaction of external factors (for example, government policies, technological advances and market forces) and internal factors (such as the characteristics of an organization, its privacy practices and reputation) affects cybersecurity adoption within those organizations. The researchers hope that identifying the factors that contribute to an SME’s decision to adopt cybersecurity measures can form the basis of recommendations for future amendments to the Personal Information Protection and Electronic Documents Act (PIPEDA).
Organization: York University
Project title: Deconstructing/Performing the Amazon Ring Security Apparatus
Amount awarded: $27,020.00
Project leader: Evan Light
This project will examine how the Amazon Ring home security system watches and tracks people, where the data generated by this surveillance goes, and the laws and policies that govern these interactions. A key component of the project is the construction of an Amazon Ring show home within an exhibition space at York University. Within and outside the home, a number of technical components will perform and display a real-time analysis of what is happening behind the scenes.
Organization: Pinnguaq Association
Project title: Privacy, AI, and Machine Learning Through a Rural, Remote, and Indigenous Lens: A Resource and Toolkit
Amount awarded: $50,000.00
Project leader: Ryan Oliver
Who is impacted by the privacy and digital safety risks associated with artificial intelligence (AI) and machine learning in rural, remote and Indigenous communities across Canada? This project will explore that question and produce a free, responsive educator toolkit to be distributed across Canada that supports the assessment and mitigation of privacy risks, as well as an understanding of the barriers and inequalities for Canadians living in these communities.
Organization: First Nations Information Governance Centre (FNIGC)
Project title: First Nations Data Sovereignty and the Personal Information Protection and Electronic Documents Act (PIPEDA)
Amount awarded: $48,740.45
Project leader: Albert Armieri
This project intends to support First Nations’ awareness and understanding of the Personal Information Protection and Electronic Documents Act (PIPEDA). FNIGC will develop a plain language guide to the legislation with a focus on practical information for First Nations governments, organizations, and individuals responsible for its application. By enabling First Nations to better understand and apply the legislation, this project also seeks to help First Nations businesses to thrive.
Organization: Open North
Project title: The Intersectional Privacy Risks of Law Enforcement Influence and Involvement in Smart City Projects
Amount awarded: $50,000.00
Project leader: Merlin Chatwin
This project focuses on the link between the private and public sectors in the development of smart cities. Specifically, it will investigate how smart city projects in Canada have been influenced by law enforcement agencies, or have taken steps to accommodate law enforcement’s interest in their projects. The project will also develop an intersectional analysis of the differential privacy harms caused by such influence. Lastly, the project will produce a final report, as well as a set of field guides and workshop templates, to help existing or aspiring smart city projects navigate law enforcement influence and make informed decisions about project design.
Organization: University of Regina
Project title: Public Perspectives on Facial Recognition Technology: Attitudes, Preferences, Hopes, and Concerns
Amount awarded: $49,450.00
Project leader: Justin Longo
This research proposes to test empirically what Canadians deem acceptable in the context of facial recognition technology (FRT) applications used by private sector actors. A survey of Canadian residents from all provinces and territories will gather information on attitudes towards FRT used in a variety of settings, focusing on safety, privacy, fairness, and discrimination concerns. The researchers hope their work will have implications for the adoption of FRT by the private sector and the development of legislation and regulation in response to its use.
Organization: Queen's University
Project title: Proof of Erasure: Secure Personal Data Deletion with Public Verifiability
Amount awarded: $50,000.00
Project leader: Jianbing Ni
Customers are increasingly asking service providers that hold their digital information to delete their data. However, the current legislative landscape raises a number of questions, notably with regard to Canadians’ right to erasure. This project aims to acquire knowledge and understanding of data deletion policies, as well as explore effective approaches for secure data deletion and advanced designs for proof of erasure. In particular, the project will study the extent to which the right to data deletion is recognized by various privacy laws and regulations, such as the Personal Information Protection and Electronic Documents Act (PIPEDA), and analyze how that right is implemented on existing data platforms. It will also identify, through an online survey and a case study, the secure personal data deletion policies expected by the public.
Organization: Centre de documentation sur l’éducation des adultes et la condition féminine (CDEACF)
Project title: Protecting victims’ privacy to prevent spousal homicide
Amount awarded: $50,000.00
Project leader: Lise Chovino
This project aims to help us understand and document the need for digital security and privacy protection when evaluating the safety net of women in second-stage housing (i.e. shelters for women who are at risk of being victims of spousal homicide after leaving an emergency shelter). It will allow us to evaluate security breaches due to digital service provider practices, gauge the level of safety felt by housed women with access to these services and better protect them from the impacts of information collection by developing adapted teaching tools.
Organization: Canadian National Institute for the Blind (CNIB)
Project title: Consent and Inclusion, Diversity, Equity and Accessibility
Amount awarded: $48,944.00
Project leader: Mahadeo Sukhai
This project will research the relationship between individual consent, as it relates to personal information, and the factors that make up a person’s identity. Specifically, the project team will seek to better understand the ways in which people in Canada who are blind understand, provide and conceptualize consent, while also considering factors such as mother tongue, ethnicity, race, educational background, additional disability, age, and employment. This will in turn allow the CNIB to shape its practices in ways that respect the concepts of inclusion, diversity, equity and accessibility. These findings and practices will be shared with other Canadian organizations in the hopes of shaping their practices for the better.
Organization: Concordia University
Project title: Privacy Analysis of Technologies Used in Intimate Partner Abuse
Amount awarded: $26,716.55
Project leader: Mohammad Mannan
Technology plays a major role in facilitating intimate partner violence (IPV), and invasion of privacy has become a significant form of IPV. Such violations of privacy can happen through monitoring a victim’s movements using stalkerware and hidden Wi-Fi cameras, or by using drones to non-consensually film or harass the victim. Manipulating and distributing intimate images obtained through creepshots and deepfakes are other forms of such abuse. The aim of this project is to investigate the cyber-IPV ecosystem, including the technological tools used by abusers, as well as the computer security tools or apps that can be provided to help victims. The researchers will produce a public report that summarizes their findings, presents recommendations for solutions, and includes guidelines for victims. The team will also produce a technical paper.
2021-22 Contributions Program funding recipients
Project title: #ForYou: A Game About Artificial Intelligence and Privacy
Amount requested: $50,000
Project leader: Kara Brisson-Boivin
#ForYou: A Game About Artificial Intelligence and Privacy will equip young people to consider and discuss the role of machine learning and recommendation algorithms in their lives and the impacts that these have on their privacy, in a way that is concrete and engaging.
The project places youth in the role of video creators who are attempting to gain a following and monetize their videos on a platform such as YouTube or TikTok.
This educational resource will give youth an opportunity to: 1) reflect on the ways in which they interact with AI in their day-to-day lives; 2) learn more about how AI and algorithms work; 3) understand the implications of AI and algorithms for their privacy; and 4) learn how to take action to advocate for and design algorithms that are equitable and practice good stewardship of personally identifying information.
Organization: Women's College Hospital
Project title: Commercial Virtual Care Services in Canada: Consumer Data, De-Identification and Privacy
Amount requested: $49,535
Project leader: Sheryl Spithoff
The use of commercial virtual care services in Canada has exploded with the COVID-19 pandemic, improving access and transforming care, while also raising concerns about the privacy and security of personal health information.
This project’s research goal is to describe, analyze and critique commercial virtual care services in Canada. It focuses on how the collection and commercialization of de-identified data affects privacy as well as other risks that emerge with inadequate privacy protections.
Ultimately, the goal of the project is to create new normative frameworks and enhance privacy protections and accountability.
Organization: Association des juristes d’expression française de l’Ontario (AJEFO)
Project title: Get Informed and Protect your Privacy Online
Amount requested: $50,000
Project leader: Andrée-Anne Martel
This project aims to create and disseminate Canada-wide legal information resources in plain language on online privacy rights and obligations. More specifically, AJEFO will develop legal news articles, plain-language infographics, a video clip, virtual legal information sessions and an educational resource for the general public.
The intensification of online activities and the growing need to connect in order to work, consume, study, interact socially, have fun, and even to understand and deal with legal issues, peaked in 2020, as a result of the COVID-19 pandemic. Yet, in this digital sphere, members of the public continue to take risks every day and often lack information to better protect their privacy.
Faced with this reality, AJEFO will develop clear, simple and accessible information to empower Canadians and enable them to better control and protect the dissemination of their personal information online. Available in a variety of formats, these resources developed entirely in French will enable the general Francophone public across Canada to know, understand and defend their online privacy rights.
Organization: Canadian Standards Association (CSA Group)
Project title: Privacy for Wellness Wearables: Emerging Trends and Power Dynamics in a Grey Area
Amount requested: $50,000
Project leader: Nicki Islic
This research and knowledge translation project focuses on the privacy implications of wellness wearables.
There has been a rapid increase in uptake of commercial health and wellness apps and wearable devices. By enabling the active and passive collection of consumers’ sensitive health-related data, these apps and devices pose substantial privacy risks (e.g. the creation of behavioural profiles for prediction, data breaches, etc.). As both commercial and health products, they occupy a regulatory grey area that allows for increased data sharing, cross-border transmission and storage and other risks to intimate data.
The project supports the development and understanding of privacy rights for those developing standards, policies, and guidance related to this field.
Organization: Institute for Information Systems Engineering (CIISE), Concordia University
Project title: Privacy Report Card for Online Solutions Targeting Seniors
Amount requested: $49,740
Project leader: Mohammad Mannan & Amr Youssef
According to the World Economic Forum, 70% of seniors are now online. Many Canadian seniors track their medical conditions with remote health monitoring devices. Internet of Things (IoT) devices offering services such as GPS tracking, automatic fall detection and other medical alert solutions aimed at the elderly and their caregivers are readily available in the Canadian market. Hundreds of mobile apps designed specifically for the elderly and their caregivers are available on various app stores.
However, very little is known about the privacy leakage and security/safety risks introduced by using these devices and applications. This project examines, through a comprehensive and systematic technical/experimental investigation, the security and privacy risks associated with such solutions commonly used by Canadian seniors and their caregivers.
The project will produce a public report card for online solutions designed for seniors, summarizing the findings of the research investigation and presenting recommendations for improving these solutions in terms of effectiveness, security and privacy. The findings of this report will be provided online free of charge. The project will also produce an academic paper detailing the full methodology, results, and technical recommendations.
Organization: Department of Information Systems and Quantitative Management Methods, School of Management, Université de Sherbrooke
Project title: Guide for Creating a Secure Digital Identity Framework as Part of a Cloud-Based Digital Organizational Transformation
Amount requested: $47,443
Project leader: Pierre-Martin Tardif
Canadian companies manage the digital identity of many people, such as clients and staff, which poses challenges in terms of personal information protection and legal compliance, in a context of de facto transnational cloud-based data storage. However, scientific knowledge in the field is still limited and most often dispersed across several disciplines, which limits the ability of organizations to securely implement digital identity.
This research project will identify and evaluate innovative practices to create a Guide for creating a secure digital identity framework as part of a cloud-based digital organizational transformation. More specifically, it aims to answer the following research question: How do we implement a digital identity within a company while respecting the privacy of Canadians in accordance with the legal framework in which data are collected, stored and shared through cloud-based resources?
This research will identify and evaluate innovative practices in this area and highlight best practices. Lastly, the resulting Guide will make this knowledge available to Canadian companies to facilitate its application.
Organization: Institute for Information Systems Engineering (CIISE), Concordia University
Project title: Privacy Design Landscape for Central Bank Digital Currency
Amount requested: $26,450
Project leader: Jeremy Clark
There are many technical complexities involved in designing a central bank digital currency (CBDC), but one of the most complex design parameters is privacy. This research project has one main deliverable: an in-depth study of the design landscape for privacy in CBDCs, written both for an academic audience and for a general audience.
To ensure the results have technical depth, the research explores suitable designs through experimentation, such as cryptographic protocol design and blockchain smart contracts (all code artifacts released free and open source with performance measurements).
This project does not take a normative position that a CBDC ought to be deployed in a modern economy. Rather, it takes the position that CBDCs are a possibility. The project affirms that privacy must be a starting point for designing a CBDC, not an afterthought.
Organization: Surveillance Studies Centre, Queen's University
Project title: Deeper Learning? Marketing, Personal Data and Privacy After Surveillance Capitalism
Amount requested: $49,407.78
Project leader: David Murakami Wood and Stephen Thomas
The researchers behind this project propose that the model of “surveillance capitalism” is already out of date. Sparked partly by the demands of the European General Data Protection Regulation (GDPR), they argue that we are seeing the emergence of new models of marketing using open-source intelligence, more time- or topic-limited types of personal data, and AI models, particularly Deep Learning. These models may not use less data but they do use data that is claimed to be less personally identifiable, while generating more lucrative outcomes for their marketing users.
The project thus asks five questions: 1. How do these models work? 2. How widespread is their use already? 3. Are the claims of their advocates justified, and what are the implications for privacy? 4. Are Canada’s current privacy laws and regulations capable of dealing with these models and their privacy implications? 5. What changes to law and regulation might be needed to do so?
The purpose of this project, in producing a report, technological walk-throughs and regulatory scenarios, is to increase knowledge and understanding within government, Parliament, and regulators of the actual and potential future practices of corporations regarding the collection, use and analysis of data, how these practices affect life chances and choices, and what options are available to protect human rights, particularly privacy.
Organization: Faculty of Law, Civil Law Section, University of Ottawa
Project title: Online Exam Monitoring Software During the Pandemic: Seeking to Minimize the Risks to Student Privacy
Amount requested: $28,500
Project leader: Céline Castets-Renard
The COVID-19 pandemic forced much of the world’s population to telework. Canadian universities are no exception, and the majority of courses and exams must now be completed online. To do this, most universities have adopted exam monitoring software, commonly referred to as “proctoring” tools. These software programs, such as Respondus Monitor, ProctorExam, Examity, ProctorU and Proctorio, offer various solutions. Although we understand the good intentions of universities and teachers, as well as the desire to guarantee academic quality and integrity, these tools do raise concerns. The main concerns relate to the potential infringement of student privacy and the massive collection of personal information.
This research will analyze the main exam monitoring software used in Canada and highlight the privacy risks to students. The goal is to find ways to reduce these risks by making recommendations to legislators as part of amendments to the Personal Information Protection and Electronic Documents Act (PIPEDA) through Bill C-11. The recommendations will focus on specific areas of concern identified through the use of exam monitoring tools.
Organization: Children’s Hospital of Eastern Ontario (CHEO) Research Institute
Project title: A Pan-Canadian Descriptive Study of Privacy Risks from Data Synthesis Practices within the Evolving Canadian Legislative Landscape
Amount requested: $49,404
Project leader: Khaled El Emam
Data synthesis is rapidly emerging as a practical privacy enhancing technology (PET) for sharing data for secondary purposes. However, the strengths and weaknesses of this emerging technology are not fully appreciated and need to be evaluated. As well, we need to develop an understanding of how data synthesis would be treated under various privacy regimes in Canada.
This project aims to provide a detailed overview of data synthesis as a PET used to facilitate data sharing within the Canadian context. It is intended to help Canadian organizations understand what data synthesis is, and to provide an assessment of contemporary methods and technologies and how they can be applied under current and proposed regulatory regimes.
The proposed project consists of three main research phases: 1. An overview of data synthesis (environmental scan/literature review); 2. A legal analysis of data synthesis under Part I of PIPEDA and the Consumer Privacy Protection Act (CPPA) component of the Digital Charter Implementation Act, 2020 (Bill C-11); and 3. Perspectives of Canadian regulators on data synthesis. The research will: i) assess whether PIPEDA and the proposed provisions of the CPPA adequately address data synthesis as a PET to protect individual privacy; ii) identify any gaps in both PIPEDA and the legislative proposal, and the nature of such gaps; and iii) propose solutions to “close the gaps.”
Organization: Faculty of Philosophy, Université Laval
Project title: Thinking About Privacy Rights in Relation to Individual and Collective Rights
Amount requested: $49,508
Project leader: Jocelyn Maclure
In the age of Big Data, recent technologies and AI allow for the inference of private attributes from a quantity of data that is innocuous when viewed in isolation, but which, when cross-referenced with other data, can reveal in a probabilistic manner things such as a person’s political opinions, religious beliefs, sexual orientation, lifestyle and health status.
This information, inferred from information that is not necessarily personal, can be used to influence or manipulate behaviour or opinions and to engage in profiling or discrimination. More and more experts are stressing the importance of addressing this relatively new area of privacy and data protection. Some are even claiming a right to “reasonable inferences”.
In their recent work, researchers have shown that the safeguards offered by current and proposed Canadian legislation (Bill C-11) do not adequately protect the privacy of individuals with respect to the risks posed by the use of inferred information. To adequately protect individuals from these risks, they argue there is a need to shift the focus from individual consent-based control of personal data to a regulatory framework for the use of inferences. These researchers have even suggested that certain inferences should be prohibited. The results of their work support the view that the right to privacy is a necessary condition for the exercise of other fundamental rights. However, this proposal faces a major challenge.
On the one hand, certain experts argue that the protection of fundamental rights must not take place within the context of privacy protection, since these rights can be protected independently of privacy. On the other, some advocates of a distinctive view of privacy – who support the concept of “collective” privacy – argue that the protection of privacy rights must include the protection of algorithmic groups, that is, groups of people who share a generic identity generated by algorithms.
What should we make of this? The researchers will undertake a critical analysis of these two fundamental theses. In their view, this analysis is necessary if there is to be privacy protection based on rights and recognition of the right to privacy in its entirety.