Contributions Program projects underway

On June 23, 2021, the Office of the Privacy Commissioner of Canada (OPC) announced funding for a new round of independent research and knowledge translation projects under its Contributions Program. These projects will be completed by March 31, 2022. The OPC will post a summary of each project, along with links to its outcomes, once the projects have been completed and reviewed.

2021-22 Contributions Program funding recipients

Organization: MediaSmarts
Project title: #ForYou: A Game About Artificial Intelligence and Privacy
Amount requested: $50,000
Project leader: Kara Brisson-Boivin
Province: Ontario

Project summary:

#ForYou: A Game About Artificial Intelligence and Privacy will equip young people to consider and discuss the role of machine learning and recommendation algorithms in their lives and the impacts that these have on their privacy, in a way that is concrete and engaging.

The project places youth in the role of video creators who are attempting to gain a following and monetize their videos on a platform such as YouTube or TikTok.

This educational resource will give youth an opportunity to: 1) reflect on the ways in which they interact with AI in their day-to-day lives; 2) learn more about how AI and algorithms work; 3) understand the implications of AI and algorithms for their privacy; and 4) learn how to take action to advocate for and design algorithms that are equitable and practice good stewardship of personally identifying information.


Organization: Women's College Hospital
Project title: Commercial Virtual Care Services in Canada: Consumer Data, De-Identification and Privacy
Amount requested: $49,535
Project leader: Sheryl Spithoff
Province: Ontario

Project summary:

The use of commercial virtual care services in Canada has exploded during the COVID-19 pandemic, improving access and transforming care while also raising concerns about the privacy and security of personal health information.

This project’s research goal is to describe, analyze and critique commercial virtual care services in Canada. It focuses on how the collection and commercialization of de-identified data affects privacy as well as other risks that emerge with inadequate privacy protections.

Ultimately, the goal of the project is to create new normative frameworks and enhance privacy protections and accountability.


Organization: Association des juristes d’expression française de l’Ontario (AJEFO)
Project title: Get Informed and Protect your Privacy Online
Amount requested: $50,000
Project leader: Andrée-Anne Martel
Province: Ontario

Project summary:

This project aims to create and disseminate, across Canada, plain-language legal information resources on online privacy rights and obligations. More specifically, AJEFO will develop legal news articles, plain-language infographics, a video clip, virtual legal information sessions and an educational resource for the general public.

Online activity, and the growing need to connect in order to work, consume, study, interact socially, have fun, and even understand and deal with legal issues, intensified sharply in 2020 as a result of the COVID-19 pandemic. Yet, in this digital sphere, members of the public continue to take risks every day and often lack the information they need to protect their privacy.

Faced with this reality, AJEFO will develop clear, simple and accessible information to empower Canadians and enable them to better control and protect the dissemination of their personal information online. Available in a variety of formats, these resources developed entirely in French will enable the general Francophone public across Canada to know, understand and defend their online privacy rights.


Organization: Canadian Standards Association (CSA Group)
Project title: Privacy for Wellness Wearables: Emerging Trends and Power Dynamics in a Grey Area
Amount requested: $50,000
Project leader: Nicki Islic
Province: Ontario

Project summary:

This research and knowledge translation project focuses on the privacy implications of wellness wearables.

There has been a rapid increase in the uptake of commercial health and wellness apps and wearable devices. By enabling the active and passive collection of consumers’ sensitive health-related data, these apps and devices pose substantial privacy risks (e.g. the creation of behavioural profiles for prediction, and data breaches). As both commercial and health products, they occupy a regulatory grey area that allows for increased data sharing, cross-border transmission and storage, and other risks to intimate data.

The project supports the development and understanding of privacy rights for those developing standards, policies, and guidance related to this field.


Organization: Institute for Information Systems Engineering (CIISE), Concordia University
Project title: Privacy Report Card for Online Solutions Targeting Seniors
Amount requested: $49,740
Project leader: Mohammad Mannan and Amr Youssef
Province: Quebec

Project summary:

According to the World Economic Forum, 70% of seniors are now online. Many Canadian seniors track their medical conditions with remote monitoring health devices. Internet of Things (IoT) devices that provide services such as GPS tracking and automatic fall detection, along with many other medical alert solutions aimed at the elderly and their caregivers, are readily available in the Canadian market. Hundreds of mobile apps designed specifically for the elderly and their caregivers are available on different app stores.

However, very little is known about the privacy leakage and the security and safety risks that come with using these devices and applications. This project examines, through a comprehensive and systematic technical and experimental investigation, the security and privacy risks associated with such solutions, which are commonly used by many elderly Canadians and their caregivers.

The project will produce a public report card for online solutions designed for seniors, summarizing the findings of the research investigation and presenting recommendations for improving these solutions in terms of effectiveness, security and privacy. The report card will be made available online free of charge. The project will also produce an academic paper detailing the full methodology, results and technical recommendations.


Organization: Department of Information Systems and Quantitative Management Methods, School of Management, Université de Sherbrooke
Project title: Guide for Creating a Secure Digital Identity Framework as Part of a Cloud-Based Digital Organizational Transformation
Amount requested: $47,443
Project leader: Pierre-Martin Tardif
Province: Quebec

Project summary:

Canadian companies manage the digital identity of many people, such as clients and staff, which poses challenges in terms of personal information protection and legal compliance, in a context of de facto transnational cloud-based data storage. However, scientific knowledge in the field is still limited and most often dispersed across several disciplines, which limits the ability of organizations to securely implement digital identity.

This research project will identify and evaluate innovative practices in order to produce a guide for creating a secure digital identity framework as part of a cloud-based digital organizational transformation. More specifically, it aims to answer the following research question: How can a company implement digital identity while respecting the privacy of Canadians, in accordance with the legal framework under which data are collected, stored and shared through cloud-based resources?

The research will highlight best practices in this area. Lastly, the resulting guide will make this knowledge available to Canadian companies to facilitate its application.


Organization: Institute for Information Systems Engineering (CIISE), Concordia University
Project title: Privacy Design Landscape for Central Bank Digital Currency
Amount requested: $26,450
Project leader: Jeremy Clark
Province: Quebec

Project summary:

There are many technical complexities involved in designing a central bank digital currency (CBDC), but one of the most complicated design parameters is privacy. This research project has one main deliverable: an in-depth study of the design landscape for privacy in CBDCs, written both for an academic audience and for a general audience.

To ensure the results have technical depth, the research explores suitable designs through experimentation, such as cryptographic protocol design and blockchain smart contracts (all code artifacts released free and open source with performance measurements).

This project does not take a normative position that a CBDC ought to be deployed in a modern economy. Rather, it takes the position that CBDCs are a possibility. The project affirms that privacy must be a starting point for designing a CBDC, not an afterthought.


Organization: Surveillance Studies Centre, Queen's University
Project title: Deeper Learning? Marketing, Personal Data and Privacy After Surveillance Capitalism
Amount requested: $49,407.78
Project leader: David Murakami Wood and Stephen Thomas
Province: Ontario

Project summary:

The researchers behind this project propose that the model of “surveillance capitalism” is already out of date. They argue that, sparked partly by the demands of the European General Data Protection Regulation (GDPR), new models of marketing are emerging that use open-source intelligence, more time- or topic-limited types of personal data, and AI models, particularly Deep Learning. These models may not use less data, but they do use data that is claimed to be less personally identifiable, while generating more lucrative outcomes for their marketing users.

The project thus asks five questions: 1. How do these models work? 2. How widespread is their use already? 3. Are the claims of their advocates justified, and what are the implications for privacy? 4. Are Canada’s current privacy laws and regulations capable of dealing with these models and their privacy implications? 5. What changes to law and regulation might be needed to do so?

The purpose of this project, in producing a report, technological walk-throughs and regulatory scenarios, is to increase knowledge and understanding within government, Parliament, and regulators of the actual and potential future practices of corporations regarding the collection, use and analysis of data, how these practices affect life chances and choices, and what options are available to protect human rights, particularly privacy.


Organization: Faculty of Law, Civil Law Section, University of Ottawa
Project title: Online Exam Monitoring Software During the Pandemic: Seeking to Minimize the Risks to Student Privacy
Amount requested: $28,500
Project leader: Céline Castets-Renard
Province: Ontario

Project summary:

The COVID-19 pandemic forced much of the world’s population to telework. Canadian universities are no exception, and the majority of courses and exams must now be completed online. To do this, most universities have adopted exam monitoring software, often referred to as “proctoring” software. These programs, such as Respondus Monitor, ProctorExam, Examity, ProctorU and Proctorio, offer various solutions. Although we understand the good intentions of universities and teachers, as well as the desire to guarantee academic quality and integrity, these tools do raise concerns. The main concerns relate to the potential infringement of student privacy and the massive collection of personal information.

This research will analyze the main exam monitoring software used in Canada and highlight the privacy risks to students. The goal is to find ways to reduce these risks by making recommendations to legislators as part of amendments to the Personal Information Protection and Electronic Documents Act (PIPEDA) through Bill C-11. The recommendations will focus on specific areas of concern identified through the use of exam monitoring tools.


Organization: Children’s Hospital of Eastern Ontario (CHEO) Research Institute
Project title: A Pan-Canadian Descriptive Study of Privacy Risks from Data Synthesis Practices within the Evolving Canadian Legislative Landscape
Amount requested: $49,404
Project leader: Khaled El Emam
Province: Ontario

Project summary:

Data synthesis is rapidly emerging as a practical privacy enhancing technology (PET) for sharing data for secondary purposes. However, the strengths and weaknesses of this emerging technology are not fully appreciated and need to be evaluated. As well, we need to develop an understanding of how data synthesis would be treated under various privacy regimes in Canada.

This project aims to provide a detailed overview of data synthesis as a PET used to facilitate data sharing within the Canadian context. It is intended to help Canadian organizations understand what data synthesis is, and to provide an assessment of contemporary methods and technologies and how they can be applied under current and proposed regulatory regimes.
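
To make the idea concrete for readers unfamiliar with the technique, the short sketch below, which is not part of the funded project, illustrates the basic principle behind data synthesis: fit a generative model to real records and release samples drawn from the model rather than the records themselves. The dataset, the column names and the choice of a simple multivariate Gaussian are illustrative assumptions only; real synthesis pipelines use more sophisticated generative models and formal privacy and utility evaluation.

```python
# Minimal illustrative sketch of data synthesis as a privacy enhancing technology.
# All columns and values are invented for illustration.
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical "real" health dataset: age, systolic blood pressure, BMI.
real = np.column_stack([
    rng.normal(55, 12, 500),   # age
    rng.normal(125, 15, 500),  # systolic blood pressure
    rng.normal(27, 4, 500),    # BMI
])

# Fit a simple generative model (a multivariate Gaussian) to the real data ...
mean = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# ... and sample brand-new synthetic records from it. No synthetic row corresponds
# to any individual in the real dataset, yet aggregate structure is preserved.
synthetic = rng.multivariate_normal(mean, cov, size=500)

print("real means:     ", np.round(real.mean(axis=0), 1))
print("synthetic means:", np.round(synthetic.mean(axis=0), 1))
```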

The proposed project consists of three main research phases: 1. An overview of data synthesis (environmental scan/literature review); 2. A legal analysis of data synthesis under Part I of PIPEDA and the Consumer Privacy Protection Act (CPPA) component of the Digital Charter Implementation Act, 2020 (Bill C-11); and 3. Perspectives of Canadian regulators on data synthesis. The research will: i) assess whether PIPEDA and the proposed provisions of the CPPA adequately address data synthesis as a PET to protect individual privacy; ii) identify any gaps in both PIPEDA and the legislative proposal, and the nature of such gaps; and iii) propose solutions to “close the gaps.”


Organization: Faculty of Philosophy, Université Laval
Project title: Thinking About Privacy Rights in Relation to Individual and Collective Rights
Amount requested: $49,508
Project leader: Jocelyn Maclure
Province: Quebec

Project summary:

In the age of Big Data, recent technologies and AI allow for the inference of private attributes from a quantity of data that is innocuous when viewed in isolation, but which, when cross-referenced with other data, can reveal in a probabilistic manner things such as a person’s political opinions, religious beliefs, sexual orientation, lifestyle and health status.

These insights, inferred from information that is not necessarily personal, can be used to influence or manipulate behaviour or opinions and to engage in profiling or discrimination. More and more experts are stressing the importance of addressing this relatively new area of privacy and data protection. Some are even calling for a right to “reasonable inferences”.
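
As a purely illustrative aside, not drawn from the funded project, the sketch below shows the mechanism described above: signals that are innocuous on their own, once cross-referenced, support a probabilistic guess about a sensitive attribute. Every feature name, record and label in it is invented.

```python
# Minimal illustrative sketch of probabilistic attribute inference from
# cross-referenced, individually innocuous signals. All data is invented.
from collections import Counter, defaultdict

# Hypothetical profiles: (follows_outdoor_pages, shops_late_night, uses_fitness_app)
# paired with a sensitive attribute learned from some other source.
observed = [
    (("yes", "no", "yes"), "attribute_A"),
    (("yes", "no", "yes"), "attribute_A"),
    (("yes", "no", "no"),  "attribute_A"),
    (("no", "yes", "no"),  "attribute_B"),
    (("no", "yes", "yes"), "attribute_B"),
    (("no", "yes", "no"),  "attribute_B"),
    (("yes", "no", "yes"), "attribute_B"),
]

# Count how often each combination of innocuous signals co-occurs with the attribute.
counts = defaultdict(Counter)
for profile, attribute in observed:
    counts[profile][attribute] += 1

# For a new person we only see the innocuous signals, yet we can still make a
# probabilistic guess about the sensitive attribute.
new_profile = ("yes", "no", "yes")
total = sum(counts[new_profile].values())
for attribute, n in counts[new_profile].items():
    print(f"P({attribute} | profile) = {n / total:.2f}")
```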

In their recent work, researchers have shown that the safeguards offered by current and proposed Canadian legislation (Bill C-11) do not adequately protect the privacy of individuals with respect to the risks posed by the use of inferred information. To adequately protect individuals from these risks, they argue there is a need to shift the focus from individual consent-based control of personal data to a regulatory framework for the use of inferences. These researchers have even suggested that certain inferences should be prohibited. The results of their work support the view that the right to privacy is a necessary condition for the exercise of other fundamental rights. However, this proposal faces a major challenge.

On the one hand, certain experts argue that the protection of fundamental rights must not take place within the context of privacy protection, since these fundamental rights can be protected independently of privacy. On the other hand, some advocates of a distinctive view of privacy, who support the concept of “collective” privacy, argue that the protection of privacy rights must include the protection of algorithmic groups, that is, groups of people who share a generic identity generated by algorithms.

What should we make of this? The researchers will undertake a critical analysis of these two fundamental theses. In their view, this analysis is necessary for rights-based privacy protection and for recognition of the right to privacy in its entirety.

