Introduction
Each year the OPC identifies a theme for its Contributions Program, which funds independent privacy research and related knowledge translation programs. While proposals related to any topic relevant to privacy issues in the private sector are welcome, the annual theme is an area in which we believe additional research will have particular benefit for the privacy interests of Canadians.
In 2023-24, in support of the OPC’s Strategic Priority of “addressing and advocating for privacy in a time of technological change”, our Contributions Program theme was “The future is now! Assessing and managing the privacy impacts of immersive and embeddable technologies.” Three of the funded projects related to this topic were:
- Making Privacy More than a Virtual Reality: The Challenges of Extending Canadian Privacy Law to Extended Reality – Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC; leads: Emily Chu, Renae Pennington, Chloe Bechard, Shaarini Ravitharan, Harmon Imeson Jorna, and Christian Clavette)
- In the matrix – Consumer privacy in the metaverse Footnote 1 – Option Consommateurs (leads: Sara Eve Levac and Luis Pineda)
- Privacy Analysis of Virtual Reality/Augmented Reality Online Shopping Applications Footnote 2 – Concordia University (leads: Mohammad Mannan and Amr Youssef)
In this blog post, we share some of the key learnings and takeaways from these themed projects.Footnote 3 We also encourage interested readers to consult the summaries of all completed projects, which are available on our website, as well as our previous Real Results publications.
About Immersive Technologies
Broadly, “immersive technologies” create experiences that imitate or enhance physical reality or create wholly new digital realities. A largely equivalent term – which we will use interchangeably in this post – is extended reality (XR). As explained in both the CIPPIC and Option Consommateurs reports, XR is an umbrella term which encompasses:
- Augmented reality (AR), in which digital information (such as text or images) is superimposed on the physical world. This can be done on a smartphone display (for example, games such as Pokémon Go or apps that allow users to virtually ‘try on’ clothing) or through a wearable device (such as smart glasses).
- Virtual reality (VR), in which users are completely immersed in a virtual world, generally using a headset that obscures an individual’s vision of the physical world.
- Mixed reality (MR), in which the physical and digital worlds are blended to create a hybrid reality with varying degrees of immersion.
The Option Consommateurs report also gives a definition of the term metaverse. From the Larousse French dictionary, a metaverse is “a parallel, immersive digital universe in which one can develop and interact (work, play, form relationships, etc.) just like in real life.”Footnote 4 That is, the term metaverse can be understood to be the reality (or realities) created by XR technology, especially VR and MR.
For the purposes of this blog post, the specific dividing lines between these terms are not important; the takeaway from this section is that “immersive technology” and “extended reality” refer to technologies ranging from those that enhance the physical world with digital information to those that create entirely new digital realities.
The Projects
To begin, we will provide an overview of the three immersive technology-themed projects and their findings. As with all Contributions Program projects, these were undertaken at arm’s length from the OPC. That means we neither directed the research process nor validated the results, and the researchers’ conclusions may not necessarily reflect our position on an issue. These are very high-level summaries of extensive works of research; if you are interested in the topic of privacy and immersive technology, we recommend reading the reports in their entirety.
Option Consommateurs
In its report “In the matrix: Consumer privacy in the metaverse”, Option Consommateurs provides a broad overview of XR technologies and many of their associated privacy challenges. The report walks the reader through a description of the metaverse and the technologies that enable it, the data that is collected and used by these systems, how that collection and use is explained to individuals, and how privacy laws apply to these practices. It provides the reader with a strong foundation in immersive technology and its privacy impacts.
In this summary, we will focus on the report’s research into how privacy information is presented to individuals using XR technologies. To highlight the importance of ensuring that individuals are given meaningful information about how data is collected and used, the researchers note that the immersive nature of metaverse technologies blurs the boundary between public and private spaces, between the real and virtual worlds, and between sensitive and non-sensitive data. This, they argue, raises risks and challenges to the right to privacy, consumer rights, and children’s rights.
Option Consommateurs researchers found, though, that a user seeking the information necessary to understand these risks and challenges will often be presented with voluminous privacy-related material spread across multiple documents. These documents are drafted such that, even if read in full, an individual may not fully understand the potential privacy impacts of the technology. For instance, Option Consommateurs notes that privacy policies may describe what information is collected and how it is used in separate sections, leaving the user uncertain as to which information is associated with which purpose.
The researchers note that experts they spoke to stressed the need to re-think the way that individuals are provided with information and their consent sought, potentially by integrating it into the immersive experience as part of a tutorial.
In addition to the clarity of privacy notices, Option Consommateurs researchers examined:
- The need to strengthen protections for the kind of biometric information that XR technologies collect to deepen immersion in the metaverse, which can reveal sensitive information even when it is not used to identify an individual (and may thus fall outside some current legislative definitions of biometric informationFootnote 5); and,
- The importance of ensuring the privacy rights of children – recommending, for example, a prohibition on using children’s information collected by immersive technology for commercial purposes such as targeted advertising.
In the end, in addition to recommending better transparency practices related to immersive technology, Option Consommateurs concluded that legislation would be needed to create better protections for users and to meet the challenges posed by the metaverse and XR technologies.
CIPPIC
CIPPIC’s “Making Privacy More than a Virtual Reality” complements the work of Option Consommateurs with a thorough consideration of the privacy challenges associated with immersive technology, and the changes to Canada’s regulatory approach that those challenges may necessitate. Based on a combination of literature review, legal analysis and stakeholder interviews, the researchers argue that current Canadian privacy laws do not adequately address the intricacies of immersive technologies.
These intricacies, and the challenges they create, include:
- Extensive biometric data collection: XR technologies will generally rely on extensive data collection about one’s environment. For some devices this collection will extend to biometric data about the user, such as pupillometry (a measure of pupil dilation and responsiveness). As noted in the report, when this data is combined with information about what is being displayed to a user (or what they are looking at) it can allow highly accurate, and highly intimate inferences to be made about a person’s behaviours, interests and characteristics.
- The “bystander problem”: In collecting information about a user’s environment, XR technology will almost inevitably collect information about other individuals. The report states that while some devices do attempt to mitigate these impacts through measures such as indicator lights, the nature of this data collection is both unprecedented and not contemplated under Canada’s current privacy legislation. The report also notes that integration of XR technologies into additional facets of life risks blurring the boundaries between ‘public’ and ‘private’ spaces.
- Complexities with informed consent: The report flags multiple challenges related to obtaining informed consent for XR technologies. These include issues with respect to providing sufficient information to users to allow them to understand potential privacy harms (particularly given the extensive data collected by these systems); that users will be unable to provide consent on behalf of any ‘bystanders’ whose data is captured because they happen to be near that XR user; and that children may not be able to provide informed consent at all.
The report concludes with a series of recommendations for government and industry, including to create regulations tailored to the nuanced challenges of XR, to implement stricter data protection measures for children, and to support research and development of privacy-enhancing technologies that can be integrated into XR systems.
Concordia University
The potential privacy impacts of immersive technologies make it important to understand how data actually flows within these systems. This is the work taken on by researchers at Concordia University with their “Privacy Analysis of VR/AR Online Shopping Applications.”
For this project, researchers analysed 138 websites and 28 Android apps that offered “virtual try-on” – the use of augmented reality to show how a piece of clothing would look on a person, how a piece of furniture would look in a room, etc. A small number of full virtual reality shopping apps were identified, but the researchers determined that they were not sufficiently mature for this analysis. By capturing and analysing network traffic when using these services, flows of data among multiple parties could be identified – and, importantly, compared with statements made in privacy notices.
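The traffic-capture approach described above can be approximated with a simple sketch: intercept a service’s HTTP requests (for example, by exporting them from an intercepting proxy), then flag any request whose body carries image data bound for a host other than the retailer’s own. The snippet below is our own illustrative assumption, not the Concordia team’s actual tooling; the domain names and captured requests are invented.

```python
from urllib.parse import urlsplit

# Hypothetical first-party domain for the shopping site under test.
FIRST_PARTY = "shop.example"

# Magic bytes identifying common image formats (JPEG, PNG).
IMAGE_SIGNATURES = (b"\xff\xd8\xff", b"\x89PNG")

def is_image(body: bytes) -> bool:
    """Heuristically detect an image payload by its leading magic bytes."""
    return body.startswith(IMAGE_SIGNATURES)

def third_party_image_uploads(requests):
    """Return (host, url) pairs where an image body left the first party."""
    flagged = []
    for req in requests:
        host = urlsplit(req["url"]).hostname or ""
        if is_image(req["body"]) and not host.endswith(FIRST_PARTY):
            flagged.append((host, req["url"]))
    return flagged

# Invented sample capture: one ordinary API call, one image upload
# to a (hypothetical) virtual try-on vendor.
captured = [
    {"url": "https://shop.example/api/cart", "body": b'{"item": 42}'},
    {"url": "https://tryon-vendor.example/v1/fit", "body": b"\xff\xd8\xff\xe0"},
]

print(third_party_image_uploads(captured))
# → [('tryon-vendor.example', 'https://tryon-vendor.example/v1/fit')]
```

In a real analysis, each flagged destination would then be checked against the claims in the service’s privacy notice – the comparison at the heart of the Concordia methodology.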
Of the 138 websites tested, 90 were found to transmit user images to servers (either first-party or third-party, such as virtual try-on service providers or analytics servers). However, in 15 of those 90 instances the practice did not align with information in privacy notices. Further, the researchers found that 35 sites used a virtual try-on service provider whose practices did not align with information in its privacy notice. Even more alarming, according to the researchers, were the instances in which a user was explicitly informed (via a privacy policy or pop-up message) that their image would be processed entirely on-device, only for traffic analysis to show images being sent to external servers or shared with other parties. Only 5 of 28 Android apps tested were found to send images to a server, but again two of these five were found to contradict information in privacy notices. Extensive use of web and app trackers was also found, in addition to some cybersecurity-related vulnerabilities (of which the services in question were notified).
Based on this, the researchers concluded that privacy concerns are present in virtual try-on apps, particularly with respect to the handling of images. Among other recommendations, the researchers suggested that users of virtual try-on systems carefully read privacy materials (particularly with respect to handling of images and biometrics), that developers of shopping apps prioritize user privacy in design, and that privacy regulators develop clear consent requirements for AR and VR applications, while also performing regular audits and enforcement actions.
Commonalities and Takeaways
As noted in the introduction, one of the OPC’s current Strategic Priorities is to “address and advocate for privacy in this time of technological change.” As immersive technologies continue to develop and to be adopted for a variety of uses, it is important that we remain, to the greatest extent possible, ahead of, and ready to address, any impacts they may have on the privacy of Canadians. Using the three Contributions Program projects as a basis, what commonalities or takeaways can be identified, and what might the OPC or other relevant actors consider pursuing in future years?
Immersive technology can (and often does) generate sensitive information
By their very nature, immersive technologies collect significant amounts of information (including personal information) through a variety of sensors. This can include data about a person’s local environment (including the people in it), the user’s movements, and even involuntary actions such as pupillary response.
To be clear, such data collection may be reasonable in the context of a particular product or service. The tracking of an individual’s eye movements, for instance, may be an important part of creating an immersive experience in a virtual reality world; similarly, creating an accurate representation of the user’s environment may be necessary for both functionality and safety in augmented or mixed reality systems. The collection of certain data should not be understood to be de facto inappropriate. However, it does come with considerations – including that it is important for developers, users, regulators and legislators to all understand the potential privacy impacts.
For example, advances in image recognition technology mean that, in addition to mapping out obstacles in a room, images of one’s environment could be analysed to identify the items a person has in their home – which could subsequently be used to, for example, build a detailed consumer profile of that person. Involuntary facial movements in response to targeted stimuli in a virtual world could be analysed to predict information about one’s sexual preferences or political or religious affiliations. Video from augmented reality tools such as smart glasses could reveal not only where a person goes throughout the day, but how they interact with their environment and the people within it. Moreover, given the extent of control that virtual or mixed reality technologies have over an individual’s virtual environment, profiles developed through this detailed data collection could be used to ‘nudge’ individuals into taking actions they might not otherwise have taken, or even actions that are against their own interests.
We do not claim that this is currently taking place in all immersive technology applications, nor that it is the end goal of developers. However, it nonetheless is clear that all parties should be aware of, at minimum, the potential uses and sensitivity of information being collected by these systems – and the ensuing obligations under Canadian privacy law.
Effective (and accurate) privacy communications are crucial
Given the extent of personal information that will often be collected through immersive technology, and the uses to which it might be put, the effective and accurate communication of privacy-related information is highly important. However, as evidenced by the Option Consommateurs and Concordia projects described above, this does not always occur.
Describing complex or expansive data practices in a way that is understandable to the average user will always be a challenge. However, there are ways in which the status quo could be improved. For instance, the disconnect Option Consommateurs notes between information about collection and information about use makes it difficult for a user of immersive technology to assess whether a collection is reasonable (and whether they will consent to it). This could be addressed, as the researchers recommend, with a table that sets out what information is collected and how it is used. Option Consommateurs also suggests that given companies’ creativity and experience in interface design, they could develop ways to inform users of data practices in a clear, attractive and fun way – such as by presenting them as part of a tutorial when the user is becoming familiar with the system.
Of course, as highlighted by the Concordia study it is also critical that information presented to a user is accurate – particularly where it may have a meaningful impact on whether the user consents to a practice (such as whether data is processed on-device).
This problem – how to communicate complex practices in an accessible manner – is not unique to immersive technology. In fact, it may be shared by most (if not all) emerging technologies that rely on consent as their basis to process personal information. However, given the potential sensitivity of the information in question, and the potential harms to individuals if it is misused, it is a problem that must be addressed effectively.
The bystander problem needs to be considered
For many applications, immersive technologies require an understanding of the user’s environs. As both CIPPIC and Option Consommateurs researchers point out, this means that information about other individuals in the surrounding area – “bystanders” – is likely to be captured by these systems. At minimum, this raises the challenge that there is no realistic way for an immersive technology user to obtain consent from, or provide consent on behalf of, these bystanders for collection of their personal information. The researchers also note that this kind of collection can lead to a blurring of private and public spaces.
Again, this is not an issue that is unique to immersive technology. Smart home technologies, smart and connected vehicles, and even emerging AI “companion” devices will often be equipped with cameras or other sensors that are meant to capture information about an individual’s environment – and, as CIPPIC notes, the application of current Canadian privacy law to these practices may not always be clear. It is thus an area to which developers, regulators (including the OPC) and legislators may need to turn their attention in the near future.
These are but three of the areas for further exploration raised by the Contributions Program reports. It will be important that government, regulators, civil society and organizations developing or deploying immersive technologies continue to work to address the issues that have been identified – or will emerge – as work continues in this area.
Next steps
Immersive technology and its impact on privacy is an area of increasing interest for regulators such as the OPC. Going forward, we will follow legal, regulatory, and technical developments in this space to ensure that we are ready to provide guidance on, and enforce, the application of Canada’s privacy laws.
To highlight some examples of current and on-going work, we would point to regulators such as the data protection authorities of SpainFootnote 6 and the United KingdomFootnote 7, academic and civil society researchers such as those profiled above, and interest groups such as the Future of Privacy ForumFootnote 8 and the Extended Reality Safety InstituteFootnote 9, among many others. We are also actively working with our international counterparts on this topic through the International Working Group on Data Protection in TechnologyFootnote 10 (the “Berlin Group”).
We look forward to continuing to advance our Strategic Priorities through examination of this and other data-centric technologies.