Consolidated Issue Sheets on the Study of Bill S-209
OPC Views on Age Assurance
Speaking Points
- The OPC has been actively engaging on age assurance, including through international collaboration with other privacy authorities and engaging Canadian stakeholders through an exploratory consultation.
- We believe that age assurance is a valid means of advancing the goal of creating safer online experiences for young people. However, considering the significant potential privacy impacts that age assurance can have, it must be (i) designed to be privacy protective and (ii) used in a risk-based manner.
- Notably, age-assurance systems – especially if they are made mandatory to access certain content or services – must not result in increased online surveillance or disclosures of sensitive information.
- I believe that this is a realistic outcome, and my Office is currently developing guidance documents on assessing when age assurance should be used and how it can be designed to be privacy protective.
Background
- In our exploratory consultation, we put forward the preliminary position that the use of age assurance should:
- be restricted to situations that pose a high risk to the rights of children;
- not permit information collected for age verification to be used for any other purposes;
- be designed in accordance with relevant industry standards and guidance from regulators, including my office; and
- be subject to effective oversight.
- In the “What We Heard” report following the consultation, we noted that we would nuance the first of these positions. Specifically, rather than an outright restriction to ‘high-risk’ situations, we would now take the position that a risk-based approach is needed to ensure that the privacy impacts of age assurance are proportionate to the harms being addressed, thereby accommodating all potentially appropriate uses of age assurance.
LEAD: PRPA
Privacy Risks Related to Age Assurance
Speaking Points
- The privacy risks associated with age assurance primarily come from two sources: the information collected to perform it, and what it can enable or lead to.
- Depending on the form of age assurance being used, information collected from an individual could include IDs, biometric images or behavioural patterns – all of which could, without proper protections, be used for secondary purposes or compromised through data breaches.
- On top of this, without proper design, age assurance could enable a website to identify formerly anonymous users, or lead to individuals having their online activities tracked across multiple services.
- Such risks can be mitigated by considering privacy throughout the design and operation of an age-assurance system and by applying innovative privacy-enhancing technologies – and such mitigations can be made mandatory by requiring them in law.
Background
- Privacy protections referenced in S-209 include ensuring that age assurance methods:
- Maintain user privacy and protect user personal information;
- Collect and use personal information solely for age assurance purposes;
- Limit the collection of personal information to what is strictly necessary;
- Destroy personal information once age assurance is completed;
- Generally comply with best practices in the field of privacy protection.
- When S-210 was before this Committee, questions were asked about whether individuals’ personal information (including government-issued IDs) might be stolen and used for cybercrime, and whether age verification would create opportunities for the government to collect or supervise the collection of sensitive personal information.
- In short, both are risks, but such risks can be mitigated (for instance, by limiting what is collected and retained, and by preventing tracking).
LEAD: PRPA
Comparison of Bill S-209 and S-210
Speaking Points
- Bill S-209 is similar to its predecessor, Bill S-210, but there are some key differences between the two.
- Bill S-209 expands the list of privacy-related criteria for prescribed age verification or estimation methods and requires the Governor in Council to ensure these criteria are met rather than merely considered.
- The latest Bill applies to a narrower range of online content than its previous iteration. For example, Bill S-209 focuses on visual pornographic materials, whereas Bill S-210 would have also applied to sexually explicit material in audio and written form.
- Bill S-209 also allows for the use of either age verification or estimation technologies to prevent children from accessing pornography, while the previous Bill only mentioned age verification.
Background
- There have been two previous iterations of Bill S-209: Bill S-203 (introduced in 2020) and Bill S-210 (introduced in 2021). Both died on the Order Paper.
- Bill S-209 states that the designated methods of age-verification or estimation must meet the criteria set out in s. 12(2) of the Bill, while Bill S-210 would have only required the Governor in Council to consider these criteria. Bill S-209 also expands on the list of applicable criteria including a new requirement to limit the collection of personal information to what is strictly necessary. The OPC had recommended this addition in its comments on Bill S-210.
- Bill S-209 uses the term “pornographic material,” which is defined in s. 2 of the Bill, whereas Bill S-210 used the term “sexually explicit material” as defined in s. 171.1(5) of the Criminal Code for the purpose of the offence set out in s. 171.1(1) of the Criminal Code of making sexually explicit material available to a child.
- Bill S-209 contains a new provision (s. 6), which clarifies that it is not an offence for an organization to provide services that search, transmit, download, store or access pornographic material “incidentally and not deliberately.” It is unclear whether this will meaningfully limit the scope of the offence.
LEAD: LEGAL
Scope of application
Speaking Points
- S-209 would potentially require age assurance for a broad range of websites as pornography is widely available on search engines and social media sites such as Google, X, and Reddit. These companies make money from advertising, which likely satisfies the commercial purpose requirement of the offence.
- The Bill’s scope of application could potentially be narrowed by section 6, which clarifies that it is not an offence to provide certain services “incidentally and not deliberately,” or paragraph 12(1)(a) of the Bill, which would allow the Governor in Council to clarify circumstances that would not constitute an offence in regulations.
- Despite these changes, ambiguity remains with respect to the scope of the offence in section 5 of the Bill. Careful consideration will have to be given, during the regulation-making process, to what services should not be included in the scope of the offence, potentially through explicit exemptions for certain types of websites (e.g., search engines).
- There is a trade-off between effectiveness and risk. If the Bill applies widely, it may be more effective at protecting young people, but this also heightens risks to privacy and other human rights such as freedom of expression.
Background
- Section 5 of the Bill makes it a summary offence for an organization to make pornographic material available to young persons online for commercial purposes.
- Section 6 of the Bill clarifies that it is not an offence under section 5 for an organization to “incidentally and not deliberately” provide a service that is used to search for, transmit, download, store or access content on the Internet that is alleged to constitute pornographic material. It is unclear whether social media sites, search engines or streaming services would meet the “incidentally and not deliberately” requirement.
- Section 12 (1)(a) authorizes the Governor in Council to make regulations specifying when pornographic material is or is not to be regarded as made available for commercial purposes.
LEAD: LEGAL
Definition of “Young Person”
Speaking Points
- Bill S-209 makes it an offence to make pornographic material available online for commercial purposes to a “young person,” which is defined as a person under 18 years of age.
- Provinces and territories define the age of majority in their respective jurisdictions, which is set at either 18 or 19 years of age.
- The definition of “young person” in Bill S-209 is consistent with the definition of “child” that was used in Bill C-63 and the definition of “minor” in Bill C-27, two federal bills introduced in the previous Parliament that died on the Order Paper.
Background
- Bill C-63 (Online Harms Act) and Bill C-27 (Consumer Privacy Protection Act), which were both introduced under the previous Parliament and died on the Order Paper when an election was called, did not use the term “young person.” However, they adopted the same age threshold as Bill S-209. Bill C-63 used the term “child”, which was defined as a person under 18 years of age. Bill C-27 used the term “minor”, which was initially not defined.
- The House of Commons Standing Committee on Industry and Technology amended Bill C-27 during clause-by-clause review to define a minor as an individual under 18 years of age.
- The Canadian Bar Association recommended to the Senate Committee on Legal and Constitutional Affairs that the definition of “young person” in the previous iteration of this Bill – Bill S-210 – be changed to a person under 16 years of age, to align with the age of consent in the Criminal Code.
- Federal legislation does not typically define who is or is not a minor, as the age of majority is specified by provinces or territories. However, there are some exceptions where the federal government will define an age of majority in specific contexts (e.g., there is a definition of age of majority in the Divorce Act).
- Since Bill S-209 is made under the federal criminal law power, it is appropriate for the federal government to define young person for the purpose of this Bill.
LEAD: LEGAL
Change from “Sexually Explicit Material” to “Pornographic Material”
Speaking Points
- Bill S-209 uses the term “pornographic material,” whereas its previous iteration, Bill S-210, referred to “sexually explicit material.”
- Pornographic material is defined in Bill S-209. In contrast, the previous Bill used a definition of sexually explicit material taken from the Criminal Code offence for making sexually explicit material available to a child.
- The definition of pornographic material used in the current Bill is narrower than the definition of sexually explicit material used in the previous Bill. Bill S-209 applies to visual pornographic materials only, whereas Bill S-210 included audio and written materials.
- The change in definitions modestly narrows the scope of content that age verification or estimation will be required for. This means that less online content will be affected by the privacy risks inherent to these technologies.
Background
- Bill S-210 used the definition of sexually explicit material set out in s. 171.1(5) of the Criminal Code for the purpose of the offence set out in s. 171.1(1) of the Criminal Code of making sexually explicit material available to a child.
- Section 2 of Bill S-209 defines pornographic material as “any photographic, film, video or other visual representation…the dominant characteristic of which is the depiction, for a sexual purpose, of a person’s genital organs or anal region or, if the person is female, her breasts...”
- In contrast, the Criminal Code definition used in Bill S-210 defines sexually explicit material to include not only visual representations, but also “written material whose dominant characteristic is the description, for a sexual purpose, of explicit sexual activity with a person” and “an audio recording whose dominant characteristic is the description, presentation or representation, for a sexual purpose, of explicit sexual activity with a person.”
- Child pornography as defined in subsection 163.1(1) of the Criminal Code is excluded from both the definition of pornographic material in Bill S-209 and sexually explicit material in Bill S-210.
LEAD: LEGAL
Offence provisions
Speaking Points
- As drafted, the offence provision in section 5 of the Bill would capture a wide range of online content as pornographic material is widely available on many websites that make money from advertising, such as search engines or social media websites.
- The Bill would essentially require many websites to use prescribed age verification or estimation methods. Only organizations using these methods could, if charged under section 5, rely on the defence that they believed that the person accessing pornographic material was an adult.
- Sections 6 and 12 of the Bill could limit the scope of the offence. However, the impact of these provisions is uncertain as it is not clear what is meant by “incidentally and not deliberately” in section 6, and the impact of section 12 would depend on regulations.
- As Senator Miville-Dechêne stated at Second Reading, determining the precise scope of this Bill is a delicate task. I agree and emphasize the need for careful consideration in the regulation-making process about the Bill’s scope, including a potential explicit exemption of specific types of websites (such as search engines).
Background
- Section 5 of the Bill makes it a summary offence for organizations to make pornographic material available online to young persons for commercial purposes. Section 7 creates defences for organizations that: (1) use prescribed age verification or estimation methods; (2) have legitimate purposes related to science, medicine, education or the arts; or (3) comply with a regulatory notice.
- Section 6 clarifies that it is not an offence to “incidentally and not deliberately” provide a service used to search for, transmit, download, store or access online content that is alleged to constitute pornographic material. It is unclear whether social media sites, search engines or streaming services would meet the “incidentally and not deliberately” requirement.
- Section 12(1)(a) authorizes the Governor in Council to make regulations specifying the circumstances in which pornographic material is or is not to be regarded as made available for commercial purposes.
LEAD: LEGAL
Criteria for prescribed age-verification or age-estimation methods
Speaking Points
- Bill S-209 includes two new criteria for prescribed methods of age verification or estimation. They must: (1) be conducted by a third-party that deals at arm’s length from the organization making pornographic material available online, and (2) only collect personal information that is strictly necessary for verifying or estimating age.
- I am pleased that the principle of necessity has been added to the criteria. My office previously advocated for this change, along with a reference to the principle of proportionality, which has not been added.
- The requirement that only third-party organizations conduct age verification or estimation could potentially enhance privacy, particularly if the organization verifying or estimating age cannot share the identity of the user with the website making pornography available.
- Another valuable change is that the previous iteration of this Bill only required the Governor in Council to consider privacy-related criteria before prescribing a method of age verification. Under Bill S-209, it must ensure that a prescribed method meets the criteria.
Background
- Section 12(2) of Bill S-209 sets out seven criteria that the Governor in Council must ensure are met before prescribing a method of age verification or estimation. The two new requirements are set out in s. 12(2)(b) (third party dealing at arm’s length) and s. 12(2)(e) (necessity).
- Another change to the criteria compared to the previous version of the Bill is that section 12(2)(a) was revised to require the method to be “highly effective” instead of “reliable.” This updated wording is more closely aligned with the Federal Court’s language in Turner v. Telus Communications Inc., 2005 FC 1601 (at para 48).
LEAD: PRPA
Forms of Age Assurance
Speaking Points
- Age assurance is an umbrella term for a range of approaches to determining an individual’s age, including age verification, estimation, declaration, and the newly introduced term “age inference.”
- While the accuracy rates associated with methods in these sub-categories will vary, many regulators have found that each – except age declaration – can meet the criterion of being “highly effective” if correctly designed and implemented.
- Each form of age assurance will have different privacy implications for individuals, but each can be designed in a way that prioritizes privacy.
- Allowing individuals the choice between different forms of highly effective, privacy-protective age assurance can lessen the potential that a person will forego accessing a service because they are uncomfortable with providing a specific type of personal information.
Background
- Bill S-209 differs from Bill S-210 in adding age estimation as a potentially valid form of age assurance (alongside age verification).
- The upcoming ISO standard on age assurance (ISO 27566) creates separate terms for age estimation (estimating age based on biometric characteristics) and age inference (inferring a person’s age or age range based on verifiable facts, such as transaction history or service usage). Age inference is not yet a commonly used term.
- Age verification involves calculating a person’s age or age range based on verifying their date or year of birth (for instance, through the provision of government-issued ID); age declaration involves an individual declaring their age or age range without providing additional verification.
- In the “What We Heard” report published following our age assurance exploratory consultation late last year, we cited an academic and an age assurance service provider, both of whom noted that preferred methods of age assurance vary between individuals. The service provider stated: “What might be one person’s preference might not be possible for another individual, nor may they feel comfortable providing the required information.”
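The data-minimization point behind age verification can be illustrated with a short sketch: once a date of birth has been verified (for instance, from government-issued ID), the system need only derive a yes/no answer against the relevant threshold and can discard the birth date itself. This is an illustrative sketch only, not a method prescribed by the Bill, and the function name is hypothetical:

```python
from datetime import date

def meets_age_threshold(date_of_birth: date, threshold_years: int, today: date) -> bool:
    """Return True if the person is at least `threshold_years` old on `today`.

    Only this boolean needs to be retained or shared; the date of birth
    can be discarded after the check, minimizing what a relying website
    ever learns about the individual.
    """
    # Compute age in whole years, accounting for whether the person's
    # birthday has occurred yet this year.
    age = today.year - date_of_birth.year
    if (today.month, today.day) < (date_of_birth.month, date_of_birth.day):
        age -= 1
    return age >= threshold_years

# A person born 2010-06-15 checked against an 18+ threshold on 2025-01-01.
print(meets_age_threshold(date(2010, 6, 15), 18, date(2025, 1, 1)))  # False
print(meets_age_threshold(date(2000, 6, 15), 18, date(2025, 1, 1)))  # True
```

The design point is that the boolean, not the birth date, is the only output the rest of the system ever sees.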
LEAD: PRPA
Use of Third-Party Intermediaries
Speaking Points
- The use of third-party intermediaries is becoming increasingly common – and in some cases mandated – to meet age assurance requirements.
- Separating the parties that issue proof-of-age and rely on that proof-of-age for access to services or content can have significant privacy benefits.
- Designed appropriately, such an approach can ensure that the age assurance process does not result in a website knowing the identity or exact age of the person visiting it, and that the third-party intermediary does not know what sites the individual visits.
- However, given how invasive these systems could be if a third party does not meet privacy requirements, Canadians will need to have a way to determine which third parties have designed their services in a privacy-protective manner and are otherwise trustworthy.
Background
- Bill S-209 adds the criterion that a prescribed age assurance method must be “operated by a third-party organization that deals at arm’s length from any organization making pornographic material available on the Internet for commercial purposes” (s. 12(2)(b)).
- In general, third-party intermediaries will use one of multiple age assurance processes to validate an individual’s age, and then either (i) provide them with a reusable age credential that the individual can use at any site which accepts it, or (ii) send a ‘yes/no’ signal to a site at the direction of the individual to indicate whether they are above or below a given age.
- A technical requirements document for age assurance systems developed by the French communications regulator (Arcom) and approved by the CNIL includes, as requirement 1, the independence of the age verification system provider from the covered services distributing pornographic content [translation].
- In a 2022 guidance document on age assurance, the CNIL recommended “the use of a trusted independent third party to prevent the direct transmission of identifying data about the user to the site or application offering pornographic content”.
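The “yes/no signal” pattern described above can be sketched in a few lines: the intermediary signs a minimal claim (over 18: yes/no) that the relying website can verify without ever learning who the individual is. This is a simplified illustration using a shared HMAC key; real deployments would use standards-based tokens with public-key signatures, and all names here are hypothetical:

```python
import hashlib
import hmac
import json
import secrets

# Key shared by the age-assurance provider and the relying website. A real
# deployment would use asymmetric signatures so the site holds no secret.
SIGNING_KEY = secrets.token_bytes(32)

def issue_age_signal(over_threshold: bool, threshold: int) -> dict:
    """Provider side: emit a signed claim containing ONLY the yes/no result.

    No name, date of birth, or ID document is included, so the relying
    website learns nothing about the individual's identity.
    """
    claim = json.dumps({"over": over_threshold, "threshold": threshold})
    tag = hmac.new(SIGNING_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_age_signal(signal: dict, threshold: int) -> bool:
    """Website side: accept only an authentic 'over threshold' claim."""
    expected = hmac.new(SIGNING_KEY, signal["claim"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signal["tag"]):
        return False  # forged or tampered signal
    parsed = json.loads(signal["claim"])
    return parsed["threshold"] == threshold and parsed["over"]

signal = issue_age_signal(True, 18)
print(verify_age_signal(signal, 18))  # True: authentic over-18 signal
```

The privacy property comes from what is absent: the signed claim carries no identifier, so the website cannot link the signal back to a person, and the provider need not be told which site requested it.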
LEAD: PRPA
Use of Facial Age Estimation for Age Assurance
Speaking Points
- Facial age estimation involves the analysis of an image of an individual’s face to approximate their age or age range. Many age assurance service providers offer this option, and some regulators have determined that it can be “highly effective.”
- As described in the “What We Heard” report drafted by my Office following our exploratory consultation on age assurance, opinions vary significantly on the privacy implications of facial age estimation.
- Some suggest that it is an invasive collection of sensitive information subject to bias based on gender or skin-tone, while others argue that done correctly it can be a more privacy-protective option.
- I continue to believe that facial age estimation can be implemented in a privacy-protective and sufficiently accurate manner. However, certain key steps must be taken – including that no unique “faceprint” is created during the process, facial images are deleted immediately after processing, and individuals who do not want to provide facial images are given other options to prove their age.
Background
- Facial recognition will also often be used as part of an age verification process, to ensure that a person is a match to the ID they are presenting. However, when “face scans” are discussed it is generally in reference to facial age estimation.
- Age assurance service provider Yoti self-reports that when given a choice between age assurance methods on their platform over 80% of individuals choose facial age estimation, and that over 700 million such estimations have been completed since it launched the service in December 2018.
- The UK’s Communications Regulator (Ofcom) lists facial age estimation as one of seven kinds of age assurance that are “capable of being highly effective”, and the Australian Age Assurance Technology Trial described age estimation (including, but not specific to, facial age estimation) as a “practical and scalable method of age assurance, particularly effective for threshold-based decisions (e.g., determining whether a user is likely over 13, 16 or 18).”
LEAD: PRPA
Use of VPNs to Bypass Requirements
Speaking Points
- Virtual private networks (VPNs) can keep individuals’ online activities private and secure by encrypting their data, protecting them from online trackers, and obscuring their location.
- However, this same technology could enable minors to bypass age assurance on a website by appearing to connect from a country without such requirements. Content providers (such as video streaming sites) face similar challenges when seeking to restrict access to certain content based on region.
- VPNs are an important privacy tool for many individuals; we believe that they can co-exist with age assurance requirements and policy intentions. For instance, VPNs would have little impact on the goal of reducing unintended exposure to harmful information.
- As emphasized by respondents to our exploratory consultation on age assurance, the existence of workarounds does not necessarily render age assurance ineffective – but it may be a consideration in determining what methods are deemed “highly effective.”
Background
- VPNs operate by establishing a secure connection between a user’s device, a VPN service provider, and the Internet. A VPN will encrypt Internet traffic, mask the user’s real IP address, and route it through a server in a location of the user’s choice before connecting (e.g., to a website).
- Following the July 2025 implementation of the age assurance requirements in the United Kingdom’s Online Safety Act, a spike was seen in the number of downloads of VPNs. However, a similar spike was seen in downloads of age assurance apps (such as Yoti).
- As part of its “robustness” criterion for age assurance methods, Ofcom – the UK’s regulator for communications services – states that service providers should not encourage users to circumvent age assurance (such as by providing information about or links to VPNs).
- Some academics have flagged that, as free VPNs will often collect and sell personal information, their use to circumvent age assurance requirements may create additional privacy challenges for those unable to pay for a VPN.
LEAD: PRPA
Encryption as an Effective Safeguard
Speaking Points
- Encrypting the personal information that is transmitted and stored during an age assurance process is an effective way to mitigate certain privacy risks, such as an individual’s ID or biometric information being intercepted by a bad actor or exposed in a data breach.
- It is likely that such use of encryption would fall under the criterion of “best practices in the field of privacy protection” described in s. 12(2)(g) of S-209.
- Encryption can also be a fundamental part of the cryptographic techniques that underpin some privacy-protective age assurance methods, such as the “zero-knowledge proof” approach designed by France’s CNIL.
- Encrypting data is not by itself sufficient to mitigate all privacy risks, though. As such, while encryption should be considered an important privacy protection, it should be seen as one among many.
Background
- A “zero-knowledge proof” in relation to age assurance is one in which neither the relying website nor the age assurance provider learns the identity of the individual.
- The effectiveness of encryption depends on several factors, including the chosen encryption algorithms, key length, key management schemes, and their implementations. The integrity of the underlying system where encryption is used is also paramount to ensuring confidentiality, integrity, and authenticity.
- Encryption can be used to protect data stored (“at rest”) or sent (“in transit”) between user devices and servers.
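As a minimal illustration of encryption at rest, the toy sketch below uses a one-time pad (XOR of the data with a random key of equal length): without the key, the stored ciphertext reveals nothing about the underlying record. This is purely illustrative; production systems rely on vetted, authenticated schemes such as AES-GCM through audited libraries, and, as noted above, key management is a central factor in whether the protection holds:

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte.

    A one-time pad is secure only if the key is truly random, as long as
    the message, and never reused -- which is exactly why real systems
    use authenticated schemes like AES-GCM instead of this toy.
    """
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

# Encrypting a stored age-assurance record. XOR is its own inverse,
# so applying the same function with the same key decrypts.
record = b'{"over": true, "threshold": 18}'
key = secrets.token_bytes(len(record))
ciphertext = otp_encrypt(record, key)

# Only the key holder can recover the original record.
assert otp_encrypt(ciphertext, key) == record
```

The sketch also shows the point made above about residual risk: whoever holds the key can read the record, so encryption shifts the problem to protecting keys rather than eliminating risk outright.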
LEAD: PRPA
OPC Actions to Protect and Promote Children’s Privacy
Speaking Points
- Ensuring that children’s privacy is protected and that young people understand and are able to exercise their privacy rights is one of my key strategic priorities.
- My strategic plan details specific areas of focus and various initiatives we are undertaking to help advance this priority.
- For example, my Office has applied a children’s privacy lens to compliance work, including by making children’s privacy a focus in my office’s investigation into TikTok.
- We are also engaging with young people and funding youth-specific research to better understand their views to help guide our work. For example, my Office recently announced plans to launch a Youth Advisory Council.
- In June, I welcomed youth leaders, stakeholders from various sectors and data protection authorities from around the world for the 2025 OPC Symposium on youth privacy in a digital age. We explored issues such as AI, deceptive design, educational technologies and their impact on young people.
Background
- The OPC is hosting an event in November with youth from the Ottawa area to better understand their views on the concept of the best interests of the child, in partnership with researchers from McGill and the University of Ottawa.
- OPC’s Youth Advisory Council will comprise seven individuals aged 13 to 17 from across Canada. We received 61 applications, have completed interviews, and plan to inform successful candidates in the next few weeks, following Commissioner approval.
- Children’s privacy was a primary theme for our office’s 2024-2025 call for proposals for the contributions program. We funded four projects, including on young people’s perspectives on AI and on young children and online gaming.
LEAD: PRPA
OPC Age Assurance Exploratory Consultation
Speaking Points
- From June to September 2024, my Office ran an exploratory consultation on age assurance. In it, we set out our preliminary positions on, and understanding of, age assurance technologies and the privacy implications of their use, and sought feedback with the goal of increasing our knowledge and prompting meaningful discussion of the topic.
- We received 40 responses representing a wide variety of stakeholder groups, including industry, civil society, academia, policy think-tanks, individual Canadians, and international data protection authorities.
- These responses – as well as our monitoring of on-going international developments in this space – have helped us to evolve our position on the topic, including through a greater recognition of age assurance as a meaningful tool to help achieve the goal of creating safer online experiences for young people.
- We are currently working on our next step, drafting guidance on assessing when age assurance should be used, and how to design age assurance in a privacy-protective manner.
Background
- Six themes were identified in responses to the consultation:
- Differentiate between forms and uses of age assurance.
- The impacts associated with the use or misuse of age assurance should not be underestimated.
- Age assurance is not the goal; it is a way to achieve the goal of a safer online experience for young people.
- Consider who should be responsible for age assurance.
- Age estimation deserves special caution – or could be preferable to age verification.
- The use of age assurance should be subject to a risk-based assessment.
LEAD: PRPA
TikTok and Age Assurance
Speaking Points
- On September 23, 2025, I, along with my counterparts in Quebec, British Columbia and Alberta, issued the findings of a joint investigation into TikTok’s privacy practices, in particular as they relate to children.
- Our investigation found serious deficiencies in TikTok’s age assurance mechanisms, which allowed hundreds of thousands of children in Canada under the age of 13 to access TikTok each year, contrary to its own terms and policies.
- TikTok’s age assurance measures relied on age self-declaration and analysis of posted content. The latter measure is not effective for identifying underage users who watch videos but do not post content.
- While TikTok used sophisticated tools to analyse biometrics and usage patterns to estimate the age ranges of users for various business purposes, it did not use this information to keep children off its platform.
- As a result of our investigation, TikTok committed to, among other things, developing and implementing enhanced, demonstrably effective and proportionately privacy-protective age assurance measures.
- TikTok confirmed that these measures will leverage non-identifying facial analytics, as well as analysis of user-posted content and behaviour on the site, to estimate the age of a user and flag suspected underage accounts for potential removal.
- Our Offices will work with TikTok throughout the development and implementation of these measures to ensure that they are satisfactory.
Background
- From 2021 to 2023, TikTok removed, on average, approximately 500,000 accounts per year belonging to children in Canada under the age of 13 who had bypassed the age gate.
- TikTok was unable to provide data about how long, on average, underage users were able to use the platform before being identified and removed.
- TikTok has committed to providing PIAs and testing results for the proposed age assurance measures, for our Offices’ review.
LEAD: PRPA
Canadian Standard on Age Assurance Technologies
Speaking Points
- This recently published standard specifies minimum requirements for age assurance technologies and methods to estimate or verify a person’s age or age range in order to restrict access to services to suitable age groups, taking into account the rights and needs of minors.
- Privacy plays an important role in the standard, with the first requirement being to conduct a comprehensive privacy impact assessment (PIA) to identify privacy risks.
- Other notable requirements include: a Child Rights Impact Assessment (CRIA) to assess potential effects on minors’ rights; privacy by design; privacy by default; security by design; and proportionality and data minimization principles.
- While I am not in a position to state that compliance with this standard would meet an organization’s obligations under PIPEDA, I nonetheless consider it a useful resource for organizations looking to ensure that they take a privacy-protective approach to age assurance.
Background
- The standard was published in August 2025 by the Digital Governance Standards Institute, which is an organization accredited by the Standards Council of Canada. The OPC was an observer during the development process for this standard, but did not participate in the vote during which it was adopted.
- The standard gives priority to minors’ best interests over the interests of other stakeholders, stating that “[w]here conflicts of interests […] emerge, the best interests of the minor shall be a primary consideration”.
- The standard references UNICEF’s CRIA Toolbox, which uses a methodology based on the United Nations Guiding Principles on Business and Human Rights (UNGPs) to assess how a company’s products, services and operations impact all child rights, as detailed in the Convention on the Rights of the Child and other relevant human rights instruments.
LEAD: COMPLIANCE
Status of Digital ID Implementation in Canada
Speaking Points
- Many privacy-protective age verification solutions rely on individuals being issued a digital credential containing information about their date of birth. These credentials can be issued by private or public sector entities and are generally held in “digital wallets.”
- Once such a credential is established, the individual can authoritatively prove their age to a website or online service without disclosing additional identity information.
- While some jurisdictions, such as the EU, are actively advancing government-led digital identity or wallet projects, to our knowledge Canada does not have a similar initiative underway at the federal level (though both British Columbia and Alberta have introduced digital wallets provincially).
- Should Canada pursue such a project, from a privacy perspective it can be guided by a resolution issued in 2022 by my Office as well as my provincial and territorial counterparts on ensuring the right to privacy and transparency in Canada’s digital identity ecosystem.
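The credential model described above can be illustrated with a minimal sketch. This is a hypothetical example, not any actual wallet implementation: the issuer places only a yes/no claim in the credential, and the website verifies the issuer’s signature without ever seeing a birth date or identity document. A real system would use asymmetric digital signatures and a standard credential format; the HMAC shared key here is a stand-in for illustration only.

```python
import hmac
import hashlib
import json

# Placeholder for the issuing authority's signing key (illustrative only).
ISSUER_KEY = b"demo-issuer-key"

def issue_age_credential(over_18: bool) -> dict:
    """Issuer puts only a yes/no claim in the credential: no name, no date of birth."""
    claim = json.dumps({"over_18": over_18}).encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "tag": tag}

def verify_age_credential(cred: dict) -> bool:
    """Verifier checks the issuer's signature and learns only the boolean answer."""
    expected = hmac.new(ISSUER_KEY, cred["claim"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cred["tag"]):
        raise ValueError("credential was not issued by a trusted authority")
    return json.loads(cred["claim"])["over_18"]

cred = issue_age_credential(True)
print(verify_age_credential(cred))  # -> True: the site learns age status, nothing else
```

The key design point is that the relying website receives an authoritative answer while the credential itself carries no additional identity information.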
Background
- The EU has mandated that all member states make at least one digital wallet available by the end of December 2026, per the European Digital Identity Regulation (or eIDAS 2.0) adopted in May 2024.
- Many jurisdictions in Canada have issued a digital credential that allows access to online government services, including Newfoundland and Labrador (MyGovNL), Nova Scotia (MyAccountNS), Ontario (ONe-key), Yukon (MyYukon), Quebec (clicSÉQUR), and the Northwest Territories (NTKey). However, these do not serve as age credentials.
- Similarly, Canadians can authenticate themselves to the Canada Revenue Agency using digital credentials issued by BC or Alberta or through online banking credentials.
- In addition to Alberta and BC, Quebec is actively pursuing a digital-ID system that would issue credentials (identity, photo, and address) to be stored in a digital wallet.
LEAD: PRPA
Foreign Laws Limiting Children’s Access to Pornography
Speaking Points
- The legal and regulatory landscape on the issue of limiting children’s access to pornography is rapidly evolving. Several jurisdictions have laws or regulatory frameworks effectively requiring age assurance to access pornographic material.
- Bill S-209 would impose age assurance requirements on a broader range of online content than some other jurisdictions do. For example, Texas and Utah only require age verification for sites with a certain threshold of pornographic content, and Ireland only requires age assurance for video-sharing platforms whose EU headquarters are in Ireland.
- Unlike Bill S-209, some jurisdictions impose different requirements for age verification or estimation depending on the type of website. For example, the EU places more onerous obligations on large websites, and Australia has developed industry codes that impose age assurance requirements in specific contexts.
Background
- In Texas and Utah, age verification is required where more than one third of a website’s material is sexual material that is harmful to minors (Texas H.B. No. 1181, Sec. 129B.002; Utah Code §§ 78B-3-1001 and 78B-3-1002).
- In the EU, very large online platforms and very large online search engines (those with more than 45 million monthly active EU users) have additional obligations to protect minors, including measures such as age verification and parental controls (Digital Services Act, Art. 35(1)(j)).
- The Irish Online Safety Code only applies to video-sharing platforms with EU headquarters in Ireland (Online Safety Code, s. 2; Online Safety and Media Regulation Act 2022, Section 2B).
- Australia’s eSafety Commissioner has registered industry codes that will, once in effect, require age assurance to access pornography (known as Class 1C and Class 2 material) in certain contexts, e.g., search engines must implement age assurance for account holders (see Industry Codes for Class 1C and 2 Material).
LEAD: LEGAL
International Best Practices
Speaking Points
- Guidance and best practices for privacy-protective age assurance have been emerging in recent years.
- Data protection authorities in Spain, France, and the United Kingdom (among others) have all published important guidance on the topic, and Spain, France, Greece and the European Commission have all developed proofs of concept or prototype privacy-protective age verification solutions.
- New or forthcoming international standards on age assurance, such as those from Canada’s Digital Governance Standards Institute and the International Organization for Standardization (ISO), also refer to key privacy protections.
- I am encouraged by the work being undertaken in this area, which has further assured me that prioritizing privacy in age assurance is a practical reality and should be an expectation.
Background
- Common privacy best practices within this work include that:
- Online services requiring age assurance should learn as little as possible about the individual (ideally, only a ‘yes/no’ about whether they are within the desired age range);
- No party, including the age assurance service provider, should be able to track an individual’s online activity based on information generated during the age assurance process;
- Information collected for age assurance purposes should not be re-used for other purposes, and should be deleted after processing; and,
- The collection of personal information should be limited to what is necessary.
- Referenced standards for age assurance:
- ISO/IEC WD 27566 – Age assurance systems (completed but not yet published);
- DGSI 127 – Age Assurance Technologies.
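The ‘learn as little as possible’ practice listed above can be sketched in a few lines. This is a hypothetical helper, not drawn from any standard named here: it takes in only a birth date, answers only whether the person meets a threshold, and keeps no record of the input.

```python
from datetime import date

def assure_age(date_of_birth: date, minimum_age: int = 18) -> bool:
    """Return only a yes/no answer; the birth date is not stored or logged.

    Illustrative only: no identifier is collected, nothing is retained
    after processing, and the sole output is whether the person meets
    the minimum age -- the 'yes/no' outcome the best practices call for.
    """
    today = date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= minimum_age
```

In a deployed system the same minimization principle applies regardless of method: whatever is collected to make the determination is discarded once the yes/no answer is produced.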
LEAD: PRPA
International Age Assurance Working Group
Speaking Points
- My Office is a member of an international age assurance working group chaired by the UK Information Commissioner’s Office. We have been participating in the group since July 2022.
- Last September, this group published a statement setting out several principles for shared expectations related to data protection and privacy when using age assurance.
- These principles include that age assurance must be compliant with data protection laws; be fair, transparent and non-discriminatory; and should be implemented in the best interests of the child while guaranteeing all users’ fundamental right to access information from the Internet.
- International collaboration on age assurance is essential given the challenge of developing principles, codes, and standards for websites and services that are accessible across multiple jurisdictions.
Background
- The International Age-Assurance Working Group meets 2-3 times a year.
- The objectives of the working group are to:
- share information on approaches to age assurance among data protection authorities, including their accuracy and efficacy;
- promote the harmonization of policy approaches to age assurance where possible; and
- develop a joint statement on age assurance.
- Other group members include data protection authorities from Australia, Belgium, Ireland, France, Germany, Spain, Portugal, Slovenia, Cyprus, Morocco, USA, Mexico, Singapore, and Japan.
- Other statement signatories are the United Kingdom, Philippines, Argentina, Mexico, and Gibraltar.
LEAD: PRPA
Australian Age Assurance Technology Trial
Speaking Points
- In November 2024, the Australian Government commissioned a study to “evaluate the effectiveness, reliability and privacy impacts of a range of age assurance technologies.” The study examined over 60 technologies from 48 vendors.
- The final report, published in August 2025, found that age assurance systems “can be private, robust and effective”, and identified “careful, critical thinking by providers on the development and deployment of age assurance systems, considering efficacy, privacy, data and security concerns.”
- The report did note, though, that “continued attention to privacy compliance will support long-term trust and accountability.”
- While my Office has not independently verified these results, and the report explicitly notes that its findings apply only to the Australian context, we are nonetheless encouraged that privacy-protective age assurance appears to be possible with current technology.
Background
- The trial was commissioned by the Australian Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts and undertaken by a group led by representatives from the UK-based Age Check Certification Scheme.
- The findings of this study reverse a 2023 finding, also from Australia, that then-available age assurance technologies were not sufficiently mature to work reliably, be comprehensively applied, and balance privacy and security.
- As noted in our exploratory consultation, based on the 2023 finding Australia opted not to implement age assurance for access to pornographic websites. However, since then Australia has:
- Developed industry codes (some of which are registered, some of which are under review) which will require various online services to implement age assurance in relation to harmful materials (including pornography); and,
- Passed a law requiring “reasonable steps” be taken to prevent youth under 16 from holding social media accounts.
LEAD: PRPA
Stakeholder Views on S-209 and Similar Legislation
Speaking Points
- Bill S-209 and similar proposals have generated vigorous public debate about how to protect children from age-inappropriate content without infringing the rights of other Internet users, including the right to privacy.
- Many stakeholders point out that the potential privacy impacts of age assurance will depend on the method used and how it is implemented.
- Bill S-209 establishes stricter requirements in this regard, but some continue to urge caution, arguing that the use of a given method must be proportionate to the risk being addressed.
- We also believe that there is a critical role for risk-based assessments when determining whether or how to deploy age assurance, and are currently developing guidance on this subject.
Background
- In recent years, several countries have considered or introduced legislation intended to prevent children and youth from viewing sexually explicit material online, often by way of mandatory age-verification checks.
- Advocates of such proposals have argued that state action is required to protect children against the harmful effects of pornography for the same reasons that access restrictions are imposed in the offline world.
- Critics maintain that online age verification is fundamentally different and more burdensome than laws requiring adults to show their IDs in physical places; that adults must bear the burden – and the attendant privacy risks – of submitting personal information in order to access lawful content that they have a constitutionally protected right to see; that non-pornographic content may be restricted incidentally; and that methods of estimating or verifying age can be imprecise and readily bypassed.
- Bill S-209’s predecessors (S-210 and S-203) were the subject of similar debates. Certain faith-based organizations, women’s rights groups, and parents’ associations supported the previous iterations, while many academics and open-Internet advocates were highly critical. Other civil society groups varied, depending on their focus.
- Bill S-209 has generated many of the same arguments. Its proposed remedy of allowing the Federal Court to block websites (which the bill expressly recognizes may result in over-blocking) remains especially controversial. However, some critics have acknowledged that the bill’s narrower scope and stricter requirements for prescribed age-assurance methods are distinct improvements over the previous versions.
LEAD: PRPA