Consolidated Issue Sheets on the Wind Up of TikTok Technology Canada, Inc.

Appearance before the Standing Committee on Access to Information, Privacy and Ethics (ETHI)


TikTok Investigation - Update

Speaking Points

  • In February 2023, my Office, along with my counterparts in Québec, British Columbia, and Alberta, launched a joint investigation into TikTok Pte Ltd.
  • We are examining whether TikTok’s practices comply with Canadian privacy legislation and in particular, whether valid and meaningful consent is being obtained for the collection, use, and disclosure of personal information.
  • This investigation remains a high priority for my Office, especially given the importance of protecting the fundamental right to privacy of young people, who represent a notable proportion of TikTok users.
  • My Office took note of the Government of Canada’s decision under the Investment Canada Act ordering TikTok Technology Canada Inc. to wind up its Canadian operations. We were not involved in this decision, and can confirm that it does not affect this investigation.
  • We intend to release the results of the investigation in the coming months. Until then, I am unable to provide specifics on the investigation.
  • While I note concerns regarding potential Chinese government access to TikTok users’ data, this was generally outside the scope of our investigation, which focused on consent and the appropriateness of TikTok’s commercial practices vis-à-vis children’s personal information.

Background

  • The business entity responsible for Canadians’ personal information and TikTok’s privacy practices is TikTok Pte Ltd., a Singapore-based company, and it is this entity that is under investigation. This has not been stated explicitly in public, as we normally refer simply to investigating "TikTok".
  • The investigation was initiated in the wake of now-settled class-action lawsuits in the United States and Canada, as well as numerous media reports related to TikTok’s collection, use and disclosure of personal information.
  • TikTok’s own policies prohibit youth under 13 from using the platform, although many nonetheless make use of it.

OPC Investigation Process

Speaking Points

  • My Office can launch an investigation upon receiving a complaint or if I am of the opinion that a specific privacy matter must be investigated. That said, the Privacy Act specifically requires me to investigate each complaint received.
  • When investigating a complaint, my Office’s compliance teams will identify the issues to be examined, determine the best approach, gather evidence and conduct an analysis of the information obtained to determine whether there has been a contravention.
  • At the end of an investigation, my Office will issue a report of findings which summarizes the parties’ positions, and outlines our findings and recommendations.
  • When it is possible or appropriate, we will allow the respondent to resolve our concerns before concluding the investigation.
  • Generally speaking, we aim to complete our investigations over the course of approximately a year. Some factors may delay this timeline, such as undertaking joint investigations with several partners or investigating several respondents as part of a single investigation.
  • Where investigations are ongoing, due to confidentiality obligations, I usually cannot provide details until the investigation is complete.

Background

  • Under PIPEDA, the OPC investigates complaints about privacy practices in private sector organizations engaged in commercial activity.
  • Pursuant to s. 11(1) of PIPEDA, complaints can be initiated by individuals; under s. 11(2), complaints can be Commissioner-initiated when systemic privacy issues are identified.

Other Relevant Complaints / Investigations - information going to third parties outside Canada

Speaking Points

  • In 2020, my Office investigated TD Canada Trust following a complaint related to the outsourcing of fraud claims processing services to a third party in India. Ultimately, we found that TD was open about this transfer for processing and that it had ensured adequate protection for the personal information of its clients.
  • Following the receipt of a complaint in 2019, we also investigated the transfer practices of Loblaws; the allegations pertained to a gift card offering program and concerns that the information collected was being processed by a program administrator in the U.S. We found the complaint to be not well-founded, in part because the information collected was limited to only what was necessary and the contractual requirements were sufficient to ensure adequate protection.
  • In 2022, we investigated an issue tied to location tracking by the Tim Hortons app. With assistance from a U.S. third-party service provider, Tim Hortons could track and collect the location of users’ devices. In our well-founded findings, we noted concerns with respect to the contractual protections implemented to protect the personal information of Canadians processed by the third party. Following our recommendations, Tim Hortons and the third-party processor agreed to delete all location data, and implemented a privacy management program with respect to the app.
  • In 2019, we completed our investigation of Equifax Inc., a U.S. entity that had been the victim of a breach affecting millions of individuals, including a number of Canadians. Our investigation found that the information about Canadians had actually been collected by Equifax Canada Co. That said, we looked into the adequacy of both organizations’ safeguards, whether Equifax Canada had adequate accountability for Canadian data processed by Equifax Inc., and whether valid consent for this processing had been obtained from individuals. In the end, we found both organizations to be in contravention of the Act.

Resources and Length of investigations

Speaking Points

  • Generally speaking, we aim to complete our investigations over the course of approximately a year. Some factors may delay this timeline, such as undertaking joint investigations with several partners or investigating several respondents as part of a single investigation.
  • For the 405 complaints closed under PIPEDA in 2023–2024, the average treatment time was approximately 7 months:
    • 90% (363) of these complaints were resolved in an average of 6.6 months using a mediated approach (early resolution).
    • 5% (20) were concluded in an average of 11.9 months with a simplified report (summary investigations).
    • 5% (22) were completed in over 12 months (average of 16 months) as they required a more in-depth investigation.
  • To maximize our impact and keep up with the increasing volume and complexity of complaints and breaches, the OPC sometimes collaborates with provincial or international counterparts.
  • Over the past 5 years, the OPC has launched 10 joint or coordinated investigations under PIPEDA with other data protection authorities. Of these, 9 were with provincial counterparts.

Background

  • Of the joint or coordinated investigations completed under PIPEDA in the past 5 years, the average time was 17 months. These investigations were Clearview, Tim Hortons, Corefour and Biron. The remaining investigations, such as TikTok and OpenAI, are still active, and some are not known publicly.
  • A notable number of resources across the OPC are generally dedicated to the more complex investigations. With respect to the TikTok investigation, approximately 10 OPC resources at the working level are currently involved.

Extra-territorial Enforcement

Speaking Points

  • PIPEDA applies to foreign-based organizations with a real and substantial connection to Canada. Enforcement can include court orders under ss. 14-17 of PIPEDA, which can follow the issuance of OPC reports of findings.
  • However, enforcing court orders outside of Canada is ultimately subject to the authority of foreign governments and courts.

Background

  • Canadian courts will issue orders under PIPEDA when there is a “real and substantial connection to Canada” (A.T. v. Globe24h.com, 2017 FC 114 at paras 50-64; Lawson v. Accusearch Inc. (F.C.), 2007 FC 125 at paras 38-51; Facebook, Inc. v. Canada (Privacy Commissioner), 2023 FC 534 at para 84).
  • Canadian courts have also been willing to enforce orders from courts in other jurisdictions. The Supreme Court of Canada noted that in the age of increasing cross-border transactions “comity requires an increasing willingness on the part of courts to recognize the acts of other states.” The Court noted that this was “essential to allow individuals and companies to conduct international business without worrying that their participation … will jeopardize or negate their legal rights.” (Chevron Corp v. Yaiguaje, 2015 SCC 42 (CanLII) at para. 75).
  • While international comity is an accepted principle in Canadian jurisprudence, its application in other jurisdictions has been uneven, and will depend on the laws and context in the other jurisdiction.
  • In A.T. v. Globe24h.com, 2017 FC 114 (CanLII) the Federal Court issued an order under PIPEDA that applied to a Romanian-based entity. The Court noted that “[g]iven the involvement of the Romanian counterpart to the OPC, this Court’s findings would complement rather than offend any action that may be taken in a Romanian court.” (para. 60).
  • However, not all Canadian orders directed at parties in foreign jurisdictions have been well-received. In Google Inc. v. Equustek Solutions Inc., 2017 SCC 34, the Supreme Court of Canada ordered that Google de-index websites that were in violation of intellectual property-related court orders (para. 48). Following the decision, however, Google successfully obtained an injunction in an American court against the enforcement of the Canadian order (Google LLC v. Equustek Solutions Inc. et al., Case No. 5:17-CV-04207-EJD (N.D. Cal. Dec. 14, 2017)).

National security reviews under the Investment Canada Act

Speaking Points

  • It is my understanding that the wind up of TikTok’s Canadian business was ordered on national security grounds under the ICA.
  • Although national security reviews under the ICA do not require the government to consult with my Office, and I was not consulted on this decision, I am aware that, under the guidelines issued under the ICA, the government may take into account various factors, including the potential of an investment to enable access to sensitive personal data that could be leveraged to harm national security.
  • I understand and support the government’s role in undertaking these types of reviews to protect Canadians. I am concerned whenever Canadians’ personal information may be collected, used or disclosed in unexpected ways, or be subject to heightened privacy risks as a result of being transferred beyond our borders.

Background

  • s. 25.3(6) of the ICA - if the Minister of Industry, after consulting the Minister of Public Safety, “is satisfied that an investment would be injurious to national security,” they shall refer the investment under review to the Governor in Council.
  • s. 25.4(1) of the ICA - on the referral of an investment by the Minister, the Governor in Council may “take any measures in respect of the investment that he or she considers advisable to protect national security”.
  • The Guidelines on the National Security Review of Investments (Mar. 2021), issued under s. 38 of the ICA, provide, at s. 8(xi), that sensitive personal data includes, but is not limited to: personally identifiable health or genetic information; biometric information; financial information; private communications; geolocation; and personal data concerning government officials.
  • ISED’s Policy Statement on Foreign Investment Review in the Interactive Digital Media Sector (Mar. 2024) states that foreign investments by entities owned or influenced by foreign states in the “interactive digital media” sector will be subject to enhanced scrutiny under the ICA.
    • “Interactive digital media” includes environments in which users can actively participate, or that facilitate collaborative participation among multiple users, for the purposes of entertainment, information or education, and is commonly delivered via the Internet or mobile networks.

Social Media and Foreign Interference

Speaking Points

  • The ongoing Public Inquiry into Foreign Interference in Federal Electoral Processes and Democratic Institutions is mandated to examine Canada’s experience with foreign interference, including the role that social media plays with respect to foreign interference.
  • I am aware that the Foreign Interference Commission has released an overview report on social media, which found that social media creates both benefits and vulnerabilities for democracy and healthy political discourse, and that potential vulnerabilities include foreign interference.
  • Although our own ongoing investigation into TikTok limits my ability to comment, I can highlight, from our past work, that besides foreign states, political parties, corporations and a wide range of other organizations closely monitor and record significant aspects of public opinion on political matters, including through social media.
  • Furthermore, privacy research by academics and other regulators has noted that, without meaningful privacy rights, digital surveillance and big data can be used to effectively erode the secrecy of voters’ ballots before those ballots are even cast.

Background

  • The Foreign Interference Commission’s overview report on social media contains the following observations:
    • Social media can have a strong positive effect on democracy by increasing political literacy and engagement;
    • Social media can be exploited to undermine democracy through foreign interference:
      • Widespread use of social media provides a path to saturate the population with messaging;
      • Algorithms driven by users’ data can amplify echo chambers;
      • Bots can seek to manipulate algorithms and individuals, and scrape data from social media platforms; and
      • Trolls can seek to shape the content and tone of online discourse.

Privacy Concerns with Social Media

Speaking Points

  • Social media prompts individuals to share personal information online. Users can be careful about the information they share and adjust their privacy settings, but social networking sites must also meet their legal obligations under privacy laws.
  • Information shared on social media may be accessible to many parties, who may collect, use or scrape it for a myriad of purposes including profiling, tracking, and targeted advertising.
  • Individuals should be able to use social media while maintaining a reasonable expectation of privacy.
  • While considering the issue, it is important to remember that young people are impacted by social media differently than adults, are at greater risk of being affected by privacy-related issues, and therefore require special protections.

Background

  • In October 2023, our Office and FPT counterparts published a resolution on putting the best interests of young people at the forefront, which in part sets guidelines for organizations to ensure young people are not tracked or profiled without justification, knowledge, or consent.
  • In August 2023, our Office, along with members of the GPA’s International Enforcement Cooperation Working Group (IEWG), published a joint statement on data scraping, which outlines the privacy risks, sets out how social media companies can safeguard personal information, and suggests ways users can protect themselves.
  • In October 2024, following engagement with social media companies and other industry stakeholders, the IEWG published a concluding statement on data scraping that reinforces requirements set out in the statement, shares best practices and lessons learned, and sets out further expectations.
  • PIPEDA permits organizations to collect, use or disclose personal information without knowledge or consent if it is publicly available and specified by the regulations. Personal information that is posted to social media sites is not specified by the current regulations.

Transborder data flows (security of PI when leaving the country)

Speaking Points

  • PIPEDA does not prohibit organizations in Canada from transferring personal information to an organization in another jurisdiction. Nor does it differentiate between domestic and cross-border transfers of personal information for processing.
  • However, organizations must be transparent about their data practices. Customers should be advised that their information may be sent to another jurisdiction for processing and may be accessed by their courts, law enforcement and national security authorities.
  • Organizations also remain responsible for personal information that has been transferred to a third party for processing and must ensure that a “comparable level of protection” is provided.
  • My Office has advocated for Canadian private sector privacy law to include provisions that explicitly and separately address trans-border data flows, which would align with modern privacy laws such as those of Australia and New Zealand, and with the GDPR. Rules that are specific to international flows of data would offer clear and accessible protections that address the inherent risks associated with cross-border transfers.

Background

  • Principle 4.1.3. of PIPEDA requires organizations to use contractual or other means to provide a “comparable level of protection” when transferring information to a third party for processing. This means that the third party must provide a level of protection comparable to the level the personal information would receive if it had not been transferred. The protections do not have to be the same, but they should be generally equivalent.
  • Other jurisdictions provide for specific transfer mechanisms, such as adequacy rulings, standard contractual clauses, codes of conduct or other schemes such as binding corporate rules.
  • The OPC’s May 2021 C-11 submission recommended that a separate scheme for trans-border data flows be implemented and address considerations relating to, among other things: 1) to whom the obligations apply; 2) accountability; 3) conditions to be met; and 4) protections in the destination State.

Government Access to Data

Speaking Points

  • PIPEDA does not prohibit organizations in Canada from transferring personal information to an organization in another jurisdiction. However, organizations must be transparent about their data practices, and customers should be advised when their information may be sent to another jurisdiction where it could be accessed by that country’s courts, law enforcement and national security authorities.
  • PIPEDA recognizes the need for law enforcement to access personal information in certain circumstances and contains exceptions to consent for the disclosure of personal information in certain law enforcement and national security contexts where the government institution has lawful authority to request the information.
  • Government access should not go unfettered and should respect fundamental tenets such as independent judicial oversight, necessity, proportionality, and transparency.

Background

  • PIPEDA permits the disclosure of personal information without knowledge or consent if the disclosure is made to a government institution, or part thereof, that has made a request for the information, identified its lawful authority to obtain the information and indicated that: it suspects the information relates to national security, the defence of Canada, or the conduct of international affairs (s. 7(3)(c.1)(i)); the request is for the enforcement of, or carrying out an investigation relating to the enforcement of, any law of Canada, province or a foreign jurisdiction (s. 7(3)(c.1)(ii)); or is requesting the information for administering any law of Canada or a province (s. 7(3)(c.1)(iii)).
  • An organization can make a disclosure on its own initiative under s. 7(3)(d) of PIPEDA if it has reasonable grounds to believe that the information relates to a contravention of Canadian, provincial or foreign laws, or if it suspects that the information relates to national security.
  • The OPC co-authored the Global Privacy Assembly’s (GPA) resolution on Government Access to Data, Privacy and the Rule of Law in October 2021, and the GPA’s resolution on Transparency Reporting in October 2015.

Protecting personal information when a company is sold

Speaking Points

  • PIPEDA applies to foreign-based organizations with a real and substantial connection to Canada.
  • A real and substantial connection to Canada does not require a physical presence in Canada and can be established based on a number of factors. In the online context for example, these could include the location of an application’s target audience, the source of its content, or the location of a website operator or host server.
  • PIPEDA therefore continues to apply to an organization that is sold to a foreign entity where there is a real and substantial connection to Canada.

Background

  • Before asserting jurisdiction over a matter, the OPC looks for indicators establishing a “real and substantial connection to Canada”, which includes a non-exhaustive list of elements developed by jurisprudence over time.
  • Relevant court decisions on “real and substantial connection to Canada” applicable to PIPEDA include: (1) A.T. v. Globe24h.com, 2017 FC 114 at paras 50-64; (2) Lawson v. Accusearch Inc. (F.C.), 2007 FC 125 at paras 38-51; and (3) Facebook, Inc. v. Canada (Privacy Commissioner), 2023 FC 534 at para 84. OPC has applied this test in its own investigations, including into Globe24h (PIPEDA 2015-002), and Clearview AI (PIPEDA 2021-001).
  • The Federal Court set out a non-exhaustive list of factors to establish a “real and substantial connection to Canada” in A.T. v. Globe24h.com: (1) the location of the target audience of the website, (2) the source of the content on the website, (3) the location of the website operator, and (4) the location of the host server.
  • Physical presence of an organization in Canada is not required for a “real and substantial connection” as per the Federal Court in A.T. v. Globe24h.com.
  • OPC’s investigative powers under PIPEDA (s. 12.1) may be operationally difficult to exercise or altogether unavailable outside of Canada. OPC may therefore need to rely on a range of approaches including voluntary compliance as well as provincial and international collaboration. PIPEDA provides for the establishment of written information-sharing agreements with provincial and foreign data protection authorities (ss. 23 and 23.1 of PIPEDA).

PIPEDA’s applicability to personal information stored abroad

Speaking Points

  • PIPEDA can still apply to personal information stored on servers located outside of Canada in some circumstances.
  • PIPEDA applies to transborder data flows of personal information entering/leaving Canada or between provinces.
  • A real and substantial connection to Canada is required vis-à-vis an organization’s collection, use, or disclosure of personal information for PIPEDA to apply.

Background

  • When there is an ongoing investigation, PIPEDA’s confidentiality provisions apply, and we may not be in a position to publicly reveal specific jurisdictional aspects of that matter for the duration of that investigation.
  • Before asserting jurisdiction over a matter, the OPC looks for indicators establishing a “real and substantial connection to Canada”, which include a non-exhaustive list of elements developed by jurisprudence over time.
  • Relevant court decisions on “real and substantial connection to Canada” applicable to PIPEDA include: (1) A.T. v. Globe24h.com, 2017 FC 114 at paras 50-64; (2) Lawson v. Accusearch Inc. (F.C.), 2007 FC 125 at paras 38-51; and, (3) Facebook, Inc. v. Canada (Privacy Commissioner), 2023 FC 534 at para 84. OPC has applied this test in its own investigations, including into Globe24h (PIPEDA 2015-002), and Clearview AI (PIPEDA 2021-001).
  • The Federal Court set out a non-exhaustive list of factors to establish a “real and substantial connection to Canada” in A.T. v. Globe24h.com: (1) the location of the target audience of the website; (2) the source of the content on the website; (3) the location of the website operator; and (4) the location of the host server.
  • Physical presence of an organization in Canada is not required for a “real and substantial connection” as per the Federal Court in A.T. v. Globe24h.com.
  • Personal information collected, used (including storage), or disclosed within the borders of provinces with privacy legislation deemed substantially similar to Part 1 of PIPEDA (i.e., BC, Alberta, and Quebec) is subject to provincial privacy laws, whereas PIPEDA would apply in the rest of Canada. PIPEDA would also apply where there are transborder data flows between provinces or to/from another country.

PIA Engagement with ISED on the Investment Canada Act

Speaking Points

  • Innovation, Science and Economic Development Canada (ISED) submitted a Privacy Impact Assessment (PIA) regarding Investment Canada Act reviews to my office in June 2023.
  • The PIA examined how ISED manages the personal information of foreign investors who wish to acquire control of an existing Canadian business and/or establish a new Canadian business.
  • The PIA notes that ISED shares applicants’ information with Public Safety, which in turn shares that information with investigative bodies to conduct national security reviews.
  • However, the PIA does not describe what steps are taken to assess if there are national security or privacy concerns regarding the business or investor.
  • My office’s recommendations to ISED included ensuring appropriate Information Sharing Agreements with partner institutions who conduct national security reviews, privacy notices that are compliant with TBS requirements, appropriate retention periods for personal information, and procedures in the event of a privacy breach.

Background

  • Investment Canada Act reviews have been operational since 1985. The PIA was conducted retroactively, following substantial changes implemented in 2017 to the collection of personal information, such as residential address and date of birth.
  • ISED’s response to our letter of recommendation noted that some of the recommendations would be reflected in an updated PIA; however, to date we have not received it.
  • Public Safety has not submitted a PIA concerning how it conducts its review of investors for national security.

OPC advice relating to the Investment Canada Act

Speaking Points

  • Although my Office has not been involved in any specific reviews under the ICA, we have provided general advice to government departments regarding privacy and ICA reviews, as well as privacy risks associated with foreign entities’ acquisition or control of federally regulated entities.
  • We have recommended, for example:
    • that organizations be required to proactively identify potential privacy considerations as part of the application process;
    • that the government consider constraining organizations’ transfers of data outside Canada for high-risk investments;
    • that organizations be required to conduct Privacy Impact Assessments for high-risk investments; and
    • that organizations be required to provide regular updates regarding changes to their domestic legislation that may meaningfully impact their handling of personal information.

Background

  • In March 2019, our Office met with Finance Canada and OSFI, at their request, to discuss privacy issues relating to changes in acquisition and/or control of federally regulated entities. Following the meeting, we wrote to OSFI identifying privacy-related factors it could consider during its analyses, and conditions it could impose on organizations to protect privacy.
  • On May 30, 2019, a Member of Parliament wrote to our Office expressing concerns regarding the privacy rights of customers and employees in connection with the foreign acquisition of two companies. Our Office responded on August 26, 2019, indicating that we had contacted ISED to discuss privacy issues relating to the ICA.
  • On August 26, 2019, our Office wrote to ISED offering advice regarding how to incorporate privacy considerations in ICA reviews.
  • In October 2020, after the RCMP reached out to our Office seeking our advice on privacy matters, we provided information on PIPEDA (application of the law, sensitive information, consent) as well as on the interplay of privacy and the ICA.

TikTok ban on GoC devices

Speaking Points

  • Five days after the announcement of the OPC’s investigation into TikTok on February 23, 2023, the Government of Canada banned the use of TikTok on all GoC devices, citing privacy and security concerns.
  • We were not specifically consulted on the ban of TikTok on government devices; it was a decision made by Canada’s Chief Information Officer, who determined that the application poses an unacceptable level of risk to privacy and security.
  • Under the policy, TikTok must be removed from government devices and the application blocked so that it cannot be installed going forward.

Background

  • Our Office falls under the Government of Canada’s Policy on Service and Digital and is thus subject to the ban. Consequently, we have blocked the TikTok application on our devices as required.
  • The app has been banned for use on government devices by other jurisdictions, including all Canadian provinces, Britain, France, Australia, the United States federal government, and many U.S. states.
  • The TBS Policy on Privacy Protection requires institutions to notify our office of any planned initiatives that may have an impact on the privacy of Canadians. We consult regularly with TBS on issues related to privacy policies but ultimately, as with all initiatives, policies and programs, the institution makes the final decision on what course of action to take.
  • TBS and the Office of the Chief Information Officer consult regularly with our office on overall implications of digital services; we have monthly working level meetings and I also meet with CIO Rochon regularly. This specific matter was not discussed as part of these meetings.

Age-assurance consultation

Speaking Points

  • Age assurance can be an effective tool in an overall strategy to promote online safety for young people. In addition to restricting access to harmful content, age assurance could be used to direct young people to a version of a service that uses data practices tailored to youth and children.
  • However, age assurance can also have significant privacy implications for all internet users.
  • In June 2024, in support of my Office’s strategic priorities of advocating for privacy in a time of technological change and championing children’s privacy rights, the OPC launched an exploratory consultation on the topic of age assurance.
  • Our consultation will increase our understanding both of the challenges posed, and opportunities created, by this technology.

Background

  • Potential privacy impacts of age assurance include tracking of Internet usage and disclosure of identity information (e.g., via a breach).
  • Our consultation closed on September 10, 2024. We received a total of 40 submissions from civil society and academia, age-assurance providers, online services, Canadian industry associations, and interested individuals.
  • Anticipated next steps will include, at a minimum, publishing a summary of what we heard from respondents and developing guidance on the use and design of age-assurance systems.

ETHI report on Social Media monitoring

Speaking Points

  • I read with great interest the Committee’s report on the Oversight of Social Media Platforms which was tabled on December 5.
  • I agree with your conclusion that the business model and practices of social media platforms pose risks to Canadians’ health, safety and national security.
  • I support your recommendations to modernize federal privacy legislation to require data minimization, regulate data transfers, protect the privacy of children, and give my Office order-making powers.

Background

  • You appeared before ETHI on its social media study on October 25, 2024.
  • Recommendation 1: That the GoC reevaluate its digital standards regarding the download and use of all social media apps on government-issued devices in order to ensure they are used primarily for government business.
  • That the GoC amend PIPEDA to:
    • impose additional data minimization obligations, including a ban on engaging in certain forms of data collection. (Recommendation 2)
    • give the Privacy Commissioner power to make binding orders and impose significant administrative monetary penalties. (Recommendation 3)
    • add explicit rules regarding transfers of Canadians’ data outside Canada to ensure equivalent levels of protection for such data. (Recommendation 4)
    • require organizations to provide consent mechanisms appropriate for minors and include an explicit right to the deletion or deindexing of the personal information of minors. (Recommendation 5)
  • Recommendation 6: That the GoC adopt an EU-style code of practice on disinformation, compel social media platforms to report regularly on their trust and safety activities, and provide Canadian researchers with access to their data.
  • Recommendation 7: That the GoC increase funding to the RCMP so additional resources can be allocated to providing education and to fighting cybercrime.
  • Recommendation 8: That the GoC invest more in digital literacy to better equip Canadians to protect their personal information online, recognize disinformation and misinformation, and identify harmful content online.

Online Harms Act (C-63)

Speaking Points

  • The government tabled Bill C-63, the Online Harms Act, with the stated goals of holding social media platforms accountable for addressing harmful content on their platforms and of creating a safer online space that protects all people in Canada, especially kids.
  • I have made championing children’s privacy rights a strategic priority of my Office, as children need to be able to navigate online spaces securely. This priority relates to areas of C-63, such as the duty to protect children and certain design features for the protection of children that may be required to be integrated into services, such as age-appropriate design.
  • C-63 also addresses intimate images communicated without consent, which is of interest to my Office given my findings in the Aylo investigation.
  • I am happy to further discuss the privacy implications of Bill C-63 should I be called to comment on the bill in Parliament.

Background

  • Bill C-63 legislates a duty to protect children. As part of this duty, s. 65 states that “an operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age-appropriate design, that are provided for by regulations.” S. 140(o) outlines that regulations respecting design features may include privacy settings for children.
  • Bill C-63 also establishes a duty to make certain content inaccessible. Regulated services must take down content (whether flagged by the service itself or by a user) that they have reasonable grounds to suspect sexually victimizes a child, revictimizes a survivor, or is intimate content communicated without consent, within 24 hours of identifying it (s. 67). The content must remain offline until the service has made a decision on whether it should remain inaccessible.
  • The Standing Committee on Justice and Human Rights is undertaking a pre-study of C-63. On December 4, Justice Minister Arif Virani confirmed that the bill would be split, in an effort to ensure passage of the less divisive part of the original bill.

Technology Analysis Capabilities Related to Applications

Speaking Points

  • Our Technology Analysis Division’s Laboratory possesses specialized capabilities to assess various technologies for cybersecurity and privacy-related issues.
  • The Lab uses the following combination of methods to specifically analyze personal computer or mobile device applications:
    • They most commonly reverse engineer applications through a combination of specialized skills and purpose-built software.
    • They also leverage external research findings from other Data Protection Authorities, academia and industry.
  • The Lab focuses its work on questions and requests from this committee and other parliamentarians, and on investigations initiated by the OPC following privacy complaints under the Privacy Act and PIPEDA.
  • Currently, the Lab does not have adequate resources to keep up with the growing demand for complex and highly technical assessments beyond this focus.

Background

  • There are currently over 2M iOS apps and over 3.4M Android apps, and these numbers continue to grow.
  • Worldwide, across all platforms, the most downloaded apps are:
    1 - Instagram (182.2M); 2 - Facebook (146M); 3 - TikTok (117.7M); 4 - WhatsApp (113.9M); 5 - Telegram (85.1M); 6 - Temu (79.6M)
  • TikTok was the most downloaded iOS app worldwide in June 2024, with 15 million downloads.
  • In Canada, TikTok ranks at #4, behind Temu, ChatGPT and Threads, with approximately 14.9M users in 2024.
  • December 2022: TikTok admitted to spying on reporters.
  • June 2022: Chinese TikTok employees were found to be accessing American user data, contrary to public statements.

International limits/bans of TikTok

Speaking Points

  • Canada is not the first country to consider restricting the use of TikTok as an application - some jurisdictions have banned it outright.
  • Since 2018, when TikTok became globally available for download, some countries, such as India, Iran and Somalia, have implemented a complete ban of the application, and several others, such as Australia, Belgium and Denmark, have imposed specific use limitations.
  • For this second group, governments have generally used administrative controls to block installation of the application on public sector devices or on those provided to legislators and their staff.

Background

  • Complete prohibitions: The list of states undertaking a complete ban includes Afghanistan, India, Iran, Kyrgyzstan, Nepal, and Somalia. For countries wishing to ban use of the application completely, the most efficient technique appears to be requiring its removal from major app stores such as Google Play and the Apple App Store. Other states, such as India, have ordered internet service providers to block the application at the national network level (using domain name system, or DNS, controls). India was the first state to undertake this measure, in June 2020.
  • Use limitation measures: The list of places imposing use restrictions includes Australia, Belgium, Denmark, France, Latvia, Netherlands, New Zealand, Norway, Taiwan, the UK and the US, along with the European Parliament, European Commission, and European Council. Where states have instead opted to impose specific use limitations (e.g. prohibiting download and installation on government-issued work devices), this has typically been accomplished at the level of IT administration (such as blocking the application from being installed on managed devices or using firewalls to prevent connections to the service). Canada undertook this measure in February 2023.
  • Temporary measures: At one time or another, several countries such as Pakistan and Indonesia have implemented full bans specifically citing harmful or immoral content via the application. However, these were time-limited bans and have since been lifted.

Australia: ban of social media for children

Speaking Points

  • We are aware of the recently enacted Australian ban on social media use by children under 16, as well as similar proposals in other jurisdictions.
  • I would defer to the will of Parliament to determine whether such a ban is appropriate in Canada.
  • However, I believe that alternative mechanisms available to Parliament – including strengthening privacy law and mandating an age-appropriate design code – could also meet the intended ends of creating safer online experiences for, and meeting the needs of, youth.
  • I would also emphasize that, should such a ban be adopted, it will be important to ensure that any requirements about how to determine the age of users be designed in a privacy-protective manner. My Office would stand ready to provide advice on how this might be done.

Background

  • The Australian Online Safety Amendment (Social Media Minimum Age) Bill 2024 was introduced on November 21, 2024 and passed both houses on November 29, 2024. It will come into force no later than 12 months after this passage.
  • The primary requirement of the bill is that social media companies “take reasonable steps to prevent [Australians under 16] having accounts with the … platform.”
  • Australia’s bill is the first national-level ban on children having social media accounts. Florida has enacted a similar ban (prohibition on children under 14 holding accounts; accounts require parental consent for children above 14 but under 16), but it is being challenged on free speech issues. Other jurisdictions, including France and multiple US states, require or propose to require parental consent for youth under a defined age (which varies between jurisdictions) to create social media accounts.
  • Australia has an Age Assurance Technology Trial underway which aims to determine the effectiveness of available age assurance technologies.
  • Australia’s Privacy Commissioner (Carly Kind) has expressed concern with this ban, arguing that making social media a safer place for youth (including through improved privacy laws) would be the preferable option.

Undertakings to protect children in other jurisdictions

Speaking Points

  • Jurisdictions around the world are taking action to ensure the digital environment protects children from harm while maximizing benefits.
  • The UK Online Safety Act (2023) requires online platforms to assess whether children are likely to access a service and requires them to conduct a risk assessment on children’s rights, among other duties.
  • The UK also has a Children’s Code containing 15 standards that online services such as social media platforms need to follow.
  • Similarly, the EU’s Digital Services Act (2022) requires that online platforms accessible to minors ensure a high level of privacy, safety, and security. It also requires online platforms to identify and assess potential online risks for children and young people and prevents them from advertising to minors based on profiling.
  • Australia has also just passed a law that would ban children under 16 from social media. Platforms will need to take reasonable steps to prevent children under 16 from having accounts.

Background

  • Bill C-27 provides that personal information of a minor is considered sensitive information which impacts requirements regarding consent, privacy management programs, appropriate purposes, retention periods and transparency for retention periods, security safeguards, breach reporting, and de-identification measures, among others. Exceptions to honouring disposal requests are also more limited if the information is in relation to a minor.
  • Bill C-63 would impose three broad duties on social media platforms: 1) a duty to act responsibly; 2) a duty to protect children by integrating design features respecting the protection of children; and 3) a duty to make non-consensually distributed intimate images and child sex abuse material inaccessible within 24 hours. At the time of drafting, C-63 has been referred to JUST for pre-study.
  • The 2023 FPT resolution on the best interests of the child set out practices that organizations are expected to adopt to protect children, including building in privacy by design, rejecting deceptive design, being transparent, setting privacy-protective services by default, and turning off tracking and profiling, among others.

TikTok – Findings from other DPA investigations

Speaking Points

  • In 2023, the UK and Irish DPAs found that TikTok breached the UK and EU GDPRs, including violations relating to children’s privacy and lack of transparency with users. France’s DPA also found that TikTok violated the French Data Protection Act’s requirements on cookie consent.
  • In 2019, the FTC announced a $5.7M US settlement with TikTok (then Musical.ly) for alleged violations of the US Children’s Online Privacy Protection Act (COPPA).
  • In 2020, South Korea’s telecommunications regulator found that TikTok had violated laws regarding children’s privacy and overseas data transfer, and issued a ₩186M fine (~ Can $211,296 in 2020 $).
  • In November 2024, Brazil’s National Data Protection Authority issued compliance orders regarding TikTok’s age verification mechanisms and initiated a sanctioning process, in its on-going investigation of TikTok’s alleged irregularities in processing the data of children and adolescents.

Background

  • In April 2023, the UK ICO found that TikTok: (1) handled data of kids too young to consent; (2) failed to provide users with easy-to-understand information; and (3) did not process user data lawfully, fairly, or transparently. A fine of £12.7M was issued.
  • In September 2023, Ireland’s DPC found that TikTok: (1) made child profiles public by default; (2) allowed activation of the “Family Pairing” account feature by (unverified) non-parents/guardians; (3) lacked transparency with child users; and (4) employed “deceptive patterns” nudging towards privacy-intrusive options. It issued a fine of €345M.
  • In January 2023, France’s CNIL found that TikTok’s cookie consent flows made it easier for users to accept than opt-out of tracking. A fine of €5M was issued.
  • In April 2024, the European Commission initiated proceedings against TikTok for alleged breaches of the EU Digital Services Act. In August 2024, the US DOJ and FTC filed a civil suit against TikTok for alleged post-settlement COPPA violations.
  • In October 2024, South Korea’s DPA (Personal Information Protection Commission) and telecommunications regulator (Korea Communications Commission) launched further investigations into TikTok’s consent and data handling practices.

TikTok Investigation – Privacy-Related Class Actions

Speaking Points

  • In February 2023, the OPC’s announcement of the TikTok investigation noted that it “was initiated in the wake of now settled, class action lawsuits in the United States and Canada”. This referred to two cases.
    • The US class action alleged that TikTok surreptitiously harvested and profited from collecting users’ personal information, including their biometric data.
    • The Canadian class action alleged that TikTok breached users’ privacy, including regarding consent for minor users.

Background

  • In the US, several federal lawsuits against TikTok were consolidated into a class action before the US District Court for the Northern District of Illinois in 2020. The lawsuit had a nation-wide class and an Illinois state subclass.
    • The nation-wide class alleged that without users’ consent, TikTok collected personal information (e.g. biometric data), disclosed video-related personal information to Facebook and Google, and violated consumer protection laws.
    • The Illinois subclass’s allegations included that TikTok harvested users’ biometric information without consent and sold it to third parties.
  • A settlement for $92M USD to class members was approved in July 2022.
  • In Canada, two class action lawsuits against TikTok were certified together for settlement purposes by the Supreme Court of British Columbia. The class members were TikTok users in Canada, with a subclass of minor users.
    • The class members’ allegations included that TikTok collected, used, and sold users’ personal information to third parties without their consent. The subclass also alleged that parental consents were not obtained for minors.
  • A settlement for $2M CDN in charitable donations was approved in February 2022.
  • TikTok did not admit to any wrongdoing in either settlement agreement.
  • In September 2024, a class action against TikTok was filed in California on behalf of children’s parents, claiming violations of the US Children’s Online Privacy Protection Act. Another on-going class action, filed in 2021 by Dutch foundations on behalf of child and adult users, is claiming breaches of privacy regulations.

Lawsuit by US DOJ and FTC against TikTok

Speaking Points

  • In August 2024, the US DOJ and FTC filed a civil suit in California against TikTok for alleged violations of the US Children’s Online Privacy Protection Act (COPPA).
  • The lawsuit alleges that since 2019, TikTok has:
    • Failed to notify and obtain parental consent before collecting and using personal information from children under 13;
    • Failed to limit the collection, use, and disclosure of children’s personal information, and to delete it when requested by parents or when no longer needed;
    • Violated the terms of a 2019 court order that required TikTok to comply with COPPA, as part of a settlement with the FTC.

Background

  • In 2019, the FTC announced a $5.7M US settlement with TikTok (then Musical.ly) for alleged violations of COPPA. The settlement included a consent order barring TikTok from violating COPPA and imposed measures to ensure its compliance.
  • Despite the court order, the current lawsuit alleges that TikTok knowingly allowed children to create TikTok accounts, to create, share, and view content, and to interact with other users, including adults.
  • Even where accounts were created in “Kids Mode” (a pared-back version intended for children under 13), the lawsuit claims that TikTok unlawfully collected and retained children’s email addresses and other personal information.
  • Further, TikTok allegedly had deficient and ineffective policies and processes for identifying and deleting accounts created by children, and frequently failed to honour parents’ requests to delete their children’s accounts.
  • Other allegations include that TikTok:
    • Collected more data than needed, such as about children’s activities on the app and persistent identifiers used to build profiles on children.
    • Built back doors into its platform that allowed children to bypass age-gating and parental consents by using credentials from third-party services, such as Google and Instagram.

Lawsuit by Texas AG against TikTok

Speaking Points

  • Texas is suing TikTok for operating its platform in a manner that puts children’s online safety and privacy at risk and for violating the state’s Securing Children Online through Parental Empowerment Act (“SCOPE Act”).
  • Texas law requires social media companies like TikTok to take steps to protect children online and requires them to provide tools for parents to do the same.
  • The SCOPE Act prohibits TikTok from sharing, disclosing, or selling a minor’s personal information without permission from their parent or guardian, and requires that parents be provided with tools to manage and control the privacy and account settings on their child’s account.

Background

  • Action brought on October 3rd, 2024 before the Galveston County District Court.
  • Texas’ Attorney General claims that TikTok has violated the SCOPE Act on the following three counts:
    • (i) Failing to use a commercially reasonable method for a parent/guardian to verify their identity and relationship to a known minor using TikTok
    • (ii) Unlawfully sharing, disclosing, and selling known minors’ personal information collected by TikTok
    • (iii) Failing to create and provide parental tools for the accounts of known minors using TikTok
  • The Texas AG is seeking civil penalties of up to $100,000 USD per violation and injunctive relief to prevent future violations.
  • Various TikTok-affiliated entities (subsidiaries of Beijing-based ByteDance) are named as defendants, including Singapore-based TikTok Pte Ltd, which is responsible for distributing the TikTok app in the Google Play Store and Apple App Store. TikTok Pte Ltd is the same entity being jointly investigated by the OPC and three provincial counterparts.
  • Citation: State of Texas vs. TikTok Ltd., et al. (No. 24-CV-1763)

US law requiring divestiture or ban of TikTok

Speaking Points

  • Congress passed a law potentially banning TikTok from the US as early as January 19th, 2025 (i.e., 270 days after the law was passed); the ban’s start date can be delayed by an additional 90 days by way of a one-time extension by the President (i.e., until April 19th, 2025).
  • The law requires Beijing-based ByteDance (TikTok’s parent company) to sell TikTok to an entity based in a “non-foreign adversary country”.
  • Unless ByteDance can find a qualifying buyer before the deadline, the law would make it illegal for web-hosting services to support TikTok and force providers like Apple and Google to remove it from their app stores—rendering TikTok unusable in the US.

Background

  • On April 24th, 2024, President Biden signed a sweeping foreign aid package into law [H.R.815 - 118th Congress (2023-2024): Making emergency supplemental appropriations for the fiscal year ending September 30, 2024, and for other purposes, H.R.815, 118th Cong. (2024); 15 USC 9901] that also included an amended version of the Protecting Americans from Foreign Adversary Controlled Applications Act (“the Act”), which would practically ban TikTok from the US.
  • There are only four “foreign adversary countries” for the purpose of this Act: (i) the People’s Republic of China; (ii) the Democratic People’s Republic of North Korea; (iii) the Russian Federation; and (iv) the Islamic Republic of Iran [see Division H, in conjunction with 10 U.S.C. § 4872(d)(2)].
  • This law applies to “foreign adversary-controlled applications”, including TikTok (via its parent company, ByteDance), as per § 2(g)(3)(A) & (B) of the Act; both TikTok and ByteDance are specifically mentioned by name.
  • If an entity based somewhere other than in one of the deemed “foreign adversary countries” owned/controlled TikTok, this law would cease to apply to it.
  • On December 6th, 2024, the D.C. Circuit Court of Appeals found this law to be constitutional (i.e., it does not infringe “freedom of expression” under the US constitution’s 1st Amendment) [TikTok Inc. v. Merrick Garland, 24-1113 (D.C. Cir.)].
  • On December 9th, 2024, counsel for TikTok/ByteDance filed an emergency motion with the U.S. Court of Appeals for the D.C. Circuit asking for a temporary injunction (by December 16th) to prevent the Act from taking effect on January 19th, 2025.

Recent Romanian Presidential Election

Speaking Points

  • I am aware of allegations concerning TikTok’s role with respect to the results of the first round of Romania’s presidential election.
  • The concerns raised by researchers, NGOs and Romanian authorities highlight the potential risks created by social media platforms’ use of algorithms driven by user data.
  • As the Foreign Interference Commission has found, bad actors can seek to manipulate such algorithms to amplify certain voices and messages, including through the use of bots.

Background

  • On November 24, 2024, Călin Georgescu finished first in the first round of Romania’s presidential election, with approximately 23% of the vote. Many have labelled Georgescu “far-right”, based on his opposition to aid for Ukraine, his criticisms of NATO, and his praise for the leader of Romania’s fascist, German-aligned government during the Second World War.
  • On the day before the first round of voting, a report published by Expert Forum, a Bucharest-based think tank, claimed that Georgescu had experienced a “burst of visibility” on TikTok in the lead-up to the vote, which seemed “suddenly and artificially created” and corresponded with his “sudde[n] explos[ion]” in the polls.
  • More than 20 Romanian NGOs and Romania’s National Audiovisual Council have requested that the European Commission investigate whether TikTok breached its obligations under the Digital Services Act (DSA).
  • The EC has sent TikTok urgent requests for information and issued a “retention order” under the DSA, ordering TikTok to freeze and preserve all data and evidence linked to the Romanian election, including internal documents and information regarding the design and functioning of TikTok’s recommender systems.
  • On December 4, 2024, Romania’s president declassified intelligence documents suggesting that thousands of TikTok accounts created years ago by Russia were suddenly activated in support of Georgescu in the weeks before the vote.
  • On December 6, 2024, Romania’s Constitutional Court annulled the first-round results of the presidential election on the basis of concerns regarding foreign interference. The government must now establish a date for a re-run.

International Developments in Age Assurance

Speaking Points

  • International developments in age assurance have been occurring on many fronts.
  • Multiple jurisdictions, including in the US, EU, and Australia, have introduced age assurance requirements related to prohibitions or restrictions on youth access to certain types of online services (such as social media or dating websites) or content (such as sexually explicit material).
  • At the same time, the effectiveness of current age assurance techniques is being evaluated through projects such as Australia’s Age Assurance Technology Trial or the US NIST’s evaluation of age estimation systems. Standards are also being developed.
  • Data protection authorities (including my Office) have also been examining and providing guidance on the privacy implications of age assurance, including through principles set out in a Joint Statement on a Common International Approach to Age Assurance.

Background

  • US NIST = United States National Institute of Standards and Technology
  • In-development standards include ISO 27566 and the Canadian Digital Governance Standards Institute’s CAN/DGSI 127.
  • Prohibitions vs. restrictions: For example, Australia has implemented a ban on children under 16 holding social media accounts (prohibition), whereas France requires children under 15 to obtain parental consent for such an account (restriction).
  • 10 signatories to Joint Statement: UK; Gibraltar; Philippines; Canada; Argentina; Mexico; Guernsey; Bermuda; Jersey; Isle of Man
  • Sample legislation: age assurance:
    • Sexually explicit material: UK Online Safety Act; France Loi Visant à Sécuriser et à Réguler l’Espace Numérique
    • Social media: EU Digital Services Act; Australia Online Safety Amendment (Social Media Minimum Age) Bill
