For a Holistic Vision of the Toolkit
Remarks for the “Innovative Supervising” panel at the First International Congress on Data Protection
June 6, 2013
Santa Marta, Colombia
Address by Chantal Bernier
Assistant Privacy Commissioner of Canada
(Check against delivery)
Introduction
Today, we have been asked to discuss the strategies adopted by data protection authorities (DPAs) to more effectively enforce legislation, and the creative methods we use in our monitoring activities to achieve better results.
I am very proud of what we have accomplished in this area at the Office of the Privacy Commissioner of Canada (OPC) over the last few years. My objective today is to share with you our practical experience. I hope it will also help you do more with less.
I will divide my remarks into three parts:
- first, I will tell you a little bit about the OPC’s mandate;
- then, I will explain the specific viewpoint we have adopted in order to identify the full spectrum of actions at our disposal; and
- finally, I will provide you with an inventory of the tools at our disposal, along with concrete examples.
I. About the OPC
In order to put my remarks into context, I will start by briefly describing the OPC’s mandate.
We are responsible for overseeing compliance with two federal acts: the Privacy Act, which covers the personal information-handling practices of some 250 federal institutions, and the Personal Information Protection and Electronic Documents Act (PIPEDA), which covers the use of personal information in commercial activities that fall under federal jurisdiction.
These two acts were adopted 20 years apart and essentially grant us two powers:
- the power to investigate complaints received from the public or initiated on our own; and
- the power to conduct audits of organizations at our discretion with respect to the public sector, and on the basis of reasonable grounds to believe there has been a violation of the law, with respect to the private sector.
In both cases the Commissioner ultimately issues a report containing specific recommendations for the organization concerned.
A key point: we do not have the authority to impose sanctions, orders or fines.
We only have the authority to:
- publicly name the organization in all cases with respect to the public sector, but only where public interest justifies it with respect to the private sector; and
- use the courts in certain cases.
This limitation of our enforcement powers, together with the variety of risks to and violations of privacy, are the two key factors shaping our approach.
II. Adopting a holistic view of enforcement measures
It would be easy to restrict our responses to these narrow limits, but that would be too simplistic given the variety of circumstances and challenges in the protection of personal information, which are constantly evolving politically, technologically and economically.
Our mandate from the Parliament of Canada is not to conduct investigations and audits. It is to protect the right to privacy in Canada — investigations and audits are just two of the tools we use. The variety of risks requires that we continue to develop tools and that they be adapted to each situation.
By adopting this change in perspective, the range of tools becomes much broader.
III. The Office of the Privacy Commissioner of Canada’s toolkit — with concrete examples
In the third part of my talk, I will describe the range of tools we have developed to fulfill our mandate to protect the right to privacy in Canada.
The range of actions we take is as follows, from the most flexible to the most restrictive:
- First is “upstream” action, which we use in the absence of a specific suspicion of a violation of the law:
- Education of the public and organizations, either through awareness activities or by promoting compliance through partnership;
- A more in-depth review of Privacy Impact Assessments (PIAs).
- Next is “downstream” action, which we use when there is a suspicion of violation:
- Monitoring;
- Structured dialogue;
- Early resolution;
- Investigation;
- Audit.
I will now describe each of the tools we use “downstream” and illustrate them with concrete examples.
(a) Monitoring
When we suspect that either act has been violated (e.g., on the basis of research conducted by a third party, media reports, or an alert from our technologists or another DPA), the first thing we do is very simple: we monitor. We observe what is going on around us and we prepare to take action.
This “flexible” response is sometimes the first step toward more intensive action and can be summarized as the collection of information and analysis of the risk based on the information available at the time.
The following is a very concrete example: In 2011, when Sony revealed that it had been the victim of hacking and that the personal information of millions of users had been stolen, we turned our attention to the issue to determine whether there were grounds for action.
We established a small team that included technologists, lawyers and investigators, and began gathering information: What happened? Were the security measures in place appropriate? Are there reasonable grounds to believe that the law was broken?
The technologists consulted the sources of information available to the public, including Sony’s blog, and contacted their colleagues from other DPAs. They learned that Sony was already conducting an internal investigation, that law enforcement was also conducting an investigation, that there was no indication that the personal information had been posted publicly, and that the credit card numbers obtained by the hackers were not encrypted, but protected by a hashing algorithm.
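For the technologists in the audience, the distinction matters: encryption is reversible with the right key, while hashing is one-way. Here is a minimal illustrative sketch using only Python’s standard library; it is not Sony’s actual scheme, and the card number is a standard test value, not real data.

```python
# Minimal sketch of one-way hashing (illustrative only, not Sony's
# actual scheme). A production system would also add a salt and use
# a deliberately slow hash function.
import hashlib

card_number = "4111111111111111"  # standard test card number, not real data

# Hashing is one-way: the original number cannot be recovered from
# the digest, unlike encryption, which is reversible with the key.
digest = hashlib.sha256(card_number.encode()).hexdigest()

# Verification re-hashes the candidate value and compares digests.
candidate = "4111111111111111"
print(hashlib.sha256(candidate.encode()).hexdigest() == digest)  # prints True
```

Note that a fast, unsalted hash of low-entropy data can still be reversed by brute-force guessing, which is why hashing alone is not a complete safeguard.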
On completion of our assessment of Sony’s risk management (since cybersecurity is an obligation of means and not an obligation of results), we decided not to take action. As we know, our British counterparts did decide to conduct an investigation and concluded that there was a breach of their laws. However, we remain confident that we made the right decision based on the information we had when the hacking was made public.
In the same spirit, we issue guidelines on emerging technology issues. For example, after having read enough articles suggesting a certain laissez-faire attitude on the part of mobile app developers, we issued guidance to help them view privacy as a competitive advantage rather than an obstacle to innovation.
(b) Structured dialogue
The next level on our response scale is structured dialogue.
When we suspect that the personal information management practices of an organization or sector do not comply with our laws, but are hopeful that we can resolve the situation without a formal investigation by the Commissioner, we initiate a dialogue with the organization or organizations involved.
Two examples from our files over the last year are the hacking of LinkedIn and web leakage.
First, I’ll talk about LinkedIn. You will remember that this professional networking site was the victim of hacking in June 2012, when close to 6.5 million hashed passwords were stolen and posted online. As we did in the Sony case, we established a multidisciplinary team to conduct a preliminary analysis based on the information available to the public. This time, by comparing what we knew about the hacking with what we knew about industry standards, we concluded that the breach could point to certain weaknesses in LinkedIn’s computer security measures.
Along with our counterparts in three provinces whose legislation is very similar to PIPEDA, we entered into a structured dialogue with the company, during which we obtained more information (in confidence), and made recommendations which were implemented by LinkedIn.
Ultimately, the privacy of LinkedIn users in Canada and elsewhere in the world is better protected, following a quicker and less costly response than a formal investigation.
Web leakage is another matter in which we resorted to structured dialogue.
Through a study conducted by a U.S. university (Worcester Polytechnic Institute), we learned of a phenomenon in which personal information provided by website users could be forwarded to third-party organizations without the users’ knowledge or consent, meaning that personal information was being leaked to third-party sites.
Because this would constitute a breach under PIPEDA, we decided to see whether the same thing was happening on popular Canadian sites.
Therefore, we took a more or less random sample of a dozen websites of Canadian organizations, mostly in the private sector. Our technologists then reproduced the experiment conducted by the U.S. university.
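The core of such a check can be sketched in a few lines. What follows is a hypothetical reconstruction, not the OPC’s actual methodology or code: it flags any request sent to a host other than the first party whose query string carries a value the user submitted.

```python
# Hypothetical sketch of a web-leakage check (not the OPC's actual
# tooling): flag third-party requests whose query strings carry a
# value the user submitted in a form on the first-party site.
from urllib.parse import urlparse, parse_qs

def find_leaks(first_party_host, submitted_values, outgoing_requests):
    """Return (url, leaked_value) pairs for third-party requests
    that carry a submitted personal value in their query string."""
    leaks = []
    for url in outgoing_requests:
        parsed = urlparse(url)
        if parsed.netloc.endswith(first_party_host):
            continue  # same-site request: not a leak to a third party
        params = parse_qs(parsed.query)  # decodes e.g. %40 back to @
        for value in submitted_values:
            if any(value in item for items in params.values() for item in items):
                leaks.append((url, value))
    return leaks

# Made-up example: an email address forwarded to an advertising host.
observed = [
    "https://example.ca/signup?step=2",
    "https://ads.tracker.example/collect?uid=alice%40mail.com",
]
print(find_leaks("example.ca", ["alice@mail.com"], observed))
```

In practice the hard part is capturing the outgoing requests in the first place (for example with a proxy), and deciding which third-party hosts are legitimate service providers rather than unintended recipients.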
Our technologists established that follow-up was necessary with nine of the organizations in the sample. The Commissioner corresponded individually and confidentially with each organization to explain our methodology and what we had observed on their sites, as well as to ask them to change their practices in order to comply with the legislation.
The Commissioner continued her correspondence with the businesses involved throughout the process to follow up on their progress in implementing the recommendations specific to them. At the same time, OPC analysts communicated directly with those responsible for the websites of the organizations concerned to answer their technical questions and guide them as needed in the updating of their sites.
In the end, the organizations concerned implemented all of our recommendations to the benefit of all Canadians using their websites, once again, more quickly and at a lower cost than a more formal response.
(c) Early resolution
When we receive complaints that we believe could be resolved quickly, we refer the file to an officer on a team dedicated to early resolution, rather than launching a formal investigation. These include complaints on subjects on which the OPC has already issued findings, complaints concerning organizations that, in our opinion, have already responded to allegations in a satisfactory manner and cases where it seems possible to resolve the allegations quickly.
Here is an example from our early resolution files from last year: an individual made an online application to a utility company.
On the application form, the fields for the Social Insurance Number (SIN), driver’s licence number and employer information were required fields. The person expressed his concerns about this to the company, which responded that the information was necessary to authenticate clients.
Not satisfied by this answer, the individual in question submitted a complaint to the OPC, and the complaint was sent to an early resolution officer, who contacted the company to explain the very specific circumstances in which a SIN or driver’s licence number can be requested, as well as the risks that the excessive collection of personal information can entail. The officer also provided the company with guidance documents produced by the OPC on the subject.
The company has now stopped collecting SIN and driver’s licence numbers, and has stopped requiring employer information. These practices have been replaced by the use of security questions.
In the end, the company appreciated the fact that the OPC provided it with the information, and the complainant was satisfied with the measures adopted by the company in response to his concerns.
Since we began using early resolution three years ago, we have been able to more quickly and cost effectively resolve some 590 cases under the Privacy Act and 311 cases under PIPEDA. The percentage of cases that fall under early resolution each year rose from 13% in 2010-2011 to 33% in 2012-2013 for the public sector and from 24% in 2010 to 44% in 2012 for the private sector.
(d) Investigation
Now we come to the first of the two responses explicitly named in our enabling legislation, that is, the authority to conduct investigations.
I will not go into detail about regular investigations because I have a feeling that almost everyone here has conducted or undergone at least one: we receive a complaint from an individual, we assign it to an investigator, the investigator establishes the facts, analyzes them and drafts an investigation report, and the Commissioner issues a finding on whether or not the complaint is well founded.
In the case of well-founded complaints, we have, since 2011, broadened the list of findings we issue.
- In the event that the organization has acted in breach of the law, but has resolved the issue in the course of the investigation, we issue the finding of “well-founded and resolved” rather than “resolved,” to ensure that the violation is recognized.
- In the event that the organization at fault agrees to rectify the situation, but does not have the time to implement all of our recommendations before the investigation is completed, we issue the finding of “well-founded and conditionally resolved.”
This leads me to an example of a complaint for which we issued this type of finding and required follow-up by the organization in question. It concerns our investigation of Google Wi-Fi, the results of which were published in October 2010.
At the Commissioner’s initiative, the investigation was launched after Google announced that its cars — roaming Canadian streets for the mapping service Street View — had inadvertently collected data transmitted on unsecured wireless networks.
In light of the investigation, the Commissioner recommended that Google ensure that its governance model enable it to comply with privacy laws. In addition, the Commissioner recommended that Google improve its privacy training, as well as name one or more persons responsible for privacy issues and compliance with the company’s obligations in that area — an obligation under Canadian privacy law.
The Commissioner also asked Google to delete the Canadian payload data collected, as long as it did not interfere with compliance with outstanding obligations under Canadian and U.S. laws, such as the retention of evidence related to legal proceedings. If the Canadian payload data could not be immediately deleted, it had to be securely saved and access to the data had to be restricted.
Given the extent of the violation, the number of people affected and the scope of the recommendations the company needed to address, the Commissioner required confirmation that Google had indeed met all its commitments. The Commissioner therefore required that Google submit, within a prescribed period, an audit report prepared by a third party certifying that all the recommendations made on completion of the investigation had been implemented.
In addition to the follow-up measures for the implementation of our recommendations, the Google Wi-Fi example also illustrates another tool we are using more and more to ensure that our actions benefit Canadians as much as possible: we release both the investigation report and the name of the company.
Most organizations we investigate comply with all of our recommendations, often even before the investigation has been completed. However, we sometimes have to use stronger measures to ensure our recommendations are implemented, by referring the matter to the Federal Court.
This is what happened upon completion of an investigation of Nexopia, a social networking site for young people. The investigation was conducted after a complaint submitted by the Public Interest Advocacy Centre revealed that Nexopia was violating a number of aspects of PIPEDA. A total of 24 recommendations were made upon completion of the investigation.
Nexopia responded to 20 of them to the satisfaction of the Commissioner. However, the OPC and Nexopia were not able to agree on how to resolve four issues concerning the retention of personal information of Nexopia users.
The Privacy Commissioner of Canada therefore turned to the Federal Court to request an order requiring Nexopia to stop retaining personal information for an indeterminate period and to create a delete function. Nexopia changed hands after the request was submitted. The new owner agreed to address all the recommendations made in the Report of Findings. The file is still before the Federal Court.
We can release any information related to the management practices of an organization if we believe that it is in the public interest to do so. In order to determine whether the public interest threshold has been reached in a particular case, we base our analysis on many factors. In general, we apply the following criteria:
- The decision to name the organization must be supported by facts and an analysis that shows that the public interest is involved;
- The decision must be made on a case-by-case basis, and not on the basis of an all-encompassing policy promoting the universal use of the power to name an organization;
- The use of the power to name organizations must advance the objectives of the legislation — the decision to name an organization must be rationally linked to the reasons for which the power was granted to begin with;
- The decision must reflect the proper balance between disclosure and confidentiality because both are in the public’s interest; and
- The scope of the disclosure must be restricted to the information necessary for reaching the stated objective.
(e) Audit
The OPC’s fifth and final response mechanism is audit, a review of an organization or sector’s personal information management practices that we use to address systemic issues.
Sometimes the existence of systemic issues that lead to an audit is revealed during an investigation. This is what happened during an investigation of Veterans Affairs Canada conducted in October 2010, when we determined that the controls in place at the Department did not adequately protect the personal information in its possession under the Privacy Act.
In particular, the investigation revealed that the sensitive medical information of a veteran had been communicated to public servants who had no legitimate reason to see it. Certain medical information was even found in ministerial briefing notes describing the veteran’s advocacy efforts.
The privacy breach in this case was so serious that there was reason to believe there were systemic issues. Therefore, we initiated an audit of the Department’s personal information management practices.
On the basis of similar reasoning, we initiated an audit of Staples, a Canadian office supplies retailer. In the case of Staples, we conducted two investigations between 2004 and 2008, stemming from two separate complaints that the company was not deleting all client information from storage devices returned to the store, such as laptops and external hard drives, before putting them back on the shelves.
At the conclusion of the two investigations, the company agreed to take the necessary corrective measures. In light of media reports in 2009 that Staples was still reselling returned electronic devices without erasing the personal data of the previous owners, we conducted an audit.
Because our audit showed that the systemic issues related to the management of returned data storage devices had not yet been completely resolved, we required the company to submit, the year following the audit, a report by an independent third party stipulating how the company had complied with our recommendations.
Conclusion
In conclusion, I hope to leave you with some key points.
First, the variety of risks to privacy requires the adoption of a holistic vision of the response methods available to DPAs. This comprehensive vision is crucial for adopting the creative, strategic methods necessary to address increasingly complex cases with resources that for many of us, unfortunately, are increasingly limited.
Second, to remain not only effective but relevant, we must take into account the new business models and technological realities that have such a significant impact on the fulfillment of our mandates to protect privacy in our respective jurisdictions. This requires flexibility in our response, making it possible to apply the correct response to each set of circumstances.
Third, our greatest power is in public engagement, and we must therefore root our responses in the expectations, rights and needs of the public.
I will now turn the floor over to my esteemed colleagues. I look forward to continuing this conversation with you.