The Role of the Privacy Impact Assessment

Managing Government Information
2nd Annual Forum

March 10, 2004
Ottawa, Ontario

Stuart Bloomfield
Office of the Privacy Commissioner of Canada

(Check against delivery)


Introduction

I have been asked to speak to you this morning about the Privacy Impact Assessments (PIAs) that must now be conducted on all new government projects involving the collection, use, and disclosure of personal information.

Over the course of the next 20 minutes or so I will touch upon four things:

  • The rationale for conducting PIAs, and the potential consequences for failing to do so;
  • The process of conducting a PIA;
  • The role of the OPC in this process; and
  • Our experience to date with PIAs.

Role of the Office of the Privacy Commissioner of Canada

Before addressing these subjects, it may be appropriate at this juncture to say something, for the benefit of those who are unfamiliar with the OPC, about what we are and what we do. In a nutshell, the OPC is an ombudsman — an independent guardian of the privacy rights of Canadians.

This role includes overseeing and enforcing two federal privacy statutes: the Privacy Act, which applies to all federal government institutions, and the new Personal Information Protection and Electronic Documents Act (PIPEDA), which extends personal data protection rights to the federally regulated private sector.

The OPC is responsible for ensuring that the gathering and handling of personal information, in the public and private sectors, does not violate the privacy rights of Canadians. That means not only investigating and responding to complaints, but undertaking audits, conducting research into privacy issues, promoting public awareness and education, and providing advice to Parliament, government, and the private sector on privacy issues.

In short, we are a watchdog charged with keeping a watchful eye on anything that may have an impact on the privacy rights of Canadians, which is why we, of course, have an interest in government initiatives that involve the collection, use and disclosure of personal information.

The Rationale for Conducting PIAs

In order to appreciate the rationale for the government's adoption of this Policy, it may be useful to revisit what it is that we mean when we speak of privacy. Privacy is often defined as the "right to control access to one's person and information about oneself." In other words, the right to privacy is the right of individuals to decide what and how much information to give up, to whom, and for what purpose.

Now, clearly the degree of control one can exercise over one's personal information varies depending on the context in which the exchange takes place. In a commercial context, for example, parties are free to enter into transactions and define the terms of the exchange according to their respective interests. In other words, the disclosure of personal information figures into this exchange as a subject of negotiation.

When we talk about government, however, the nature of the relationship is very different. Rarely are individuals in a strong bargaining position when it comes to the collection and use of their personal information by government. When a government agency or program needs personal information to carry out its mission, that information will be collected.

Because citizens are not in a strong bargaining position when dealing with government, government has a special trust relationship with citizens — a fiduciary duty to protect personal information under its charge. Performing PIAs constitutes one way that government institutions can honor that public trust, and in so doing earn the confidence of their clients and the public at large.

How do PIAs perform this role? They do so by providing departments with an approach to forecast the impacts of a proposal on privacy, assess compliance with privacy legislation and principles, and determine what may be needed to overcome or reduce the negative impacts. In short, PIAs serve as a privacy risk management tool.

Privacy risks themselves are identified by examining specific design elements or operational features of a given program against a set of privacy principles, which are itemized in the PIA Policy Guidelines. Testing compliance with the principles is achieved by essentially asking questions.

For example, it is a fundamental tenet of privacy that only that information necessary to perform a specific activity be solicited from individuals. This is somewhat different from the requirements of the Privacy Act, which merely demands that information collected relate to a government program or activity. Information collected may relate to a government program or activity, but if the information is not necessary, then you have identified a privacy risk.

By asking the right questions — whether the information requested is truly necessary, whether the use is consistent with the stated purpose, whether retention is rationally connected to its use, and so on — the PIA serves to give effect to the fair information practice principles.

In sum, PIAs perform the following roles:

  1. They act as an early warning and planning tool;
  2. They forecast and/or confirm the impacts of a government proposal on the privacy of individuals and groups;
  3. They provide a mechanism to assess a proposal's compliance with privacy protection legislation and principles; and
  4. They provide a framework for the development and implementation of actions and strategies required to avoid or overcome the negative impacts of the proposal on privacy.

In conducting a PIA and acting upon the advice advanced therein, government departments can:

  1. Avoid adverse publicity, the loss of credibility and public confidence and the legal costs, remedies and sanctions that could result from negative impacts; and
  2. Increase Canadians' privacy awareness and confidence in the government's handling of their personal information by informing them of the details of the proposal.

The potential costs to departments of not conducting a PIA where one is required should not be underestimated. One need only recall the highly publicized debacle over HRDC's Longitudinal Labour Force File (LLF), whose dismantlement following public complaints against the database cost the department millions of dollars. Arguably, had a PIA been done on the LLF prior to implementation, HRDC could have avoided the adverse publicity and financial losses it suffered as a result of this incident.

The Process of Conducting a PIA

Turning to the subject of process, compliance with the Policy presupposes, at the most basic level, that staff within the department are aware of the Policy and its requirements. To this end, we have been encouraging departments to promulgate the principles of the Policy among their staff and to define the roles of different personnel categories in the identification, review, and reporting of projects which may require PIAs.

Identifying projects which may require a PIA typically falls on the shoulders of program or project managers, since they are often the most familiar with design features of the program/project under their charge. It is thus of critical importance that these individuals know what to look for in order to identify projects which may require a PIA.

Regarding what to look for, the Policy itself lists several indicators that should alert project managers to the need to conduct a PIA. While in many instances the need for a PIA will be obvious, in other instances it may not. In this regard, we recommend that departments run their program through the Questionnaire Template in the Guidelines. Responses to these questions will often highlight problems that the indicators, on their face, may not immediately reveal.
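As a purely illustrative sketch (the questions and the yes/no threshold here are hypothetical, not drawn from the Guidelines themselves), this screening step amounts to a simple checklist:

```python
# Hypothetical screening checklist; the actual Questionnaire Template in the
# Guidelines is longer and more nuanced.
SCREENING_QUESTIONS = [
    "Does the project collect personal information?",
    "Is personal information used for a new purpose?",
    "Is personal information disclosed to third parties?",
    "Does the project introduce new technology for handling personal information?",
]

def pia_indicated(answers):
    """A 'yes' to any screening question flags the project for a PIA."""
    return any(answers)

# A project that collects personal information and shares it with a
# contractor is flagged for a full assessment.
print(pia_indicated([True, False, True, False]))
```

The point of the sketch is simply that screening is cheap relative to a full PIA, so it is worth running on every project.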

Beyond the identification of projects/programs which may require a PIA, structures must be put in place to review these initiatives to determine, on the basis of a preliminary assessment of the program/project's features, whether to commit the resources to prepare a full PIA. Some departments have recognized this need and have taken steps to set up such administrative structures, drawing on personnel from the department's legal, ATIP, and IT branches to specifically review projects with the aim of determining whether a PIA is required. We think this is a sensible approach and one all departments ought to emulate.

Once a decision has been made to go ahead with a PIA, the next step will be to decide whether to conduct the assessment internally or with the assistance of outside expertise. Whether you decide to go one way or the other will depend on a number of factors — financial resources, availability of internal expertise, time, etc.

The decision to rely on in-house or outside expertise need not come down to a stark choice between one or the other. The conduct of a PIA is a co-operative endeavor, drawing on a variety of skill sets: program managers, technical specialists, and privacy and legal advisors. Some of these skills will be available in-house, while others may not. The important thing is to recruit the skills necessary to do the job well, and this may, in some instances, require engaging the services of outside expertise.

The Role of the OPC

The Policy, of course, requires departments to consult with our Office in regards to all projects for which PIAs have been conducted. Our role in this exercise is not to approve or reject projects — our role is to assess whether or not departments have done a good job of evaluating the privacy impacts of a project and to provide advice, where appropriate, for further improvement. This makes for a collaborative, non-judgmental way of promoting the goals of the Policy.

Upon receipt of a PIA submission, the report is assigned to a Project Review Officer. That Officer's first task is to determine whether the PIA includes all the documentation necessary to conduct a proper appraisal of the project, and whether the report has been prepared in compliance with the Policy Guidelines. This review results in the production of a Preliminary Assessment Report which identifies any gaps or omissions in the material provided.

What will our office be looking for when reviewing a PIA? Several things:

  1. We will want to make sure that the department has the legal authority to collect and use personal information;
  2. We will want to ensure that the PIA is very clear about the amount and type of personal information that will be collected, how it will be used, and to whom it will be disclosed;
  3. We will want to satisfy ourselves that all the privacy risks associated with the project have been identified;
  4. We will want to satisfy ourselves that all the mitigating measures proposed are reasonable and appropriate; and
  5. We will want to know what the department ultimately intends to do to mitigate the risks identified.

If the Preliminary Assessment of the PIA concludes that information is missing or that risks have not been identified or adequately addressed, the submitting department will be notified. Unless there are security concerns, this will normally be done through e-mail. Meetings to discuss issues or work through problems may follow such notices.

Our Experience to Date

Since the PIA Policy came into effect in May 2002, our office has received around 90 final PIAs and preliminary PIAs (PPIAs).

Common Omissions

So far there has been no PIA, and certainly no PPIA, where we have not found it necessary to go back to the submitting department for additional information. Some common omissions in information we have encountered include:

  • Failure to include a complete inventory of data elements collected and used (information may be described, but not itemized);
  • Failure to describe adequately the business process;
  • Failure to describe adequately the information security infrastructure associated with the project; and
  • Failure to include an action plan.

Regarding the failure to include an action plan, quite often a PIA will list a series of recommendations to mitigate the privacy risks identified, without specific information on how these recommendations will be implemented. Furthermore, there is often no indication in the submission whether the department has accepted these recommendations. This is particularly true in cases where the PIA has been prepared by an independent consultant.

Absent an action plan stating precisely what the department intends to do to mitigate specific risks, we are left commenting on mere proposals. It is important for departments to keep in mind that the PIA is not an end in itself, but a tool to provide guidance as to what actions must be taken to render a given project privacy friendly. In other words, an action plan is the logical and expected outcome of the PIA review process.

We will, in any event, ask departments in all cases to advise us of what they intend to do in response to the PIA's recommendations. We therefore strongly encourage departments to include an action plan with their final PIAs. Doing so will help us finalize the review process.

Though a PIA, if well done, should operate as a "stand alone" document, there are background documents we frequently request to assist us in conducting our review. These include:

  • Provisions dealing with privacy and security in agreements, where third party service providers are involved;
  • Threat and Risk Assessment (TRA) reports, where they have a bearing on the security features of a given system;
  • Project feasibility studies, where conducted to support the business case;
  • Project management plans, as they relate to project design; and
  • Technical specifications relating to system design.

These documents need not be provided in their entirety, but they should be made available on request.

Common Problems with the Privacy Analysis
  1. Confusing Privacy with Security and Confidentiality
    • We have received PIAs that have described a system's security structure in detail, but which have overlooked other important issues such as excess collection, client notification, role of consent, third party contracting, etc.
    • Though security related risks do figure prominently in many of the submissions we receive, particularly those involving electronic service delivery, it is important to remember that safeguarding personal information is but one of the ten principles constituting the Policy Guidelines.
  2. Attention to Security Limited to Transmission of Information:
    • Quite often, especially with on-line applications, PIAs will focus on the security features used to protect information while in transit (SSL encryption, for example) while ignoring other points of vulnerability, such as the security of departmental databases and the security of the client's computer.
    • Studies have shown that 70 to 80% of database intrusions are committed by persons who have network authorization, knowledge of database access codes and an appreciation of the value of the data they wish to exploit. In other words, insiders.
    • Documented access privileges, logins, roles and passwords to restrict queries and application usage are of critical importance, but if the information in the database is in "clear text," which is more often than not the case, the information is vulnerable. To mitigate this risk, we have been encouraging departments to implement appropriate audit procedures and to adopt selective database encryption, i.e., encrypting certain fields containing sensitive information.
    • Clients themselves constitute a point of vulnerability often overlooked. Indeed, clients are often cited by security experts as the weakest link in the chain when dealing with an online service. Clients need to be advised of the risks which immediately affect them, particularly with regards to the use of shared computers, and what steps they can take to protect their personal information.
  3. Seeing the PIA Process as a Privacy Compliance Audit
    • Often we will see PIAs that assess privacy risks strictly in reference to the requirements of the law. This represents a misunderstanding of the fundamental role of a PIA.
    • A PIA is a risk assessment tool that measures compliance not just against established legal standards, which represent minimum acceptable practice, but against best practice principles.
  4. Failure to Link Identified Risks with Specific Design Elements of a Project
    • We have received PIAs which have identified risks, but without reference to the system feature which has given rise to the risk.
    • Understanding a problem is the first step to finding an appropriate solution. Unless a given risk can be traced to a specific feature of a given system or program, that risk remains hypothetical, rendering it difficult to determine if the mitigating measures proposed are appropriate. The purpose of the Policy is not to impose unwarranted costs on departments.
  5. Proposed Mitigating Measure not Appropriate to the Risk Identified
    • This is perhaps the most common defect that we encounter in reviewing PIAs and where we can perhaps be of most use to departments.
    • An example of just such a case was a recommendation to a department to launch a communication strategy to allay public fears about the program, without actually stating what the public had to fear in the first place, and what was needed to remedy the problem.

GOL (Government On-Line) Specific Privacy Risks

Most of the PIAs that we have received have been for initiatives or projects involving the electronic delivery of services to individuals through the Internet. In our review of these projects we have identified a number of common privacy risks. These risks, and the mitigating measures relating to them, are as follows:

  1. Notice
    • Proper notice constitutes one of the most basic principles of privacy and one that is often given little attention. Clients using an online service need to be informed up front of the purpose of a given collection, the authority upon which the information is being collected, the uses and disclosures to be made of their personal information, and their rights under the Privacy Act.
    • Users of online services need also to be advised, as will be discussed shortly, about the risks associated with conducting a transaction online and the steps the department has taken to mitigate those risks.
    • Users of online services must also be informed that use of the online service is voluntary and advised of alternatives to the service sought.
  2. Client Authentication/Shared Secret
    • The kind of client authentication required in any given instance varies depending on the sensitivity of the information involved and the potential impact the transaction may have on the individual. The type and number of data elements required to establish identity must be calibrated to the level of confidence needed to perform the transaction.
    • Shared secrets form the basis of most of the online authentication systems currently in use by the federal government. Our concern with this has been that shared secrets are just that — secrets shared by too many people. As a result, we have been encouraging departments to select data elements, or combinations thereof, which cannot be easily guessed by others.
  3. User ID and Password Management
    • User IDs and passwords are the keys used to provide database access. Here too, departments have a duty to ensure that these access tools conform to acceptable standards of security.
    • User IDs and passwords that can be easily guessed by others provide little safeguard against unauthorized access. Persons should generally avoid using their name as a User ID, and passwords should generally be configured using a combination of alpha-numeric values and case sensitive elements.
  4. Desktop Security
    • To the extent that desktop computers serve as the portal to database access, they constitute a significant security hazard.
    • Departments have a duty to advise users of the security hazards associated with their desktop computers — the retention of transaction data in their temporary memory files, print command files, etc. — and the ways these risks can be mitigated.
    • Departments also have a duty to advise users of the threat posed to their information security by the presence of malicious software — Trojan horses, spyware, etc. — which may reside on their hard drives, and to advise them of the tools available to protect themselves from these threats.
  5. Use of Shared Computers
    • The security risks to desktop computers cited above are magnified when the computer used to perform an online transaction is shared by other users.
    • Again, departments have a duty to advise users of the risks specific to shared computers and the steps that can be taken to mitigate those risks.
  6. Third Party Service Providers
    • The use of third party service providers constitutes a significant privacy risk to which departments must turn their attention. Simply including a clause in an agreement which states that the contractor is subject to the Privacy Act is not enough.
    • Specific provisions in the contract dealing with ownership, liabilities, access privileges, use and disclosure rights, etc., have to be detailed in the agreement.
  7. Transaction Monitoring
    • Transaction monitoring constitutes both a privacy risk, and a privacy mitigating measure. It constitutes a privacy risk in so far as it acts as a surreptitious form of collection of personal information, as well as a risk if undertaken for the purpose of constructing behavioral profiles of individuals. It constitutes a mitigating measure if undertaken to track, and thus validate, transactions for audit and control purposes.
    • The risk element of transaction monitoring can be avoided by informing clients up front of the purpose and scope of the activity, and by resisting the temptation to use the information for client profile purposes, unless done with the express consent of the individual.
  8. Database Access Controls
    • As stated earlier, studies have shown that the greatest threat to data security and integrity comes from persons with privileged access to the database. This risk is often underestimated, if not overlooked altogether.
    • The establishment of security clearance standards for certain access privileges, the development of access and use profiles, the implementation of appropriate access control mechanisms, and the institution of activity tracking systems are critical to mitigating the risks posed by rogue employees.
    • Beyond these measures, we believe that departments should give serious consideration to selective database encryption. Once thought to be too expensive and taxing on operational performance, encryption products are now available on the market that are neither costly nor deleterious to operational efficiency. Again, this is something we encourage departments to consider.
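The access-profile and activity-tracking measures described under item 8 can be sketched as follows. The role names, field names, and log format here are hypothetical, introduced only for illustration:

```python
import datetime

# Hypothetical access profiles: each role sees only the fields it needs.
ACCESS_PROFILES = {
    "clerk": {"name", "city"},
    "case_officer": {"name", "city", "sin", "date_of_birth"},
}

audit_log = []  # entries: (timestamp, user, field, allowed)

def read_field(user, role, field, record):
    """Check the role's access profile and record every attempt,
    permitted or not, in the activity log."""
    allowed = field in ACCESS_PROFILES.get(role, set())
    audit_log.append((datetime.datetime.now().isoformat(), user, field, allowed))
    if not allowed:
        raise PermissionError(f"{role} may not read {field}")
    return record[field]

record = {"name": "J. Smith", "city": "Ottawa", "sin": "123-456-789"}
print(read_field("mtremblay", "clerk", "city", record))  # permitted
try:
    read_field("mtremblay", "clerk", "sin", record)      # denied, but logged
except PermissionError as denied:
    print(denied)
```

Logging denied attempts as well as permitted ones is what makes the trail useful against the insider threat described above: a pattern of refused queries is itself a signal worth auditing.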

Conclusion

To sum up, although there still exists considerable variation in the quality of reports received, the PIA Division has noted a general improvement in the level of rigor and professionalism brought to the preparation of PIAs since the early days immediately following the issuance of the Policy. This improvement in the quality of submissions reflects an increasing familiarity and comfort level among departments with the Policy's requirements.

The most significant benefit we have seen over the last 21 months that can be attributed to the PIA Policy, though, is the increased awareness among government personnel at all levels of the importance of privacy and how it impacts on their day-to-day functions. Increasingly we see privacy truly becoming a core consideration in the conception, design, and implementation of government programs and services, which is the purpose of the Policy.

Thank you.