The Privacy Challenge - Connecting citizens with all levels of Government
Conference Board of Canada's
2002 eGovernment Conference
Crossing Bridges to Success
May 9, 2002
Privacy Commissioner of Canada
(Check Against Delivery)
The subject that you've come together to discuss, connecting governments on-line to citizens and to each other, is one of the most positive developments to come out of the Internet age. It's also one of the most important privacy challenges facing us as a society. I want to talk to you today about how we can work together to address that challenge.
Anyone who has ever stood in line, or been bumped from one recorded message to another, can appreciate the benefits that e-government offers.
But e-government involves the collection, use, and disclosure of personal information. And that means that it's about privacy. In fact, my definition of privacy is the right to control access to one's person and to information about oneself.
The connections that you're gathered here to talk about building (between citizens and government, between government institutions, and between levels of government) have the potential to threaten that right, and even destroy it. I know that's not what you want. So let's talk about it.
E-government raises a number of privacy concerns. To my mind, the most important one, and the least obvious, is wrapped up in the "interconnection" idea.
It always sounds so appealing and seductive when people talk about merging databases, bringing down walls, and eliminating redundancy. Everyone wants cooperation and coordination between agencies that have a common goal. Everyone opposes duplication and waste. So who would argue against breaking down these walls?
As supportive as I am of improved service delivery, I am professionally required, as Privacy Commissioner, to be the one to rain on the parade. It's an unfortunate fact that some walls between collections of personal information are crucial to privacy. So we have to be very careful about breaking them down.
One of the basic principles of privacy is that personal information collected for one purpose shouldn't be used for another without consent. You've all heard that classic expression of concern for privacy: "I don't mind telling you this, but I don't want it spread around, or used against me, or coming back to haunt me." The walls between banks of personal information are a sort of built-in way of ensuring that this principle is respected.
Information about individuals and their interactions with government is collected for specific uses. The separate databases it's held in, the "silos," reflect the purposes that justified the collection and retention of the information in the first place.
Because the information is compartmentalized, there are some inefficiencies. There's some duplication. There are tantalizing questions that you could answer easily if you could just merge a couple of databases. But these inefficiencies are a trade-off for real benefits-very important ones, even if they're not immediately apparent.
Without those silo walls, someone with a need to know only one piece of information can have access to lots more than he or she needs or has any right to. If I surrender information in order to get a CPP disability pension, the only person who should have access to that information is someone with a demonstrable need for it, for the purposes I agreed to when I surrendered it. And that person doesn't need to know anything else about me. I can only count on that being the case when there are walls between the different banks of information.
That's one reason for silo walls. Another is that, without them, information can be combined (data can be matched) to reveal new information. That can lead to profiling of citizens, and that's a distinguishing feature of surveillance societies.
Dossiers on individuals, tracking their activities and their interaction with government, have no place in an open, democratic society. We have a well-established and long-cherished right to go about our lawful, peaceable business anonymously and unmonitored. That's something that we can't allow to simply slip away from us in the pursuit of efficiency, especially when, as I'll discuss a bit later, it's not entirely clear exactly what that "efficiency" is.
We're living in extraordinary times. I'm conscious of the irony of telling this to a roomful of technology experts, but I'll say it anyway: The very technological and social developments that make e-government a possibility are revolutionizing our world. That's especially true if we look at it from the perspective of privacy and its protection.
It's not just that both government and private-sector organizations hold so much personal information about us. It's that the information is being used in such radically new ways. It's the staggering growth in the speed and sophistication with which personal information can be analyzed, combined, combed, matched, and mined.
That's why I believe that privacy is the defining issue of this decade. And that's why it's so important that I and my office join forces with people like you, responsible for the design and building of e-government systems, so that privacy protection can be built into these systems.
I'm sure that you've heard that before. But I'm asking you to bear with me anyway, because you may have heard it from people who thought that they were protecting privacy, but in fact were only protecting security and confidentiality.
That was the case for a long time with at least some of the IT people involved in Treasury Board's Government On-Line project. Maybe in their everyday work or among themselves they were clear about the distinctions between privacy, security, and confidentiality. But you'd never know it from the GOL website, and although I know their understanding has greatly improved, you'd still never know it from the website. A search for the term "privacy" doesn't get you anything about privacy. It gets you lots of information about security safeguards, public key infrastructure, and protection from unauthorized use and disclosure. Those things are important, and I would be the last person to belittle their importance. But although they're very necessary, they're not sufficient. They protect security and confidentiality, but that's not the same as protecting privacy.
So, what's the difference between these terms?
Privacy, when we're talking about information systems, is our fundamental right as individuals to control the collection, use, and disclosure of information about ourselves. The right to privacy means that individuals get to decide what and how much information to give up, and to whom it is given, and for what uses.
Why is this so important?
It's important because the old saw that "information is power" is nowhere truer than in the context of personal information. It's not only that those who know our deepest secrets can control us. It's that our ability to control the collection, use, and disclosure of information about us is the key to our freedom.
Privacy lets us live as free individuals-free to read what we please, think as we please, associate with whom we please. Our right to privacy means that we don't have to go through life with someone watching over our shoulders-watching our every move, every purchase, and every human interaction; someone analyzing patterns in our behaviour; interpreting, and maybe misinterpreting, our actions; judging, and maybe misjudging, our intentions.
It's often said that privacy is the right from which all freedom flows. That's because freedom of thought, association, conscience, and speech, to name just a few, are based on our having a private sphere of thought and action, something that's our business and no one else's.
So that's what privacy is. That's the beating heart of that dry expression, "control over collection, use, and disclosure of personal information."
Confidentiality? That's something different. Confidentiality is the obligation to protect personal information that you've been entrusted with. A promise of confidentiality means that there's a duty of care to maintain the secrecy of the information, and not misuse or wrongfully disclose it.
Security is something else again-and something that I'm sure many of you are expert in. It's the process of assessing the threats and risks posed to information, and taking steps to protect the information against unauthorised or unintended access, use, intrusion, loss, or destruction.
The crucial point about this is that privacy drives the duty of confidentiality and the responsibility for security. We've got to respect and address the right of privacy first, and then deal with the requirements of confidentiality and security.
If privacy's not respected-if information about someone is collected, used, or disclosed without their consent-it doesn't matter that confidentiality and security are assured. Encrypt the information, protect it with the best firewalls-its security may well be assured by that. But that doesn't change the fact that the individual's privacy has been violated.
As I said a moment ago, this didn't always seem to be well-understood in government IT circles. It looked like the people who were designing and implementing the IT systems in the GOL initiative had been trained to think about security and confidentiality, but not about privacy. Things have changed recently-and I'll talk in a moment about just how much they've changed.
I've been urging the government for some time to build privacy considerations into projects at the outset and not as an afterthought. The reason is that, contrary to some people's stereotype of a privacy advocate, I'm no Luddite.
If you've visited my office's website, you'll know how much we rely on the power of the Internet to get information out. Communication has been one of my highest priorities since I became Privacy Commissioner, and I think I understand as well as anyone the effectiveness of technology for disseminating information.
More than that, I really believe that privacy and e-government are compatible. I'm a supporter of electronic delivery of services. I'm enthusiastic about the prospect of improvements in the way programs are delivered, and a wider range of choice as to how people gain access to government services. Who wouldn't welcome an initiative that will make government more efficient and accessible? And if it contributes to making Canada a world leader in technology, and to the development of a thriving private sector, so much the better.
But we all need to rein in our enthusiasm a little. We've got to be sure that we don't trade away our fundamental human right of privacy.
I've explained my concern about interconnected databases. Here are a couple of the other important privacy concerns raised by e-government.
First, the involvement of the private sector in delivering services or benefits electronically.
There's nothing intrinsically wrong with this. If it leads to more efficient delivery of services, great. If it contributes to the economic health of the private sector, even better. With the right privacy protections built in, it should be fine. But without them, we may have a serious problem on our hands.
I come back to my point about walls between banks of personal information helping to protect privacy. If we start linking public and private networks, we could eventually find ourselves with one inter-operable system combining the personal information holdings of both the public and the private sector.
And privacy protection in the private sector, even with the Personal Information Protection and Electronic Documents Act in force, is patchy. Unless it's subject to stringent checks and guidance, involving the private sector in program delivery could lead to the emergence of uncontrolled databases on Canadians. That's something that I will strongly oppose. As I'm sure you can imagine, I'm not going to sit by and watch personal information that's been provided for a government program become the source of telemarketing or mailing lists. That sounds unlikely to some, I guess, but it's happened in the US with information from driver's licences.
Another concern is the need for an authentication, identification, and access device: an "e-identity."
Authentication is a big issue in a networked economy. Electronic commerce tends to make businesses want to know whom they're dealing with. It's not always very thorough or meaningful authentication-providing a mailing address and a valid credit card number doesn't really tell an e-business much about the customer that they're dealing with-but it happens nonetheless. And while ingenious minds are always looking for alternatives so that we can do business anonymously, I think it's likely that electronic commerce will continue requiring some level of authentication for the time being. But however it's accepted in the private sector, its use by the state is a different issue, and we need to be aware of the problems, right at the outset.
We need to ask two questions about authentication of clients of government services. First, to what extent is there actually a need to identify the client?
If people have a ready means of authenticating their identity in an electronic interface with government, it may be tempting to require authentication when it's not really needed.
Obviously, if you're seeking a government benefit, you have to identify yourself. The government has to be able to verify that you're who you say you are, that you're entitled to the benefit, and that you haven't already received it.
But there's no need to ask people to authenticate their identity in a transaction that can just as reasonably be done anonymously. A simple request for information, for example, requires no authentication of the client's identity.
E-government systems should require authentication only when it's necessary. The default setting should be anonymity. Things like cookies, digital certificates, and public key encryption all contribute to client identification and detract from anonymity. They should only be used where anonymity will not work.
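To make that default concrete, here is a minimal sketch, in Python, of what "authenticate only when necessary" might look like in an e-government request handler. The transaction names and functions here are hypothetical illustrations of the principle, not any department's actual system:

```python
from typing import Optional

# Hypothetical example: only transactions that genuinely require knowing
# who the client is appear in this set; everything else stays anonymous.
REQUIRES_IDENTITY = {"apply_for_benefit", "view_my_file", "update_address"}

def handle_request(transaction: str, credentials: Optional[str] = None) -> str:
    """Serve a client request, demanding authentication only when the
    transaction itself requires it; anonymity is the default."""
    if transaction not in REQUIRES_IDENTITY:
        # A simple request for information needs no cookie, certificate,
        # or other identifier attached to it.
        return f"anonymous: served '{transaction}'"
    if credentials is None:
        return "authentication required"
    # Identity is verified only here, for the purpose the client consented to.
    return f"authenticated: served '{transaction}'"
```

A request for program information goes through anonymously, while an application for a benefit is challenged for credentials first; anonymity, as the principle demands, is the default setting rather than an afterthought.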
The other question we need to ask is this: How deep do we need to drill to authenticate identity, and what evidence will satisfy us?
If you're interfacing on-line with the Department of Human Resources Development for Canada Pension Plan benefits, the department needs to verify certain things about you. If you're inquiring on-line about your passport, the Department of Foreign Affairs needs to verify certain things about you.
The kinds of information those departments need to know about you aren't the same, although obviously there's some overlap, like your name and your date of birth. Whatever means of authentication is chosen, the architecture of the system has to reflect that.
I'm sceptical about one-size-fits-all authentication devices, like a smart card that incorporates all the information about our interactions with government, and then makes it all available, indiscriminately, every time we interact with a government department. Authentication for one purpose must not elicit information from you that's only required for another purpose.
A single card that holds all the information about our interactions with government raises the problem of combined databases to a new level. And, unless we take steps to restrict it, the forces of convenience and efficiency will drive it towards becoming a national identity card.
In Canada we don't have to identify ourselves to anyone-agents of the state or anyone else-except for specific and limited purposes. This is not a country where you can be stopped by a police officer and casually required to produce your papers. In Canada you have the right to go about your business anonymously and peaceably, without having to justify yourself, without being subject to surveillance.
It wouldn't be acceptable to lose that.
So those are my concerns about e-government. And I really think that they should be yours, too. Because it seems to me highly likely that e-government will fail, at the cost of enormous investments, if privacy concerns aren't addressed.
Privacy isn't just an abstract right. It's true that sometimes people take it for granted, the way they take their health for granted. But as with their health, it doesn't take much to make them aware of their privacy, and protective of it-as soon as something goes wrong. We've learned from the development of electronic commerce that people are reluctant to engage in electronic transactions if they think their privacy is at risk. I think that e-government has the same hurdles to face.
That's why I'm always urging information system designers, in government and the private sector, to build privacy in from the start. The key to that is a Privacy Impact Assessment.
What exactly is a Privacy Impact Assessment? Very simply, it's an analysis of the likely impacts on privacy of a system that's either being proposed or being redesigned. It's based on an examination of the personal information practices that go into the system, the purposes and statutory authorities for collection, use, and disclosure of personal information, and the overall compliance of the system with privacy legislation and principles.
Let's suppose you've got a new system in the planning stages. How would you go about assessing its impact on privacy?
First and foremost, you'd want to know what kinds of information are collected, how consent is obtained, how and for how long the information is kept, how it's used, and who it's disclosed to. You'd also want to know what kinds of linkages there will be with other information, and how individuals will be able to exercise their right of access to their information.
You would want to ask whether your system is going to lead to data matching. Will it be possible to combine unrelated personal information to create new information about individuals? Will the system, especially its demands for identification and authentication, lead to profiling, monitoring transactions across programs, and other forms of surveillance?
Will it facilitate electronic misuse of publicly available personal information? For example, are you going to be creating databases of personal information that can be downloaded and electronically reconfigured into mailing lists?
Those are questions about possible violations of privacy. You also need to ask questions about the resources to deal with them-such as whether there's an accountability structure in place to deal with privacy issues.
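Informally, the assessment questions above could be gathered into a single checklist record per proposed system. This is only an illustrative sketch, with field names of my own invention rather than an official template:

```python
from dataclasses import dataclass, field

@dataclass
class PIAChecklist:
    """One record per proposed system, capturing the questions described
    above; each flag set to a risky value marks something to address."""
    info_collected: list = field(default_factory=list)  # kinds of personal information
    consent_obtained: bool = False
    retention_period: str = ""
    disclosed_to: list = field(default_factory=list)    # other departments, organizations
    enables_data_matching: bool = False   # can unrelated data be combined?
    enables_profiling: bool = False       # cross-program monitoring or surveillance?
    downloadable_lists: bool = False      # reconfigurable into mailing lists?
    accountability_structure: bool = False

    def flagged_risks(self) -> list:
        """Return the privacy risks that need mitigation."""
        risks = []
        if self.enables_data_matching:
            risks.append("data matching")
        if self.enables_profiling:
            risks.append("profiling / surveillance")
        if self.downloadable_lists:
            risks.append("electronic misuse of public information")
        if not self.accountability_structure:
            risks.append("no accountability structure")
        return risks
```

For instance, `PIAChecklist(enables_data_matching=True).flagged_risks()` would flag both the data matching itself and the missing accountability structure.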
In effect, this is a feasibility study from a privacy perspective. It's a way to forecast impacts on privacy, and determine what's required to overcome the negative impacts.
It also helps you to sensitize the people in your organization to privacy issues and create an organizational culture where privacy is part of the corporate goal.
And it's a useful tool for monitoring the system as it progresses. You've identified privacy risks, you can see whether your means of addressing them are working, and you're alert and attuned for new, unforeseen ones coming up.
Down the line, when you're reviewing the compliance of your systems with privacy legislation and principles, it's an excellent basis for ensuring that those involved in the review-system operators, management, and representatives of the oversight body-understand the system.
Ever since I first spoke on the subject in July of 2001, I've been urging the federal government to use Privacy Impact Assessments. That brings me to the subject of a significant development in the federal government, one that represents a big step for Canadians' privacy.
A week ago, on May 2nd, Treasury Board's new policy on Privacy Impact Assessments came into effect. This policy makes Privacy Impact Assessments a condition of funding for all new, substantially redesigned, or electronically driven programs and services that collect, use, or disclose personal information. That makes Canada the first country in the world to make Privacy Impact Assessments-PIAs for short-mandatory for all departments and agencies.
So what does this mean for you, and for the design of e-government systems? Well, it means that government institutions will have to be looking at privacy right from when a new program is nothing more than a gleam in their eye. If they don't yet have the detailed information required for a comprehensive PIA, they should still do a preliminary assessment as early as possible, and consult with my Office. They should begin a formal PIA as soon as they have the detailed information required, again consulting with my Office at that point. In both preliminary assessments and formal PIAs, we're encouraging departments to approach us directly with questions at an early stage. That allows us to help departments understand what we're looking for in a PIA.
A PIA will entail a description of the program, including business process diagrams and data flow tables, and an analysis of what will happen to the personal information involved. Then you assess the program's compliance with privacy principles, legislation, and policies, using the Treasury Board Privacy Impact Assessment Guidelines as a reference.
That means asking questions such as whether the program or project will involve a new or increased collection, use, or disclosure of personal information, with or without the consent of individuals. Will there be a broadening of target populations? Will there be a shift from direct to indirect collection of personal information?
Will it involve an expansion of personal information collection for things like program integration, program administration, or program eligibility? Will it involve new data matching or increased sharing of personal information between programs or across institutions, jurisdictions, or sectors?
Will it result in the development of new common personal identifiers, or extended use of existing ones?
Will changes to business processes or systems affect the physical or logical separation of personal information, or the security mechanisms used to manage and control access to personal information?
Will it involve contracting-out or devolution of a program or service to another level of government or the private sector?
All of this assessment will have to be documented in a report. The report will describe the program or service assessed, the stakeholders, the relevant legislation and policies, and the specific privacy impacts. It will also describe the options considered to avoid or reduce privacy impacts, the residual risks that can't be resolved, and a communication strategy.
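The shape of such a report could be sketched as a simple structure. These section names are illustrative assumptions drawn from the description above, not Treasury Board's official format:

```python
# Illustrative sketch only: assumed section names for a completed PIA report.
pia_report = {
    "program_description": "",       # including business process diagrams and data flow tables
    "stakeholders": [],
    "legislation_and_policies": [],
    "privacy_impacts": [],           # the specific impacts identified
    "mitigation_options": [],        # options considered to avoid or reduce impacts
    "residual_risks": [],            # risks that can't be resolved
    "communication_strategy": "",
}
```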
Finally, completed PIAs will be forwarded to my Office for review and advice. This is something that I think is particularly significant: the policy requires government institutions to inform my Office of all PIAs being conducted, and to send completed PIAs to us as soon as possible.
My staff will review each PIA in cooperation with officials of the institution that carried it out. That means that we'll be able to provide advice and guidance to institutions and help identify solutions to potential privacy risks. It doesn't mean, of course, that we'll be approving or rejecting the projects that are assessed in the PIAs.
What will our office be looking for in reviewing PIAs? I can only give you some examples, because the PIAs are going to have to be looked at case-by-case.
Certainly, we'll want to make sure that the department or agency has the legal authority to collect the personal information in question.
We'll want to ensure that the PIA is very clear about the amount and type of personal information that will be collected, how it will be used, and if it will be disclosed to other departments or organizations.
We'll be looking to see if the project involves data matching-if it will involve combining unrelated personal information to create new information about individuals.
And we'll want to ensure that personal information will be adequately protected with security safeguards.
Overall, we want to be able to assure ourselves that the PIA accurately identifies all the privacy risks associated with the project, and that appropriate measures are being proposed to minimize these risks.
Our comments won't be intended for publication; they'll serve as guidance and advice to the department.
We've been working for several months on developing our review procedures, in anticipation of the official launch of the policy. I'll be informed of every PIA when my staff receive it. We've created a unit with senior members of my staff specifically dedicated to review of PIAs. We'll do everything we can to assure quick turnaround times, so that there won't be any needless delay of projects. That will be facilitated by departments contacting us as early as possible so that potential trouble spots can be identified.
As you can see, the policy involves my Office at a number of points. The reality is that the more we work with departments and provide these comments, the better they'll be able to make these assessments themselves. But there will continue to be a role for my Office, no matter how good departments get at it. I'm enthusiastic about that continuing role. It makes for a collaborative, non-judgmental way of promoting the goals of the Privacy Act. Departments won't have to wait until they get a complaint to find out that they might have overlooked something about privacy.
Of course, there is a place, and a need, for the complaint process. Canadians must have the right to challenge organizations to respect their privacy. But it's not in anyone's interests to have me oppose something when it's gone too far and millions of dollars have already been spent. And it's better to protect privacy up front than to try to undo the damage after a breach of privacy. Sure, someone can complain to me when their privacy has been violated, and I can help find a remedy, and take steps to make sure it doesn't happen again. But let's face it: they can't get back their lost privacy.
I might point out that it's not just government institutions that should do Privacy Impact Assessments. They're just as useful to the private sector.
More and more organizations are coming under the scope of privacy legislation, and many of those that aren't required to under law are still concerned about protecting privacy.
I encourage them to do Privacy Impact Assessments, because being concerned about protecting privacy is usually not enough. Good intentions, on their own, don't protect privacy.
Privacy, in fact, is often most threatened by well-intentioned people. They don't see themselves as trampling privacy. They just see themselves as trading it off for some greater good. Sometimes that greater good is customer service. Sometimes it's public security. And often it's something called efficiency.
I say "something called efficiency" because it's too often forgotten what efficiency really is.
It's not just the shortest, fastest, leanest, cleanest way to do things.
Efficiency refers to the relation between means and ends, to the choice of the best means to achieve a particular end.
How we define those goals is the crucial question. And for anyone building an e-government program or system, protecting privacy should be part of those goals. Privacy isn't an impediment to efficiency. It's not something that you can sacrifice to be more efficient. It is something that you have to protect. That should be a fundamental element of your goal. The challenge for you is to find efficient means to achieve that goal.
I'm confident that this can be done. The key, though, is to recognize how important privacy is to our society, and how concerned Canadians are about it, and build your system accordingly. Getting it right on privacy will be your best guarantee of the success of e-government.