Hardware, Software, Privacy: The Three Fundamental Elements of Information Systems
Canadian Information Processing Society (CIPS) Annual Conference: Convergence 2001
May 14, 2001
Privacy Commissioner of Canada
(Check Against Delivery)
I'd like to talk to you today about my role as Privacy Commissioner, about the meaning and importance of privacy, and about building privacy into information systems. We should have time at the end for some questions. I'll do my best to answer them.
As the Privacy Commissioner of Canada, I am an Officer of Parliament, appointed for a seven-year term to be the independent guardian and champion of the privacy rights of Canadians.
The Privacy Commissioner of Canada doesn't work for, or report to, the government. I work for and report directly to the people of Canada, through our national Parliament.
I am mandated to oversee and enforce two critical pieces of national privacy legislation: the Privacy Act, which governs the federal public sector, and the new Personal Information Protection and Electronic Documents Act, which began coming into effect in January and which, for the first time, gives Canadians clear privacy rights in their dealings with private sector organizations.
I also have a legislative mandate to raise public awareness and understanding about everything pertaining to privacy, and to research privacy issues and provide independent advice to Parliament and the government.
Why does Canada have an Officer of Parliament appointed to protect privacy, and two laws setting out privacy rights for Canadians? What's so important about privacy that Parliament has turned its attention to it in this way?
It's because privacy is a fundamental right.
Privacy is a critical element of a free society - it's "at the heart of liberty in a modern state," as Justice La Forest of the Supreme Court has said.
There can be no real freedom without privacy. In fact, many have suggested that privacy is the right from which all others flow - freedom of speech, freedom of association, freedom of choice, any freedom you can name.
That's why lack of real privacy is a distinguishing characteristic of so many totalitarian societies.
And that's why privacy is not only an individual right - it's also a shared value, a social, public good. Our society as a whole has a stake in the preservation of privacy.
We cannot remain the kind of society that we all want to be - a free, open and democratic society in which we all have the autonomy to fulfil ourselves - unless the right to privacy is respected.
We need to turn around the widespread idea of the privacy of the individual being balanced against the interests of society. The interests of society include the privacy of individuals. When privacy is lost, the individual feels it of course, but society is the real loser.
That doesn't mean that privacy is an absolute right. Sometimes some of it has to be sacrificed to advance other crucial social objectives.
But if we make too many trade-offs, accept too many calls to give up a little privacy here, a little privacy there, soon we'll have no real privacy, and no real freedom.
When someone proposes a limitation, a trade-off for some other objective, we need to scrutinize it very, very carefully. Is there really a need that clearly outweighs the loss of privacy? Will sacrificing privacy really achieve the objective? Is there a less privacy-invasive way to achieve the same objectives?
Though we all value privacy, it's not always clear what people mean by it. I think how you define it is important to understanding how it's at risk, and how to protect it.
It used to be common to think of privacy as the right to be let alone, and that's still how a lot of people understand it. It's that gut-level concern that people have about wanting to go about their peaceable, lawful business without being monitored or bothered.
But there's another kind of privacy invasion that's less obvious, and that's the collection and compiling of information about us without our knowledge or consent.
None of us wants to go through life feeling that at any moment someone may be, either metaphorically or literally, looking over our shoulder. If we have to weigh every action, every purchase, every statement, every human contact, wondering who might find out about it, judge it, misconstrue it, or somehow use it to our detriment, we are not truly free.
That's why I define privacy as the right to control access to one's person and to information about oneself.
And it's this broader, informational concept of privacy that leads me to believe that privacy will be the defining issue of this new decade.
That's because we are at a crossroads.
Until relatively recently, privacy was protected pretty much by default. As long as information about us was in paper records, and scattered over a whole lot of locations, someone would have to go to a lot of trouble to compile a detailed dossier on any individual.
So unless you were very famous, or very important, or... had done something really bad, your privacy was pretty safe.
But now the move to electronic record-keeping is eating away at those barriers - barriers of time and distance and cost - that once guarded our privacy from all but the most determined of snoops.
Now some stranger at a computer keyboard can compile an amazingly detailed dossier on your whole life, literally in minutes.
The choices we make in confronting these threats to privacy will determine what kind of world we leave for our children and grandchildren.
Parliament made an important choice when it passed the Personal Information Protection and Electronic Documents Act. This law accords privacy protection to Canadians in their dealings with the private sector.
What the new law says, in a nutshell, is this:
Apart from some very limited exceptions, no private sector organization covered under the law can collect, use or disclose personal information about individuals without their consent.
It can collect, use or disclose that information only for the purpose for which the individual gave consent. And even with consent, it can collect only information that a reasonable person would consider appropriate.
Individuals have the right to see the personal information that is held about them, and to correct any inaccuracies.
There is oversight, through me and my office, to ensure that the law is respected, and there is redress if individuals' rights are violated.
This new law applies to the federally regulated private sector - banks, telecommunications, broadcasting and transportation - as well as to the sale of personal information across provincial or national borders. It also applies throughout the territories, where the whole private sector is a federal work under the constitution. It does not apply to personal health information in these organizations until January 1 of next year. By 2004, where provinces have not passed similar legislation, it will apply to all commercial activities in Canada. Eventually, all of the private sector in Canada will be required to comply with the new federal law or with a similar provincial one. Privacy protection will be seamless.
That means that you're going to have to ensure that your information systems are built to respect privacy. If they don't, you can expect citizens to complain about it to my office. And while my enforcement style is to be an ombudsman, my position comes with teeth, and I will bite down hard when necessary.
But compliance with the law isn't the only reason, or even the best one, for turning your mind to the issue of privacy when you're designing information systems.
Respect for citizens' privacy is, I firmly believe, the key to the success of information systems.
Canadians, like other people worldwide, are increasingly aware that their privacy is at risk. When people believe that a company or a government has disregarded their privacy rights, they will respond vigorously.
I'm sure you will recall the story last year about the Longitudinal Labour Force File at the federal Department of Human Resources Development.
This was a database of very personal, private information on over 30 million individuals in Canada. It included information about jobs they'd had, taxes they'd paid, social programs they'd used.
It outraged Canadians so much that the government was forced to dismantle the entire database almost immediately.
That database may have been compiled with the best of intentions, in pursuit of perfectly reasonable objectives.
But what mattered to Canadians was their privacy. The program failed because privacy was not built in at the outset.
You will be familiar with the long-running story about the US web-advertising company DoubleClick, which collects information about web-surfers viewing its ads on the Internet - what they look at, how long they look, what they click on, where they like to go.
That information is anonymous. But DoubleClick bought a company that tracks consumers in their off-line commercial exchanges. That information was not anonymous. Suddenly, we had the prospect of DoubleClick matching up on-line and off-line behaviours, and developing detailed, highly personal profiles of hundreds of thousands, possibly millions, of people.
The public outcry was deafening. DoubleClick eventually backed away from the proposal, but not before great damage was done to its reputation - the name DoubleClick virtually became synonymous with privacy invasion. DoubleClick has worked hard since then to undo the damage. But you can bet that they wish they hadn't stumbled into that one in the first place.
Those are just two examples, but there are many, many others. And they all tell the same story:
People care about privacy. You will lose their trust if you do not respect privacy. And you're far better off to build privacy into your systems at the outset. It's a lot easier than retrofitting, after a violation of privacy principles has damaged the reputation of your company and the trust of customers.
Government is still a bit slow on the uptake, as far as building privacy into its systems.
I've seen some encouraging signs that government departments, particularly in the move to electronic provision of services - Government On-Line, as it's called - are starting to understand the problem, and taking up the challenge.
But there is still progress to be made. There's a lot of talk about respecting privacy, but when we look closer, what we see is that government has tended to focus on protecting the security and confidentiality of personal information.
Security and confidentiality sometimes get used interchangeably with privacy. That, I want to emphasize, is a mistake. They're entirely separate issues.
Privacy is our fundamental right to control information about ourselves - including the collection, use, and disclosure of that information.
Confidentiality is your obligation to protect personal information in your care, to maintain its secrecy and not misuse or wrongfully disclose it.
And security is the process of assessing and countering threats and risks to information.
It's privacy that drives the duty of confidentiality and the responsibility for security. If privacy is not respected, ensuring confidentiality and security is not enough. If you collect, use, or disclose information about someone, without their consent, you've violated their privacy. That fact doesn't change just because you ensure confidentiality and security of the information.
I still have some serious concerns about potential aspects of Government On-Line.
One of the visions of Government On-Line involves elimination of the walls between agencies and programs, within government and across levels of government.
That may sound wonderful, if you believe stories of how Canadians have to contact a lot of different government departments in order to get a single service.
But those walls are also walls between collections of personal information.
If government becomes a single, centralized body, the most profound impact will come from the merging of databases.
This is information about individuals and/or their interactions with government that's been collected for specific uses.
When it's held in separate databases specifically for those purposes - "silos," as they're called - the information is compartmentalized.
When the walls of those silos come down, two things can happen. One is that someone with a need to know only one piece of information can have access to lots more. The people processing your application for a CPP disability pension have a need to know your personal health information. No other government official needs to, or should.
That's one problem. The other is that information can be combined, to reveal new information. This can lead to profiles of individuals. That's a hot subject, whether it's undertaken by commercial organizations or by government. You'll recall what I said about DoubleClick and about the HRDC Longitudinal Labour Force File.
Profiling of citizens is the hallmark of surveillance societies. The building of dossiers on individuals, tracking their activities and their interaction with government, has no place in an open, democratic society. It is the end of anonymity. It is the end of our right to go about our lawful, peaceable business unmonitored. It is the end of the right to be let alone.
Sometimes there is justification for matching personal information from different sources. Both the Privacy Act and the new private sector law allow it in certain exceptional circumstances. But those circumstances are strictly limited and they have to be justified.
Separate databases are a built-in protection against unrelated uses and against profiling. The advantages of this can be lost when databases are merged - unless you take steps to build in protections.
So this is one of our biggest concerns about the Government On-Line initiative.
Another problem is client authentication.
Government has concerns about being able to identify who it is dealing with, particularly when people are actually accessing benefits and programs electronically. This has got them looking at identifiers, "e-identities," and smart cards.
Authentication mechanisms are necessary for a networked economy, but they're fraught with problems of which we need to be aware, right at the outset.
Smart cards, for example, have the capacity to store or access large amounts of personal information, relating to different government programs and services. If they're designed right - for example, if you have different cards for different purposes - they can protect privacy.
But a single card that holds all the information about our interactions with government would raise the problem of combined databases to a whole new level. It would accelerate the centralization and sharing of personal information. If all of an individual's transactions occurred through, or were recorded at, the same source, government would have a powerful centre of data on all citizens.
The issuing, revoking, or withholding of such a card could be used to control social behaviour, limit an individual's activities, or punish unrelated activities.
These are problems we see in the Government On-Line initiative. We've impressed upon government how important it is to get this right at an early stage.
And again, I want to impress the same point on you: privacy needs to be built into your systems at the outset. It can't be an afterthought. It can't be handled after the fact. You can't say, we'll cross that bridge when we come to it. Or worse: if someone complains we'll look at it.
Lest there be any confusion, I want to stress here that I am not talking about so-called Privacy Enhancing Technologies.
We've seen with electronic commerce, and particularly with the Government On-Line initiative, that there are technological means of protecting confidentiality and security - for example, developing a secure channel and a Public Key Infrastructure for encrypted communications. None of these technological measures is better than the weakest link in the system - usually a fallible human being - so I think it's important to be sceptical of the broad claims made for them. That said, they work reasonably well for protecting confidentiality and security.
But as I emphasized a moment ago, protecting confidentiality and security is not the same as protecting privacy.
There have been some claims made about using technology to protect privacy. An industry has grown up around Privacy Enhancing Technologies-things like anonymizers, encryption, user-controlled smart cards. The basic model is that individual citizens, consumers, are responsible for protecting themselves from any privacy invasion by government or commercial enterprises.
I don't have a whole lot of enthusiasm for technological solutions.
I know the arguments for them. People say that technology develops faster than legislation can keep up. They say that legislation just creates more surveillance and control, the very antithesis of privacy.
Whether or not you agree with those sentiments, to my mind the bigger problem is this. No matter how well things like encryption and anonymization work, and how easy they become to use, these solutions are basically a kind of "technological opt-out."
If you've followed the debates over privacy in the commercial world, you'll be familiar with the opt-out/opt-in distinction.
In an opt-out system, someone who wants to collect, use, or disclose our personal information gives us the option to say we don't want them to.
In the off-line world, it's often a matter of calling or writing - for example, to the Canadian Marketing Association - to get yourself onto a "Do Not Mail/Do Not Call" list. Or it may be as simple as checking off a box to decline their kind offer of sending us information about new services from time to time.
If we don't take them up on this offer to opt out, they proceed as though they have our consent.
Most privacy advocates, myself included, consider opt-out to be pretty poor privacy.
Consent is a fundamental principle of privacy protection, maybe the fundamental principle.
Opt-out is basically a very weak form of consent - you are presumed to consent unless you indicate otherwise.
I share the view that this puts the responsibility on the wrong party. Someone wanting to collect, use, or disclose your personal information should be required to get your active consent - to invite you to opt in.
Opt-out is one of those things that works better in theory than in practice. It assumes that you know there's something going on, that you have the right to opt out, and you know how to.
It also assumes that opting out is in fact a valid, realistic option.
Those assumptions might work for the informed, patient, literate, aware consumer advocate. They don't work well as the basis for protecting the privacy of all of the people, all of the time.
Opt-out simply doesn't extend the privacy net as widely as opt-in - which, of course, is why so many marketers and information-collectors prefer it.
Encryption and anonymization, and privacy-enhancing technologies generally, take all these shortcomings of opt-out in the off-line world and add a whole new layer of problems.
Now it's not enough that you know what's going on, and that you know you can opt out if you want. Now you have to have more than just literacy, patience, and determination. You need a certain level - in some cases a high level - of technological sophistication.
And you have to have money. The makers of privacy-enhancing technologies are not in business for their health.
I will say it again: privacy is a fundamental human right. This is not, in my view, the way to ensure the respect and protection of a fundamental right.
Anonymization and encryption and privacy-enhancing technologies in general may be useful and important, but they are still technological opt-out.
And my position is that, in cyberspace as in the real world, we shouldn't have to opt out. The default setting should be that our privacy is respected.
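For the systems designers in the audience, that "default setting" can be made concrete. The sketch below is purely illustrative - the class and purpose names are invented, and this is one possible design, not anything prescribed by the law or by this speech. The point is simply that in an opt-in system, the answer to any use of personal information is "no" until the individual actively says "yes":

```python
# Illustrative sketch of an opt-in consent default. All names are invented.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Opt-in: the set of permitted purposes starts out empty.
    granted_purposes: set = field(default_factory=set)

    def grant(self, purpose: str) -> None:
        """The individual actively consents to one specific purpose."""
        self.granted_purposes.add(purpose)

    def may_use(self, purpose: str) -> bool:
        # The default answer is "no": privacy is respected unless
        # the individual has indicated otherwise.
        return purpose in self.granted_purposes

record = ConsentRecord()
assert not record.may_use("marketing")      # nothing granted yet: denied
record.grant("service-delivery")
assert record.may_use("service-delivery")   # only the granted purpose passes
```

An opt-out system would be the same class with the test inverted: every purpose permitted unless the individual had tracked it down and revoked it.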
Technological opt-out exists so that individuals can protect themselves from privacy-invasive systems.
I'm saying, don't build privacy-invasive systems.
Respect for privacy has to be as fundamental to the design of your systems as hardware and software. Privacy principles, as set out in the new private sector law, should be the touchstone for design of information systems.
I mentioned these principles earlier. I'm not going to go into detail here about them. They're widely known and widely accepted.
But let me just outline a couple of them. Information collected for one purpose should not be available for use for unrelated ones. Information should be kept only as long as it's needed for the purpose the individual consented to. Access to personal information should be limited to those with a need to know, for authorized purposes. Most importantly, collection, use, and disclosure of personal information have to be limited to purposes that a reasonable person would consider appropriate.
That's what you need to consider the starting-point for your information system.
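As one hedged sketch of how such principles might be enforced at the data-access layer - the class, field, and purpose names here are all assumptions for illustration, not anything drawn from the legislation - a record can refuse to release data for any purpose other than the one consented to, and refuse once its retention period has passed:

```python
# Illustrative sketch: purpose limitation and retention limits enforced
# at the point of access. Names and structure are invented for this example.
from datetime import datetime, timedelta

class PersonalRecord:
    def __init__(self, data, purpose, consented_at, retention_days):
        self.data = data
        self.purpose = purpose            # the purpose the individual consented to
        self.consented_at = consented_at
        self.retention = timedelta(days=retention_days)

    def access(self, requester_purpose, now=None):
        now = now or datetime.now()
        # Purpose limitation: only the consented purpose may read the data.
        if requester_purpose != self.purpose:
            raise PermissionError("purpose not consented to")
        # Retention limit: the data expires once it is no longer needed.
        if now - self.consented_at > self.retention:
            raise PermissionError("retention period expired")
        return self.data
```

A request for an unrelated purpose, or one made after the retention period, raises an error instead of quietly returning the data - the "silo wall" is part of the code path, not an afterthought.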
The way to ensure that privacy is built in, when you're designing a system, is to do a Privacy Impact Assessment.
This allows you to forecast a proposed system's impacts on privacy, assess its compliance with legislation and principles, and determine what's required to fix any problems there may be. It helps you avoid the costs, adverse publicity, and loss of credibility and public confidence that could result from a system that hurts privacy.
You need to do Privacy Impact Assessments at the earliest point in your projects. Try to get an impartial review of them. Get someone who knows privacy to have a look at them. Let them give you a heads-up if there are obvious shortcomings. In my Office, we're looking at setting up a process for government departments where we can review at least some of their Privacy Impact Assessments and offer comments.
You might be wondering what a privacy-friendly information system looks like. To give you an idea, let me briefly describe two things we've proposed for the Government On-Line project.
You'll recall that our principal concern is the development of unified, centralized databases. That's a concern that we also have about the private sector.
The architecture of the system has to incorporate privacy principles. And I don't think that you can respect these privacy principles without retaining some walls between banks of data.
We've pointed out to the government that those horror stories about having to contact a couple of dozen different government departments to get a service are just that: stories.
What single service do most Canadians need to contact multiple departments for? They contact the Department of Human Resources Development for unemployment information. They contact the Canada Customs and Revenue Agency about their taxes. They contact the Department of Citizenship and Immigration when they want to renew their passport.
If there really are services where they have to contact multiple departments, is it a good idea to reduce the number? Of course. But do Canadians really need to have only one password for all of government, and a seamless interface between different levels of government? Quite frankly, I don't think so.
What we've said to government is, you have to water your wine. One-stop shopping for government services can't be done without gravely violating privacy principles.
This is not the kiss of death for the Government On-Line initiative. Government departments can still reap the key benefits of these information systems, while maintaining information in separate silos and protecting the privacy of citizens.
On the question of client authentication, identification, and smart cards, we've advised government that, when building these systems, they should ensure that the default setting is anonymity. I think the same thing applies to building information systems in the private sector.
Where people can be anonymous, allow it. If there's no reason that is beneficial to the individual for one organization to know what he or she did with another organization, have the system keep it separate.
When Canadians go looking for general information about government programs, as opposed to actually having a transaction with government, no one in government needs to know who they are. We've reminded government systems designers of this. We've told them, where identification isn't an issue, don't make it one.
And I would say the same to you: don't make it an issue.
Construct your system around the way people would want to conduct their lives. Let them reveal information about themselves where they need to, as much as they need to and no more.
We've also told the government, if you're going to use smart cards, build privacy into them. Segment information. Use multiple cards. Put the control of the information on the card into the hands of the individual whose information it is. Otherwise, just don't do it.
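One way to picture that segmented, holder-controlled card - again, purely an illustrative sketch with invented names, not a description of any actual card design - is a set of compartments, one per program, where nothing leaves the card without the cardholder's approval and a reader for one program can never see another program's segment:

```python
# Illustrative sketch of a "segmented" smart card. All names are invented.
class SegmentedCard:
    def __init__(self):
        self._segments = {}   # program name -> data, kept in separate silos

    def store(self, program, data):
        """Each program writes only into its own compartment."""
        self._segments[program] = data

    def release(self, program, holder_approves):
        # The cardholder, not the card reader, controls disclosure.
        if not holder_approves:
            raise PermissionError("cardholder declined disclosure")
        # A reader asking for one program's segment sees only that segment.
        return self._segments.get(program)
```

The design choice is the one in the speech: control of the information on the card stays in the hands of the individual whose information it is.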
Again, that same advice goes for the private sector.
Some people might argue that some of what we suggest runs counter to the efficiency that systems designers are supposed to seek. The whole rationale for electronic government, and for information systems generally, is that they reduce inefficiency.
So what's the story with these privacy advocates, saying that you should retain walls between banks of data; use multiple cards; accept authentication and identification that's less-than-perfect, or, even better, don't identify people at all? You may be asking, what are you up to, Commissioner: selling inefficiency?
No, I'm not.
"Efficiency" refers to a relation between ends and means. It refers to choosing the best use of resources to achieve defined goals.
What's critical is how we define the goals.
And what I'm saying to you is that respecting the fundamental human right of privacy is as much a part of your goal as the bottom line of the ledger. In fact, it's a key part of the bottom line.
So build it into your system.
Respect that fundamental human right of privacy, and you'll win the trust and respect of customers. You'll win and keep their business.
I said earlier that we are at a crossroads, that how we confront the threats to privacy will determine what kind of society we leave for our children and grandchildren.
But you know, the greatest threats to privacy seldom come from those who want to do harm.
They come from well-intentioned people who say that privacy needs to be sacrificed for some greater good - customer service, prevention of crime, efficiency.
I believe that it is possible to run a business or a country, and provide services efficiently and conveniently, without sacrificing privacy.
If you recognize how important privacy is to our society and to our freedom - if you recognize how concerned Canadians are about it, and build your system accordingly - you'll have Canadians behind you all the way.
I firmly believe that protecting privacy, and winning the trust of Canadians, is the way to ensure successful information systems design. My Office is here to provide advice and support in helping you build privacy-friendly systems. I'm looking forward to working with you.