Engineering privacy


Carleton University

March 25, 2003
Ottawa, Ontario

Julien Delisle
Executive Director


Having someone from the Office of the Privacy Commissioner address a group of engineering students may seem like an odd match. Engineering has been aptly called "the art or science of making practical." As future engineers, you're interested in applied science: bringing the benefits of hard science to the real world.

Privacy, by contrast, is often viewed as hardly practical at all: not far removed from philosophical speculation.

But my view is that your field and mine are in fact closely linked. And that's not just because I actually began my academic life studying to be an engineer.

Engineering is not just practical; it's also a philosophical enterprise. Scientific curiosity is one way that we contemplate our world. When we think like scientists, we are wondering about the world and how it works. When we think like engineers, we are wondering about how the world could work better. Engineering is a way of physically making sense of the world, and trying to make it a better place.

Privacy, on the other hand, is not just philosophical. It's practical. We think about privacy-and at the Office of the Privacy Commissioner we think about it, talk about it, write about it-because it's at the heart of a way of life based on respect for individual autonomy and freedom. We want to promote and strengthen it, to make the world a better place.

The British educator Sir Eric Ashby described engineers as the key figures in the material progress of the world. He said that engineering "makes a reality of the potential value of science by translating scientific knowledge into tools, resources, energy and labour to bring them into the service of man." But he noted that, to make this contribution, engineers need "the imagination to visualize the needs of society and to appreciate what is possible." And they need not just technological understanding, but what he called "broad social understanding," to bring their visions to reality.

I want to put before you today the idea that privacy is an absolutely crucial part of that "broad social understanding." Ultimately, you as engineers are not going to be able to accomplish the things that interest you without considering their privacy implications, and building privacy in, as part of the engineering process.

So I want to talk to you about the meaning and importance of privacy, about the role of the Office of the Privacy Commissioner, and about building privacy into your projects. I'll focus my examples on information systems, but the same thinking applies broadly to the work of engineers.

Privacy is a fundamental human right. It's a critical element of a free society. Former Justice La Forest of the Supreme Court of Canada once wrote that privacy is "at the heart of liberty in a modern state."

There's no real freedom without privacy. In fact, some people call privacy "the right from which all freedoms flow." A private sphere of thought and action, one that's your business and no one else's, is fundamental to freedom of thought, freedom of conscience, freedom of speech, and so on. As Privacy Commissioner George Radwanski often says, if you have to go through life wondering if someone is looking over your shoulder, watching your every movement, every purchase, and every human transaction, noting, judging, interpreting and maybe misinterpreting your actions-if you have to go through life like that, you're not free.

As engineers, you might want to reflect at some length on the freedom that flows from privacy-the freedom to think unconventional thoughts, to try out new things, to experiment with different approaches. You don't need me to tell you that this freedom is critical to scientific inquiry and to good engineering.

Now, what do I mean by privacy?

It's often called "the right to be let alone," and that's a good definition. It reflects people's visceral reaction to being monitored or scrutinized or bothered. That's what "invasion of privacy" means to many people.

But there's another kind of privacy invasion that's less obvious, and that's the collection, use and disclosure of information about us without our knowledge or consent.

The Privacy Commissioner of Canada, George Radwanski, defines privacy as the right to control access to one's person and information about oneself. This broader, informational concept of privacy is useful for understanding how privacy is threatened.

Until recently, privacy was protected by default. When information about us was in scattered paper records, it was a lot of trouble to compile a detailed dossier on any individual.

Unless you were famous or important, or notorious, your privacy was pretty safe. It was protected by barriers of time, distance and cost.

Those barriers are gone now, in a digital, connected world, where you leave informational fingerprints everywhere you go. Now a stranger at a computer can compile a detailed dossier on your whole life in minutes.

One way we've responded to this in Canada is to legislate to protect privacy.

The Privacy Commissioner and the staff of his Office oversee two pieces of legislation: the Privacy Act, which applies to the federal public sector, and the Personal Information Protection and Electronic Documents Act-the PIPED Act, as we call it-which applies to the private sector.

Right now the PIPED Act applies to federal works, undertakings, and businesses - chiefly banks, the telecommunications, broadcasting and transportation sectors - as well as to the sale of personal information across provincial or national borders.

By 2004, where provinces have not passed similar legislation, it will apply to all commercial activities in Canada. Eventually, all of the private sector in Canada will be required to comply either with the PIPED Act or with a similar provincial law.

As engineers, you'll be operating in this legal environment. Any system you design that involves the collection, use or disclosure of personal information, anywhere in the country, in the public and private sectors, is going to have to be built to respect the privacy principles that underlie Canada's laws.

What are those privacy principles? In a nutshell:

Personal information should be collected directly from individuals wherever possible and appropriate.

Individuals should be able to control the use and disclosure of their information by exercising or withholding consent.

Personal information collected for one purpose should not be available for use for unrelated purposes.

Information should be kept only as long as it's needed for the purpose the individual consented to.

Third-party access to personal information should be limited to those with a need to know, for authorized purposes.

Most importantly, collection, use, and disclosure of personal information should be for purposes that a reasonable person would consider appropriate.

These principles should be the starting-point for any system that uses personal information.
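To make the point concrete for system designers, the principles above can be expressed as checks that run every time personal information is touched. Here is a minimal sketch in Python, with hypothetical names of my own invention (no real system's API), showing purpose limitation and limited retention, assuming each record carries the purpose the individual consented to and a retention period:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PersonalRecord:
    subject: str
    data: dict
    consented_purpose: str   # the purpose the individual consented to
    collected_at: datetime
    retention: timedelta     # kept only as long as needed for that purpose

def use_record(record: PersonalRecord, purpose: str, now: datetime) -> dict:
    """Release data only for the consented purpose, and only within the retention period."""
    if purpose != record.consented_purpose:
        # unrelated use: not covered by the individual's consent
        raise PermissionError("use not covered by consent")
    if now > record.collected_at + record.retention:
        # retention expired: the record should have been disposed of
        raise PermissionError("retention period expired")
    return record.data
```

The point of the sketch is that the principles become enforceable code paths rather than policy statements: an unrelated use or an out-of-date record fails at the access layer, not in an after-the-fact audit.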

I want to turn now to the distinction between privacy and a couple of other concepts that people sometimes confuse with it-security and confidentiality. Though these terms tend to be used interchangeably with privacy, they're very different.

Privacy is our fundamental right to control information about ourselves-including the collection, use, and disclosure of that information.

Confidentiality is our obligation to protect personal information in our care, to maintain its secrecy and not misuse or wrongfully disclose it.

Security is the process of assessing and countering threats and risks to information.

The key point to remember is this: if privacy is not respected, ensuring confidentiality and security is not enough. If you collect, use, or disclose information about someone, without their consent, you've violated their privacy. Ensuring confidentiality and security of the information doesn't change that.

A concrete example makes this clearer. Look at the Advanced Passenger Information/Passenger Name Record database-API/PNR for short-that Canada Customs and Revenue Agency is assembling.

This is a database of personal information about every passenger flying into Canada from a foreign destination. It includes more than 30 data elements, such as every destination the person has been to, who they've travelled with, how they paid for their ticket, even their dietary preferences and health-related requirements.

This information is not collected about specific people suspected of involvement in criminal or national security issues. It's collected about every passenger, regardless of the fact that they are law-abiding, regardless of the fact that they've done nothing to incur suspicion.

I would guess that, to most of you, this comes as a bit of a surprise-that, any time you take an international flight, a dossier about you will be provided to the CCRA. I think you'd be even more surprised to learn that this information about you can be shared with police forces, the Department of Human Resources Development, Citizenship and Immigration Canada, the Financial Transactions and Reports Analysis Centre, and any person legally entitled to the information by reason of an Act of Parliament. It can be disclosed for the purposes of administering or enforcing, among others, the Customs Act, the Employment Insurance Act, the Income Tax Act, the Special Import Measures Act or the Proceeds of Crime (Money Laundering) Act. It can in fact be shared with anyone, if the Minister believes that it's in the public interest to do so.

It gets worse, too. The CCRA intends to retain all of this information for six years from the date of its collection. So, if you fly to Paris for a holiday, all of the information about you and your trip will be retained by a government institution for six years, regardless of whether they do anything at all with the information, regardless of whether your trip in any way puts you on any of the radar screens, regardless of whether you clear all the hurdles of scrutiny under the laws governing security, income tax, money laundering, or employment insurance.

Now, there's no problem with the confidentiality of this information. It's not going to be broadcast far and wide; access to it is going to be limited to those who are authorized.

And security is not an issue. We're satisfied that it will be securely held and that the technological and human resources to protect it are adequate.

But none of that changes the fact that this is an unjustified collection and use of personal information in the first place. In other words, the problem is that privacy will be violated. Assuring confidentiality and security doesn't change that.

Now let me give you an idea of the kind of privacy concerns that can arise in the design of information systems. Here's some of what we've observed with the federal Government On-Line initiative.

Government On-Line proposes the elimination of walls between agencies and programs, within government and across levels of government.

That sounds progressive and efficient.

But those walls are also walls between collections of personal information about individuals and their interactions with government.

That information has been collected for specific purposes. When it's held in separate databases or "silos" specifically for those purposes, it's segregated.

When the walls of those silos come down, two things can happen. One is that someone with a need to know only one piece of information has access to lots more. The people processing my application for a CPP disability pension, for example, have a need to know my personal health information. No other government official needs to, or should.

That's one problem. The other is that information can be combined, to reveal new information and create profiles of individuals.

Profiling is the hallmark of surveillance societies. Dossiers on individuals, tracking their activities and their interaction with government, have no place in an open, democratic society.

Sometimes there's justification for matching personal information from different sources. Both the Privacy Act and the PIPED Act allow it in certain circumstances. But those circumstances are strictly limited, and they have to be justified.

Separate databases are a built-in protection against unrelated uses and against profiling. The advantages of this can be lost when databases are merged-unless you take steps to build in protections.
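One way to build in that protection when silos are merged is field-level access control tied to the requesting program: each program sees only the fields it needs to know, as in the CPP disability example above. A hypothetical sketch (program names and fields invented for illustration):

```python
# Hypothetical "need to know" filter over a merged store: each program role
# is mapped to the only fields it is entitled to see.
NEED_TO_KNOW = {
    "pension_disability": {"name", "health_info"},  # health info only here
    "tax": {"name", "income"},
}

def fetch(record: dict, role: str) -> dict:
    """Return only the fields the requesting program needs to know."""
    allowed = NEED_TO_KNOW.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}
```

The merged database still exists, but no single official can pull a full profile: the segregation the silo walls once provided is re-created in the access layer.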

Another of our concerns is client authentication.

Governments often need to identify who they're dealing with, particularly when people are accessing benefits and programs electronically. So they've been looking at "e-identities" and smart cards.

Authentication mechanisms are fraught with privacy problems.

Smart cards, for example, have the capacity to store or access large amounts of personal information, relating to different programs and services. If they're designed right, they can protect privacy.

But a single card that holds all the information about our interactions with government would accelerate the centralization and sharing of personal information, and raise the problem of combined databases to a whole new level.

So these are a couple of our concerns. How would you address them?

I can't give you a prescriptive answer. That's going to be your area of expertise as engineers, not mine. I will simply say that you need to build privacy into your systems at the outset. It can't be an afterthought. You can't say, we'll cross that bridge when we come to it, or if someone complains. Privacy principles have to be as fundamental to the design of information systems as hardware and software.

If you're designing a privacy-friendly information system for a client, a good starting point is to assess the client's existing personal information practices and systems with the reasonable person test-asking whether personal information is being collected, used, or disclosed for purposes that a reasonable person would consider appropriate in the circumstances.

The responsibility for this personal information extends "from end to end." From the moment personal information first touches your client's hands, to the moment it's properly disposed of, the person's privacy has to be respected. That responsibility doesn't just start when the information is secured in a database, or end when the information is passed on to another party.

You need to use the guideline of "who needs to know what." People should be able to reveal information about themselves where they need to, as much as they need to-and no more.

Above all, it's crucial to remember the importance of knowledge and consent. People should know what information is being collected about them, and how it's being used and disclosed. And they should have the opportunity to exercise control over that collection, use, and disclosure, through the power of consent.
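Knowledge and consent can likewise be made explicit in a design rather than left to policy. A minimal, hypothetical consent registry, in which individuals grant and withdraw consent for each use of their information:

```python
class ConsentRegistry:
    """Tracks, per individual, which uses of their information are consented to."""

    def __init__(self):
        self._grants = {}  # subject -> set of consented uses

    def grant(self, subject: str, use: str) -> None:
        self._grants.setdefault(subject, set()).add(use)

    def withdraw(self, subject: str, use: str) -> None:
        # consent can be revoked at any time; later uses must stop
        self._grants.get(subject, set()).discard(use)

    def permitted(self, subject: str, use: str) -> bool:
        # default is no consent: absence of a grant means no use
        return use in self._grants.get(subject, set())
```

The design choice worth noting is the default: absent an explicit grant, nothing is permitted, which mirrors the principle that collection, use, and disclosure require the individual's consent.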

So these are some of the considerations you would need to look at in designing privacy-friendly information systems. Again, the key is to build privacy in at the outset.

One way to ensure that privacy is built in, when you're designing a system, is to do a Privacy Impact Assessment.

This is something that the federal government has made mandatory for all its new or redesigned information systems, including Government On-Line systems. A similar kind of process, on a smaller scale, can be adopted by any organization.

What it means, simply, is analyzing the likely impacts on privacy of a project, practice, or system. It involves looking at all the personal information practices that go into it, such as what kinds of personal information are collected, how consent is obtained, how and for how long the information is kept, how it's used, and to whom it's disclosed.

It means looking at things like the purposes for collection, use, and disclosure of personal information, the authority for them, what kinds of linkages there will be between this and other information, and how individuals will be able to exercise their right of access to their information. And of course it requires assessing how the system complies with privacy legislation and principles.

Some of the kinds of questions you would pose in a privacy impact assessment might be:

Will the system limit access on a need-to-know basis?

Will it be possible to make inferences about an individual, based on matching disparate bits of information?

Will the system effectively amount to surveillance? Will it track customers' or employees' activities, or at least facilitate tracking? If so, is that justified?

Besides questions about possible violations of privacy, there are questions about the resources to deal with them-such as whether an accountability structure is in place. Is there somewhere in the organization that individuals can go if they have a privacy problem? Can people get timely access to their own personal information? Is there someone in the organization who understands privacy, who can advise them? And can that person swing some weight in the organization, and ensure respect for privacy?

These are not strictly speaking engineering questions, of course. But privacy is so critical to the engineering enterprise that you've got to ensure that they're looked at, even if it's not you doing the looking.

And of course, those of you who go on to run your own engineering firms, particularly if you're employing other people, will have to pay special attention to privacy. Doing this kind of assessment, in addition to identifying the privacy landmines on the road ahead, does something else as well: it enables you to sensitize the people in your organization to privacy issues. It can help you to create an organizational culture of respect for privacy, where everyone supports and understands privacy as part of the corporate goal.

And it's a very useful tool for monitoring information systems over time. You've identified privacy risks, you can see whether your means of addressing them are working, and you're alert to new, unforeseen ones as they come up.

I might add that one of the interesting things we've noted, in the early stages of the implementation of the Privacy Impact Assessment Policy, is that this process is allowing project designers and managers to catch more than the impact on privacy. It's given them a valuable second look at their proposed projects-and sometimes that's sending them back to the drawing board, as they realize that they've overlooked something crucial. In short, this is a useful management tool for more reasons than just privacy.

Even with this advantage, some people-maybe even some people in this room-will look at this process of Privacy Impact Assessment, and throw their hands up. They'll be appalled. They'll say it's contrary to one of the principal objectives of engineers and systems designers-efficiency. It gets in the way. It complicates things.

So am I selling inefficiency? No, I'm not.

"Efficiency" means choosing the best use of resources to achieve defined goals.

What's critical is how we define the goals.

As engineers, your goal is to understand the world and build a better one. Strengthening and protecting privacy has to be integral to that. The fundamental human right of privacy has to be a part of your goal.

Protecting privacy is part of what defines a successful information systems design, and successful engineering in general.

So respect that fundamental human right of privacy. Build it into your systems and projects. Never stop believing that it's possible to run a business or a country, and provide services efficiently and conveniently, without sacrificing privacy.

As former Prime Minister R. B. Bennett once said, "That long [Canadian] frontier from the Atlantic to the Pacific Oceans, guarded only by neighbourly respect and honourable obligations, is an example to every country and a pattern for the future of the world."

That should not change in our digital world.
