Stormy Weather - Conflicting Forces in the Information Society

18th International Privacy and Data Protection Conference

September 19, 1996

Ursula Franklin

Introduction: (Bruce Phillips, Privacy Commissioner of Canada)

Welcome to the closing session of the 18th conference, and if I said I was saving the best for the last, it would certainly be no disrespect to all of the people who have come before to this podium in the last couple of days.

Ursula Franklin is the icing on the cake, as far as I am concerned. Your next speaker is a truly remarkable person. Doctor Franklin is one of those persons with a scientific background who has long since reached out with an exceedingly interesting and spacious mind, to embrace in her thinking the whole of society and its relationship to science, and she is well known in this country for the depth of her thinking on these subjects, her wisdom, her wit. I cannot say enough about her.

She received her doctorate in experimental physics at the Technical University in Berlin. She has produced more than 70 scholarly papers and made major contributions to books on the structure and properties of metals and alloys, and on the history and social impact of technology, which gives you some insight into the reach of her mind.

She is a Companion of the Order of Canada, which is among the highest of the awards given by the Government of Canada to its most distinguished citizens. She is a Fellow of the Royal Society of Canada. She has received honorary degrees from Canadian and other universities too numerous to mention. In 1984, she became the first woman to be honoured with the title of university professor by the University of Toronto.

In addition to all of her works and significant contributions as a scientist, Dr. Franklin is known, and known best among her fellow Canadians, for her achievements as a humanitarian through her community activities.

In 1987, she was awarded the Elsie Gregory MacGill Memorial Award for her contributions to education, science and technology. Two years later she received the Wiegand Award, which recognizes Canadians who have made outstanding contributions to our understanding of the human dimensions of science and technology, which of course is precisely the subject that has engaged our attention here for the last two days. In 1990 she received the Order of Ontario.

That brief recitation, dry as it sounds, does not in any way come close to describing to you what Doctor Franklin means to her fellow Canadians, and I take the greatest of pleasure in asking her now to give us the last words to this conference.

Ursula Franklin (Transcription)

Thank you for that very kind build-up. I trust that the members and commissioners do know that our friend, Bruce Phillips, started his professional life as a journalist, and so will take this introduction with a grain of salt.

When Bruce asked me whether I would appear to sum up and put into a slightly larger context the deliberations of these two days, I very gladly accepted. Although, as the days went on, I found it more and more difficult to think what would be left for me to say.

Essentially, what I would like to do in this summary is step back a little bit from the details of the issues. As somebody who is not, in one sense, as knowledgeable as you are, I would like to look at it a little bit from both a historic and a systemic perspective. I think the work you are doing is extraordinarily important, not only as it relates to privacy and data protection, but as it relates to hammering out how people will and can live in a technological society.

Whether you are particularly happy about it or not, you are the breakwater of the new tide. It will be on your souls, on your actions, and on your deliberations, that people will measure how and whether it is possible to have a humane society, in the real sense of humanity, in the presence of a technological infrastructure that to many citizens seems overwhelming. So, it is with utmost respect that I address you. I don't envy the tasks that are ahead of you.

The reason I called this summary Stormy Weather is that I think there is turbulence ahead. What I want to do is outline the conflicting forces that produce this turbulence and, if at all possible, to help clarify it. Only you can navigate, but if others give you a little corner of their map, it might be of importance.

Now, the map that I have is personal, it is subjective, it is biased. I have a standpoint from which I look at the world. I am a Canadian. I am a scientist. But I am also an unrepentant feminist and an unrepentant pacifist. So, from where I stand, certain things look big and are in the foreground, which may not be big and in the foreground for you.

I think what we are all discussing are political issues. They are political in the best sense of the word, in the original Greek sense of the word, in that they affect the community, the very citizens who have to work and live together. When all the technology is disposed of, when we have understood or put aside all the details, what is left are the issues of how people live together. These political issues have existed ever since people have lived together and were articulate about their relationships.

One concept that has been underlying your deliberations, although it was not articulated as much as I initially thought it would be, is the concept of justice, because justice is one of the classic four cardinal virtues, justitia. The word "cardinal", of course, comes from cardo, or "hinge", and it is justice that is the hinge upon which a civilized society hangs. If that hinge breaks, if that hinge does not keep the structure together, then everything else becomes threatened or at risk. So it is, in my opinion, essentially justice that we are talking about, and the conflicting forces that I perceive really relate to how one can utilize technology without compromising justice.

Now let me, in spite of all the talk about technology, define technology for a moment. To me, it is important to understand that technology is practice, it is the way we do things around here. This definition takes machines and devices into account, as well as social structures, command, control, and infrastructures. It is helpful for me to remember that technology is practice. Technology, as a practice, means not only that new tools change, but also that we can change the practice. If we have the political will to do so, we can set certain tools aside, just as the world has set slavery and other tools aside. It is also the nature of modern technology that it is a system. One cannot change one thing without changing or affecting many others.

The typical characteristic of modern technology is its systems character. It is like a tight weave. When you pull one thread, all other threads move. So an exercise of replacing and pulling threads has to be done in the context of knowing that if one pulls one, the others will be affected. This may be good. This may be bad. But it is there. And, for us, I think this is technology in its total embeddedness, in both the economic and in the political systems.

I will now describe to you the forces I see at play at the moment. I will use a very simplified, and you might think simplistic, way of showing this. I do this to be able, in a short period of time, to highlight both "contrasts" and "comparisons". It is as if you have a photograph and, instead of printing all the depth of its shades, you deliberately choose to overprint it into a very harsh contrast. It is that contrast (and I am not so naive as to think the world divides itself so neatly that there are no grey areas) that I have the privilege of highlighting here.

I think when you discuss privacy and data protection, as you have done, there are two poles that emerge fairly quickly: does one primarily protect people, or does one protect data? Out of that polarity arise two essential climates or models which I have perceived from the discussion of your task.

There is the climate and the model of human rights that Peter Hustinx so beautifully described. All the notions of privacy can trace back their origin and validity primarily to considerations of human rights. When human rights informs the language in which the discussion amongst you, the general public, and Parliament takes place, then you speak, rightfully, about citizens and all that comes with that.

On the other hand, if the emphasis is primarily on the protection of data, one looks at a market model, one looks at an economic model, and all the things you have heard about the new economy. Then, it is the language of the market that informs your discourse. You and everybody who speaks with you, speak about consumers, about providers, about service.

While those who primarily locate themselves in the human rights climate speak about citizens, about the relationship between groups and power, those who use the market language speak primarily about stakeholders. And when one speaks about rights and obligations, the other speaks about binding contracts. When, out of the human rights climate, one derives instruments which require independent supervision, those in the market climate speak about functional analysis and choices. When one speaks about regulation and about the roots, and the moral and legal justification of law and regulation, the people using the economic model, the market language, talk about monitoring and voluntary guides.

When the human rights approach looks at infrastructures appropriate for enforcement, the people with the market mentality and language think about correcting market failures. When health and welfare data, for example, get into the wrong hands, our friends speak about correcting market failures, when I would speak about infringements of human rights. There are differences. Indeed, these are the conflicting forces that are shaping the stormy and turbulent weather that is ahead of you.

There are also different views of knowledge. Data processing is a field in which knowledge, both existing and reprocessed, and new knowledge, enter the public arena. The basic question for some (and I count myself among them) is: Is knowledge a common good? For others, knowledge is a private and institutional product. Out of those different views has to come an approach to data protection and privacy.

I think it might be helpful to realize that those two models have, of themselves, a somewhat different mental image. Those who deal primarily in the language and the forces of the market see the world as becoming more and more a transparent, interlinked production site. Those of us who primarily come from, and are nourished in, the tradition of human rights and justice have a view of the world that hopefully makes the world more and more like a garden in which we all can walk, and in which we all have to be vigilant about the weeds, the plants, and the behavior of all those who use the garden for food, living, habitat and recreation.

So, in this extreme situation of contrasts, you deal with the difference between regulating a garden or regulating a production site.

Of course, nothing is that stark, but I perceive these as some of the forces into which you will walk once you leave here. Now, where is the bridge? Where is the zone where those activities of watching over the garden and watching over the production site meet? That is where you will have to be. You cannot afford to retire into the garden or take early retirement from the production site.

I think it is, in fact, the concept of "adequacy" that will provide the bridge, as well as the discourse between the two languages. It seems very clear that, regardless of where people stand, one of the greatest needs is to assess the adequacy of protection measures. There is very little question of the need for privacy or data protection. However, there are some very different views of the context and the purpose. Where they meet is with the concept of adequacy, the assessment of what in different situations is considered adequate. This is where the discourse is taking place, and will take place, because here there comes the possibility of talking about what is adequate protection for the citizens, and what is adequate protection for the data collectors, for the data miners, for those who have stored data.

In a sense, adequacy can only be assessed by those who, while they are not necessarily at home in both, have that Rosetta Stone of being able to speak both languages and translate one into the other for their own constituency. I think, therefore, the question for the data warehousers, for the data miners, and the data gatherers, will be the same question the people who come out of the citizens' end will ask.

But it will also be very necessary to look not only at the protection of individuals, to which I think each and every one of you is superbly attuned, but also at the sectors among the population, sectors of society, that will need your protection of their privacy vis-à-vis the new technologies, the data mining.

The protection of the privacy of the powerful, whether it is the knowledge that the Minister has four children or the privacy of the Royal Family, seems relatively trivial compared to the need to protect the powerless. I think the "poor" have been extraordinarily invaded by data gatherers, social scientists, and by people who make policy and say we want to know all about the poor. I am always inclined to say to them: why don't you try to know all about the rich? The poor are poor very often because the rich are rich. It may be so much easier to know a great deal about the rich, as a first step to eliminating poverty.

I think, in terms of data collecting and data mining, there needs to be respect for the privacy of the powerless as a group. The poor, the single mothers, the elderly, the Alzheimer patients, require a lot of protection. And they may need that protection, not so much in the sense that an individual might need it because their information can be misused, but in terms of the social image. The misuse, collectively, of information relating to the habits of the poor, the habits of the single mothers, is so much more likely to influence social policy than the unresearched habits of the powerful.

I would like to leave that point with you because I think we are entering a phase in which that protection is particularly necessary because we see, in the economic model, the fact that the misery of some becomes the livelihood of others, and that you will never get rid of poverty. We have all heard very persuasive arguments about the invasiveness of collecting data on drug users.

If one doesn't ask the question, "What are the data going to be used for?", the powerless cannot defend themselves. They will have to rely on you to say there is a limit to data mining, not only to respect the individual, but also to respect those designated social groups who are the villains of the month. In our province, it is the welfare cheats; in other provinces, it is the unemployed who don't want work. It is a great difficulty that data gathering allows a collective image of the powerless to form. You may have to think as much about how you can provide sectoral privacy as you think about privacy for the individual, of which you are so keenly aware.

Now, in conclusion, let me just bring a little bit of history into this. While we are very much inclined to say all these problems are new and nobody has ever had to cope with them, that is only partly correct. There are certain things in society that are always present. The presence of new tools, of new technology, not only allows different things to be done, it also allows the old things to be done differently. Some of those things may be tasks which society has decided, some time ago, ought not to be done, but the new tools can rehabilitate old and discredited tasks.

I just want to illustrate that point for you before I close. Every society, every person, classifies and sorts. Without that, we cannot cope with the information we have. A society particularly tries to classify people into groups of "us" and "them". Every mother throughout history has said: "My children can play with their children, but not with their children."

Now, we have as a society developed a very strong sensitivity, as well as legislation, that does not allow discrimination. Think of what a sorting element religion used to be: not just the big division in the great denominations, but whether you were a Scottish Presbyterian or an Irish Presbyterian may have made an awful lot of difference to your future father-in-law.

We have begun to think that, while religion is important as part of human life, it is not an ordering principle for society. We have gone through the period where it was awful for a minister of the church to marry people of different faiths. The refrain was always: "I'm so, so tolerant, but what about the children? They won't know where to go on Sunday. And you can't possibly sanction a mixed marriage because of the children." Well, there are generations of children now around who have done pretty well in spite of that. The quality of their lives depended on what consequences their parents drew from their religion, rather than where they spent Sunday morning.

We have gone through the same argument on race: "I'm so tolerant but what about the children? My grandchildren are not going to be chocolate-coloured. Nobody will play with them." We have gone through that with class: "I won't let my children play with somebody who works in some disreputable factory." So, we have eliminated a number of sorting agents that society has used in the past to protect us and them. In fact, in many jurisdictions you cannot ask about religion and "race" is off the vocabulary.

Now comes biometrics and genetic engineering. Now you have chromosomes. Now you have information on genes. And now you have the question: should you inform the family because my daughter isn't going to marry somebody whose son has that horrible gene. What about the children?

So I would like to suggest to you that there are questions in that great menu of choices you and others have, for which the only option seems to me to be to say, so what? I think that one has to be very mindful that one gets co-opted through new technologies to revisit issues of both human rights and social cohesion that have been visited and re-visited.

I would suggest to you that one need not look only at the tool but at the task. If the task is a non-issue, it may be very well to articulate that, and to go back to the historical experience of labelling people which has not worked, which will not work, and into which we need not put an enormous amount of time and energy. I would suggest to you that much of the information that is gathered may be totally irrelevant for anyone except research scientists. You may want to ask some of your friends, colleagues and staff whether you cannot say: "this is lovely information but for us it has nothing to do with anything". It might cut out a very expensive phase of testing, verification, money-making, before you find you have only generated irrelevant information.

I would hope that you might give that a little bit of thought because there is historical evidence that the vendors of the new tools will invent essential uses, as they have always done, for whatever tools they invented, and for whatever new thing they produced. It is your sense of history, as well as the knowledge of your field, that will lead you to reject redundant information.

As you go out into that somewhat stormy world, you carry with you not only a great deal of respect, and the good wishes of many people, but also something else: the expectations of citizenship. Ordinary people feel very much that their governments may be there for all sorts of things (information highways, and lots of Team Canada promotion around the world). But there is that question: what can I as a Canadian citizen rightfully expect from my government, with a Charter of Rights and Freedoms and an assurance of the security of my person, even when that security is, in fact, the security of my informational personality? The German phrase is much nicer, meine informationelle Person: me, as a bundle of information, rather than as a physical person, whose security might be guaranteed. What security can I expect my government, by virtue of my being a Canadian citizen, to extend to my informational personality?

I think the expectations, and I think the rightful expectations, of citizens do go with you in that work. People need an assurance of what it means to be a citizen of a democratic country beyond paying taxes. So you go with the assurance and the conviction that people matter. You may even feel that all people matter. And if you have a revolutionary chromosome in you, you may even think that all people matter equally.

All people do matter. And so as you go on with your work I expect and hope from you very significant contributions to what Canadians have always termed (because the phrase comes out of the British North America Act) peace, order and good government. Thank you.

(Bruce Phillips, Privacy Commissioner of Canada)

You accuse me, Ursula, of inflating my build-up in my introduction to you. Whatever my journalistic excesses may have been in the past, I think you have proved my point. I did not exaggerate.

You have left us with food for thought about our role. I am personally going to try to remember what you have told us: that we are in some sense out there on a barricade that it is terribly important we defend for the future of democratic life, because I think myself that is at the core of what we are doing.

The remark I made at the opening of this session was that what we are really concerned with here, in all its contemporary complexity, is really a very basic question. It is whether, after centuries of struggle to produce a workable civil society, we are going to be able to manage its continuance in the face of all the kinds of challenges that technology poses and finds self-justifying, and whether we will be able to continue to respect each other as individual, autonomous, free human beings.

I have never heard the issue put more squarely, more eloquently than you did this afternoon, and I cannot think of a better closing speech for a conference of this kind. We are in your debt. Thank you so much for coming.