Challenges to Privacy: What Should Keep You Up at Night

Remarks at the Newfoundland and Labrador Access & Privacy Workshop 2011
Managing the Future of Access & Privacy

May 16, 2011
St. John's, Newfoundland and Labrador

Address by Chantal Bernier
Assistant Privacy Commissioner of Canada

(Check against delivery)


Introduction

The title of my presentation may seem alarmist, but as Kurt Cobain said, just because you're paranoid doesn't mean they're not after you. I do not suggest you stay awake at night, paralyzed by fear for the fate of your or others’ personal information, but I do suggest that, as privacy professionals, our vigilance must increase as the vulnerability of personal information increases. Significant breaches in public and private organizations, in bodies as mighty and well resourced as the Treasury Board of Canada Secretariat last January and Sony last month, highlight the unprecedented nature of privacy challenges and, consequently, the need for new safeguards.

A new regime for protecting privacy is driven by imperatives of security, globalization and the explosion of online activities—in a nutshell, the new rules of the game for sharing information.

In order to meet these new challenges, the OPC has identified four strategic priorities, four areas where privacy is most at risk, to focus its work in the most effective manner.

I OPC Priorities

  • Firstly – The pressure of public safety policies on our established notions of privacy;
  • Secondly – The vulnerability of personal information held on new, untested platforms, through information technology;
  • Thirdly – The almost sudden rise of interest in genetic information for purposes of criminal justice, medical research or commercial ventures, giving access to the most intimate personal information without a full understanding of its potential; and
  • Fourthly – The threats to identity integrity, whether through the expansion of social media, new marketing practices online, or cybercrime.

I will explain how the OPC approaches each of these challenges, but first it may be helpful to give you a brief overview of the mandate and activities of our Office.

II Activities of the Office of the Privacy Commissioner

Our Office reports directly to Parliament and is responsible for ensuring compliance with the two federal acts protecting the right to privacy in Canada: the Personal Information Protection and Electronic Documents Act, which applies to the private sector in seven provinces and the three territories, and the Privacy Act, which applies to 250 federal public agencies.

Both acts govern the use of personal information in their respective sectors and regulate an individual’s access to his or her personal information held by relevant organizations.

We accomplish our mandate through six clearly defined functions:

  1. Responding to requests for information – which numbered over 10,000 in 2009-2010;
  2. Receiving and investigating complaints – over 200 in the private sector per year and over 600 in the public sector;
  3. Reviewing the Privacy Impact Assessments (PIAs) prepared by federal agencies about their programs where there may be privacy implications, as mandated by the Treasury Board of Canada;
  4. Auditing, at our discretion, the privacy practices of organizations subject to the acts: on the basis of a risk-based audit plan in the public sector, and on the basis of reasonable grounds in the private sector;
  5. Conducting and financing research and public education and awareness activities on emerging challenges in privacy protection; and
  6. Providing Parliament with advice on the legislative bills that touch on privacy issues.

We report to Canadians during our frequent appearances before parliamentary committees and in the two annual reports we submit, one for each of the acts that we are responsible for monitoring.

In fact, our Annual Report on the Personal Information Protection and Electronic Documents Act will be tabled in Parliament in a few weeks, and I believe you will find it illustrates the challenges we face and the measures we take, particularly in relation to information technology and protecting identity integrity.

Our Annual Report on the Privacy Act will be tabled in the fall and it will particularly illustrate the extent of challenges in relation to public safety measures and to the vulnerability of information technology, calling for an unprecedented level of information management standards.

III Four strategic priorities

So let me now move to each of the priority challenges – so that you know exactly why you should stay awake at night … and hopefully what to do about it.

1. Public Safety

The first strategic priority that I will summarize this morning is the pressure that comes from public safety policies. More and more personal information is sought in the name of public safety. The trend is fuelled at once by a heightened sense of fear, by a new context in which the threat is more diffuse, by the ever-greater appetite of law enforcement and national security authorities for personal information, and by the ever-increasing technological capacity to gather that information.

It is pretty much a given that any initiative aimed at strengthening public safety and security will have an impact on privacy: think of CCTVs, scrutiny at the border, airline security, enhanced background checks, just to name a few examples that have been in the media recently.

But that does not mean that the objectives of security and privacy are at odds.

On the contrary, the role of privacy professionals, as well as public safety professionals, is to integrate privacy safeguards into public safety measures. The goal is to protect both the lives of our citizens and the values they hold. As with most important things in life, it is easier said than done. It requires a very clear, methodical approach to ensure we preserve well-established values while responding to new challenges.

To foster this methodical approach, we have issued a Reference Document, entitled A Matter of Trust: Integrating Privacy and Public Safety in the 21st Century. You can find it on our website. The document provides an analytical framework to guide policy makers, practitioners, academics and citizens in addressing privacy considerations in the context of public safety measures. The Document includes three case studies, one of particular relevance to provinces, namely the Enhanced Driver's Licence.

It shows, step by step, how Canadian law on privacy guides the integration of privacy protection in the development of public safety measures.

Let me describe this Reference Document, and the analytical framework it proposes.

The Reference Document was developed with the advice of experts in both privacy and public safety, and it is structured around four stages (a simple checklist sketch of the framework follows the list):

  • Stage 1 is called “Making the Case” — at this stage, we analyze a public safety measure, be it legislation or a Privacy Impact Assessment for, say, an RCMP program or an airline safety measure, as the case may be, and we require that the proposing authority satisfy a four-part test, based on the 1986 Supreme Court of Canada decision in Oakes. Essentially, we require that the proposing authority demonstrate that, where a measure intrudes upon privacy:
    • It is necessary and justified in a free and democratic society,
    • The measure is proportionate to that necessity, which means that the encroachment to privacy is no greater than what is strictly necessary,
    • The measure is effective and therefore the encroachment to privacy is shown, empirically, to be necessary,
    • There are no less intrusive alternative measures.
  • Once Stage 1 criteria have been met and therefore the encroachment to privacy has been justified by valid objectives of public safety, Stage 2 requires the proposing entity to demonstrate that it will protect the information it collects according to the Fair Information Principles, which include:
    • Establishing a clear accountability regime for the protection of the information, including safeguards, ensuring accuracy and limiting collection, use, disclosure and retention,
    • Clearly identifying the purposes for the collection of the information,
    • Seeking consent where it is applicable; in the public sector, consent is of limited application but there are cases where it was decisive in our analysis; for example, in the case of the body scanners at airports, we insisted that they be optional and they are;
    • Being transparent—for example, our guidelines on video surveillance include the requirement that the presence of cameras be clearly indicated.
  • At Stage 3, called “Running the Program”, we require demonstration that privacy protections are incorporated into the architecture of the program through specific mechanisms, such as the designation of a Chief Privacy Officer, plain-language documentation addressing the privacy issues of the program, or regular public reporting on the activities of the program (for example, the obligation to report each year the number of wiretap authorizations obtained and enforced).
  • We call Stage 4 “Calibrating the System” and by that we refer to the external review and oversight of public safety bodies, such as, for example, the Security Intelligence Review Committee, which reviews in detail the operational activities of the Canadian Security Intelligence Service.
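
For those who find it useful to see the framework in a more operational form, here is a minimal sketch, in Python, of the four stages expressed as a reviewable checklist. The questions paraphrase the stages described above; the structure, function names and the sample proposal are purely illustrative assumptions, not an OPC tool.

# Minimal sketch: the four-stage framework expressed as a reviewable checklist.
# The questions paraphrase the stages above; everything else is illustrative.

FRAMEWORK = {
    "Stage 1 - Making the Case": [
        "Is the measure necessary and justified in a free and democratic society?",
        "Is it proportionate, intruding on privacy no more than strictly necessary?",
        "Is it shown, empirically, to be effective?",
        "Are there no less intrusive alternative measures?",
    ],
    "Stage 2 - Fair Information Principles": [
        "Is there a clear accountability regime (safeguards, accuracy, limits on collection, use, disclosure and retention)?",
        "Are the purposes of collection clearly identified?",
        "Is consent sought where applicable?",
        "Is the measure transparent to the public?",
    ],
    "Stage 3 - Running the Program": [
        "Are privacy protections built into the program's architecture (e.g. a Chief Privacy Officer, plain-language documentation, public reporting)?",
    ],
    "Stage 4 - Calibrating the System": [
        "Is the program subject to external review and oversight?",
    ],
}

def review(answers):
    """answers maps each question to True/False; returns unresolved questions by stage."""
    gaps = {}
    for stage, questions in FRAMEWORK.items():
        missing = [q for q in questions if not answers.get(q, False)]
        if missing:
            gaps[stage] = missing
    return gaps

if __name__ == "__main__":
    # A hypothetical proposal that has addressed everything except effectiveness.
    answers = {q: True for questions in FRAMEWORK.values() for q in questions}
    answers["Is it shown, empirically, to be effective?"] = False
    for stage, missing in review(answers).items():
        print(stage)
        for question in missing:
            print("  - unresolved:", question)

Run against a real proposal, such a checklist would simply list which parts of the case remain to be made before the privacy analysis can proceed.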

We apply this framework in our reviews of Privacy Impact Assessments as well as analyses of proposed legislation in matters of public safety.

2. Information Technology

New information and communications technologies are another key focus for the OPC. I mentioned earlier the attack on the Treasury Board Secretariat. I could have also mentioned the malfunction in the launch by Human Resources and Social Development Canada of the Access Key last September. I could have mentioned the attack on Agriculture and Agri-Food Canada in 2008, where the financial information of 60,000 farmers was exposed. In fact I could bore you with a long list of the breaches we are notified of in a single week.

The most egregious breach I saw in the past month was one in which a public servant's evaluation of competencies was sent electronically, by mistake of course, to 375 employees of that Department. A violation of this gravity and extent could hardly have occurred without information technology.

There is no question that information technologies make life easier. Most people today can barely imagine a world without the Internet and the many other advances that computers and the digital age have brought.

But every technological innovation also introduces new risks to privacy. With the power of modern computers, there is today no practical limit to how much personal information can be collected, stored and used. That in turn, makes it increasingly difficult, if not impossible, for individuals to control their personal data.

The advent of e-mail by itself brought forth new challenges to privacy:

  • Unprecedented volume of recorded personal information;
  • Unprecedented breadth of diffusion of personal information; and
  • Unprecedented risk of breaches with unprecedented magnitude of consequences.

Information technologies—and their impact on personal information protection—were the focus of two major audits conducted by the OPC during the last fiscal year. I believe our findings are relevant to most public and private institutions.

2.1 Audit on the use of wireless technologies

A first audit focused on the use of wireless technologies by five federal government institutions, namely the Canada Mortgage and Housing Corporation, Correctional Service of Canada, Health Canada, Human Resources and Skills Development Canada, and Indian Affairs and Northern Development.

The objective of the audit was to assess whether personal information is protected when it is transmitted through wireless networks used by selected federal institutions or between federal employees’ BlackBerrys. We examined:

  • Network encryption;
  • Passwords on portable communications devices;
  • The security of data stored on smartphones and other portable devices;
  • The use of PIN-to-PIN messaging services;
  • Personnel training; and
  • Disposal of surplus devices.

We discovered the following:

  • None of the five institutions we audited had fully assessed the threats and risks inherent in wireless communications.
  • Passwords used for smartphones did not meet the standard recommended by Communications Security Establishment Canada.
  • Wireless encryption and data storage were also inadequate.
  • Even though Communications Security Establishment Canada stated that PIN-to-PIN messaging should be avoided, some departments still make extensive use of this means of communication.
  • There are shortcomings in how portable devices are stored and disposed of, as well as in the procedures to follow when a BlackBerry is lost or stolen.
  • There is little evidence to indicate that departments and organizations provide staff with adequate training on the secure use of wireless devices.

We have issued many recommendations to help federal institutions mitigate the risks to personal information associated with the use of wireless technologies; a simplified sketch of what such compliance checks could look like follows the list. In particular, we recommended that:

  • Institutions undertake a threat and risk assessment for their wireless networks and smartphones;
  • Institutions ensure that employees are made aware of the privacy risks inherent in the use of smartphones;
  • Employees use strong passwords—as defined by Communications Security Establishment Canada;
  • Data stored on smartphones be encrypted;
  • PIN-to-PIN messaging be used in accordance with guidelines issued by Communications Security Establishment Canada;
  • Institutions adopt documented procedures to deal with the loss or theft of wireless devices;
  • All excess wireless devices be stored in secure locations; and
  • Control mechanisms be put into place to ensure that data stored on surplus wireless devices are deleted and purged prior to disposal.
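
To give a concrete, if simplified, flavour of acting on these recommendations, here is a short Python sketch of an automated check run against an inventory of smartphone configuration records. The field names, the minimum password length and the sample records are assumptions for illustration only; they are not Communications Security Establishment Canada specifications.

# Illustrative sketch only: field names, threshold and records are assumptions,
# not Communications Security Establishment Canada specifications.

MIN_PASSWORD_LENGTH = 8  # assumed policy value for this sketch

def check_device(device):
    """Return a list of findings for one smartphone configuration record."""
    findings = []
    if device.get("password_length", 0) < MIN_PASSWORD_LENGTH:
        findings.append("password shorter than policy minimum")
    if not device.get("storage_encrypted", False):
        findings.append("data at rest not encrypted")
    if device.get("pin_to_pin_enabled", False):
        findings.append("PIN-to-PIN messaging enabled")
    if not device.get("loss_theft_procedure", False):
        findings.append("no documented loss or theft procedure on file")
    return findings

if __name__ == "__main__":
    inventory = [  # hypothetical records exported from a device-management tool
        {"id": "BB-001", "password_length": 4, "storage_encrypted": False,
         "pin_to_pin_enabled": True, "loss_theft_procedure": False},
        {"id": "BB-002", "password_length": 12, "storage_encrypted": True,
         "pin_to_pin_enabled": False, "loss_theft_procedure": True},
    ]
    for device in inventory:
        issues = check_device(device)
        print(device["id"] + ": " + ("OK" if not issues else "; ".join(issues)))

The point is not the code itself but the discipline it represents: every recommendation above can be turned into a check that is run routinely rather than assessed once and forgotten.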

A summary of the audit is included in our 2009-2010 annual report on the Privacy Act.

2.2 Audit of the Crown’s excess asset disposal practices

Our 2009-2010 annual report on the Privacy Act also includes a summary of a second audit concerning privacy protections surrounding the disposal of digital equipment and outdated hard-copy documents.

I will focus on our findings in relation to the disposal of digital equipment. Most of the federal government’s outdated computers are reconditioned and distributed to schools or Aboriginal communities through a program managed by Industry Canada.

We were interested in learning whether the hard drives of donated computers were wiped clean or whether they still contained potentially sensitive personal information.

During our audit of practices related to the disposal of certain federal agencies’ surplus assets, we not only discovered potential data breaches; we found real ones.

We examined a sample of almost 1,100 computers donated to the Computers for Schools Program. The hard drives of more than 1 out of every 4 computers contained residual data. Only 3 of the 31 institutions in our sample (or 10% of agencies subjected to the audit) had not donated computers that still contained information.

A forensic analysis of a sub-sample of those hard drives revealed that they contained very sensitive data, particularly highly personal information, documents protected by solicitor-client privilege, and even classified material.

The data remaining on some of the devices was so sensitive that we immediately returned the hard drives to their original departments to have them correctly disposed of—as should have been done at the outset in accordance with Treasury Board Policy.
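
To illustrate, in very simplified form, what verifying a wipe can involve, here is a short Python sketch that scans a raw drive image in fixed-size blocks and flags any block that is not all zeros. Real forensic verification relies on proper tooling and documented wipe procedures; the file name and block size here are assumptions for the example.

# Naive sketch: flags any non-zero block in a raw drive image.
# Real verification would use forensic tooling and documented wipe procedures.

import sys

BLOCK_SIZE = 4096  # assumed scanning granularity

def find_residual_blocks(image_path):
    """Yield the byte offset of every block that is not entirely zero."""
    with open(image_path, "rb") as image:
        offset = 0
        while True:
            block = image.read(BLOCK_SIZE)
            if not block:
                break
            if any(block):  # any non-zero byte means residual data
                yield offset
            offset += len(block)

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "surplus_drive.img"  # hypothetical image file
    dirty = list(find_residual_blocks(path))
    if dirty:
        print(f"{len(dirty)} non-zero block(s) found; the drive has NOT been cleanly wiped.")
    else:
        print("No residual data detected at this coarse level of inspection.")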

Our 1994-1995 annual report stated that 95% of computers sent out for disposal still contained data. Fifteen years later, that rate is 42%. Far more progress needs to be made, and fast.

To secure personal information in surplus assets being disposed of by the government, we recommended to Industry Canada, which is responsible for the Computers for Schools Program, that they:

  • Ensure that all security weaknesses identified in the audit are analyzed and addressed in a timely manner; and
  • Work with the Treasury Board Secretariat to request that federal departments and agencies provide a signed declaration to the Computers for Schools Program certifying that information contained in donated surplus computers and related assets has been wiped.

3. Genetic information

Our third strategic priority is in the emerging arena of genetic technologies.

Until now, we have been preoccupied with safeguarding relatively prosaic bits of personal information such as names, addresses, phone numbers and credit card numbers. We now have to consider tissue samples and genetic information, the body itself, as personal information. Imagine the potential value of personal information derived from an individual’s genetic code for identification, insurance underwriting or employment. The possibilities are vast and only starting to emerge.

Genetic information can be used for many wonderful and amazing purposes. This province in particular has seen great progress in identifying genes related to specific illnesses and many Newfoundland families have generously contributed to the advancement of medical research. But genetic information can also be used in ways that intrude on our dignity and our sense of self.

Moreover, it is difficult to exercise control over things we do not understand and, at the end of the known universe of science, genetic technologies challenge our capacity to grasp their full implications. Science is evolving faster than its legislative and ethical frameworks.

Control over our own genetic material is also complicated by other factors. How, for example, can we give meaningful consent for the use of a tissue sample, when it can be stored for decades and used for purposes we cannot even dream of today?

In this regard, our minds are focused on the following trends:

  • The increase in medical research based on genetic information: how do we ensure meaningful consent in the context of scientific issues as complex as genetics, and how can that consent be meaningful in relation to as yet unknown possible uses?
  • The commercialization of genetic testing, including the protection of the information gathered. And,
  • The increased appetite of the criminal law system for DNA evidence.

To illustrate this strategic issue, I will address the National DNA Data Bank, which is managed by the Royal Canadian Mounted Police and on whose Advisory Committee the OPC sits as an ex officio member.

3.1 National DNA Data Bank

Parliament first enacted the forensic DNA provisions in Canada’s Criminal Code close to 12 years ago. The goal of the legislation was to facilitate obtaining genetic samples from individuals suspected of having committed one of a clearly defined range of offences under the DNA Identification Act.

The Act created a National DNA Data Bank and authorized the collection and storage, for DNA analysis, of biological samples from everyone convicted of an offence designated under this Act.

Privacy protection and public safety objectives are incorporated into the data bank through a governance structure that ensures restricted access to data and a strict mechanism to control access and data use; a simplified sketch of the first of these safeguards follows the list. In particular:

  • Genetic data is dissociated from personal identity and is only linked through a bar code;
  • Physical access to the data bank and personal data is strictly controlled;
  • The data bank is managed independently of police forces;
  • In addition, DNA samples are collected only in the event of a conviction.
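
A toy sketch may help make the first of these safeguards tangible: the genetic profile and the identity record live in two separate stores, and only a randomly generated bar code links them, so that neither store on its own says whose profile is whose. The names, data structures and matching logic below are illustrative assumptions and in no way reproduce the data bank's actual design.

# Toy model of bar-code linkage between two separate stores.
# Purely illustrative; not the National DNA Data Bank's actual architecture.

import secrets

profile_store = {}   # bar code -> DNA profile (no identity information)
identity_store = {}  # bar code -> identity record (no genetic information)

def register_sample(identity, profile):
    """Store identity and profile separately, linked only by a random bar code."""
    barcode = secrets.token_hex(8)
    profile_store[barcode] = profile
    identity_store[barcode] = identity
    return barcode

def search_profiles(crime_scene_profile):
    """Search the profile store only; resolving identities is a separate, controlled step."""
    return [code for code, profile in profile_store.items() if profile == crime_scene_profile]

if __name__ == "__main__":
    register_sample({"name": "J. Doe"}, "profile-AGCT-0001")    # fabricated example data
    register_sample({"name": "A. Smith"}, "profile-AGCT-0002")
    hits = search_profiles("profile-AGCT-0002")
    # An analyst working with profiles sees only bar codes; identities sit elsewhere,
    # behind separate physical and procedural controls.
    print("Matching bar codes:", hits)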

The data bank is used to help law enforcement agencies investigate crimes by comparing the database of samples linked to convicted offenders with samples found at crime scenes.

The operation of the data bank is monitored by the National DNA Data Bank Advisory Committee. I represent the OPC on that Committee, which also includes representatives of the police, legal, scientific and academic communities. The Committee is a forum for discussing policy and operational issues.

The National DNA Data Bank's managers have gone to great lengths to integrate privacy safeguards into the management of genetic information.

However, we are concerned by some policy trends in relation to forensic DNA that have significant privacy implications. I will refer to three policy trends in particular:

  • Collection upon arrest;
  • Familial searching; and
  • Privatization of forensic labs.

3.1.1 Inclusion in DNA Data Bank upon arrest

The central privacy consideration here is this: according to the principles laid down by the courts on the reconciliation of privacy and security, does mere suspicion, which is all we have upon arrest, justify an invasion of privacy as deep and consequential as the retention of DNA in a DNA Data Bank? In 2008, the European Court of Human Rights squarely answered this question in relation to the UK Data Bank: the answer is no.

The argument of necessity to ensure public safety does not hold or, at the very least, does not outweigh the invasion of privacy, because the personal information retained is not relevant.

It amounts to including in the DNA Data Bank, and therefore retaining, extremely sensitive personal information of persons who could be, in spite of the arrest, law abiding citizens. Arrest cannot lead to such a diminished expectation of privacy and the questionable relevance to law enforcement of retaining DNA from any person arrested does not justify the privacy intrusion.

The violation of privacy would not be proportionate to the law enforcement objectives since it has not been demonstrated to materially improve law enforcement.

In fact, the amount of DNA data, relevant and irrelevant, that would be retained if we opened the data bank to collection upon arrest could be so broad as to be difficult to manage, and it could actually decrease the effectiveness of the law enforcement response.

3.1.2 Familial searches

There is increasing interest in allowing DNA testing of persons, even law-abiding citizens, where DNA analysis of a sample shows a close but not perfect match to an offender. In theory, it could point to a family relative.

The privacy considerations here are these: Is it justified to target members of a family, deliberately, because they are the relatives of a convicted offender? And, does being the relative of a convicted offender decrease a law abiding citizen’s right to privacy?

Perhaps before I go further, I should describe exactly what type of DNA analysis I am referring to. By “familial searches,” we mean the specific targeting of persons who are relatives of a convicted offender because, during a DNA search comparing crime scene DNA to the DNA Data Bank, the analysis finds not a match but a partial match.

This may indicate that the DNA sample belongs to a relative of a convicted offender, or it may not be the case at all, just as two persons can look alike without being related. Such a partial match would allow investigators to seek authority to investigate the relatives of that convicted offender.
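
To give a rough, purely illustrative sense of the mechanics, the short Python sketch below reduces each profile to a handful of loci and reports how many loci agree. The locus names, allele values and the partial-match threshold are invented for the example; real forensic matching is statistically far more sophisticated.

# Toy illustration of full versus partial matching over a few DNA loci.
# Locus names, allele values and thresholds are invented; alleles are written
# in a consistent order for simplicity.

FULL_MATCH = 5      # all five loci agree
PARTIAL_MATCH = 3   # assumed threshold suggesting a possible relative, or mere coincidence

def shared_loci(profile_a, profile_b):
    """Count the loci at which two profiles carry the same alleles."""
    return sum(1 for locus in profile_a if profile_a[locus] == profile_b.get(locus))

crime_scene = {"D3": (15, 17), "vWA": (14, 18), "FGA": (21, 24), "TH01": (6, 9), "D21": (29, 30)}

data_bank = {
    "offender-A": {"D3": (15, 17), "vWA": (14, 18), "FGA": (21, 24), "TH01": (6, 9), "D21": (29, 30)},
    "offender-B": {"D3": (15, 17), "vWA": (14, 18), "FGA": (20, 23), "TH01": (6, 9), "D21": (28, 31)},
    "offender-C": {"D3": (12, 13), "vWA": (15, 16), "FGA": (19, 22), "TH01": (7, 8), "D21": (27, 32)},
}

for offender, profile in data_bank.items():
    score = shared_loci(crime_scene, profile)
    if score == FULL_MATCH:
        label = "full match"
    elif score >= PARTIAL_MATCH:
        label = "partial match (possibly a relative, possibly pure coincidence)"
    else:
        label = "no match"
    print(f"{offender}: {score}/{FULL_MATCH} loci shared -> {label}")

As the labels suggest, a partial match is only a hint, and a weak one.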

The UK does allow familial searches, but only on the basis of a strict framework for authorization and procedural safeguards.

The UK reports a 25% success rate for familial searches, but I understand it comes at the end of a very cumbersome process and a huge investment of effort.

Moreover, this 25% success rate means that in 75% of the cases, the person whose DNA has been forcibly collected, simply for being the relative of a convicted offender, has seen his or her privacy rights significantly reduced for no other reason than kinship.

Would we consider it reasonable that relatives of an offender would have a reduced expectation of privacy just because of that relationship?

And yet, there is that 25% success rate, and there are those very compelling cases that give you pause. For example, there have been very serious crimes where forensic analysis yielded an almost perfect match and, through a familial search, led to the arrest of a brother, who happened to be a serial rapist. There have also been cases where an innocent man has been exonerated on the basis of a familial search.

But significant doubts remain on the legal, moral and operational aspects of familial searches:

  • Do we accept that the DNA Data Bank be, de facto, extended to anyone related to a convicted offender whose DNA sample is in the data bank, for no other reason than that relationship?
  • Are we comfortable with the deliberate targeting of presumed innocent people, whose DNA has never been subject to inclusion in the DNA data bank—a process that includes a judicial decision at the end of trial that leads to a conviction?
  • Do we want to use relatives as informants against each other?
  • And, is it even useful? A partial match can come from anyone, not necessarily a relative; hence, it could send the investigation in a completely wrong direction.

Should familial search be considered in Canada, in the continued absence of robust evidence of its usefulness for law enforcement, our Office would object.

I refer you back to the Reference Document I mentioned on public safety and privacy: we would seek demonstration that familial searches are in fact necessary to ensure public safety; we would require that the regime established to conduct them ensure proportionality between the invasion of privacy and the law enforcement objectives pursued; and we would insist that the effectiveness of familial searches be demonstrated for the practice to be continued.

At this point, the effectiveness of familial searches is in doubt, and their success rate and application are not well understood, even among law enforcement agencies; therefore, we would not support them.

3.1.3 Privatization of Forensic Laboratories

The Government of Canada, supported by a Recommendation of the Senate, is exploring the possibility of entering into public–private partnerships with forensic laboratories to conduct DNA analysis for police agencies.

The privacy risks rest with the security of the information, and with the accountability for that security. Risk mitigation will require a strong governance regime through privacy clauses in contracts, and a proper oversight mechanism to ensure compliance.

4. Identity integrity

Our fourth priority focuses on the protection of identity integrity. By this, I am referring to people’s right to control the personal information that defines them to the rest of the world.

The fact is that, even if you never post a single word or image on the Internet, you still leave an electronic footprint. Today, with surveillance cameras, smartphones and global positioning systems, you create a rich trail of data about your movements, behaviours and preferences.

Each kernel of data taken in isolation may reveal little. But collated, cross-referenced and analyzed, all the pieces can yield an extremely detailed profile. Taken together, this can become your identity.
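
A trivial Python sketch of that point, with entirely fabricated records and a hypothetical device identifier: each source reveals little on its own, yet once the records are joined on a common identifier, a recognizable profile emerges.

# Toy example: isolated kernels of data joined into a single profile.
# All records and the device identifier are fabricated for illustration.

from collections import defaultdict

location_pings = [{"device_id": "dev-42", "place": "medical clinic", "time": "08:55"}]
purchases = [{"device_id": "dev-42", "item": "prenatal vitamins"}]
searches = [{"device_id": "dev-42", "query": "clinic opening hours"}]

profiles = defaultdict(list)
for source_name, records in [("locations", location_pings),
                             ("purchases", purchases),
                             ("searches", searches)]:
    for record in records:
        profiles[record["device_id"]].append((source_name, record))

for device_id, facts in profiles.items():
    print(f"Profile assembled for {device_id}:")
    for source_name, record in facts:
        print(f"  from {source_name}: {record}")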

Managing your identity is a challenge, especially when you do not really control how it was created, how it is used or how it is shared with others.

And it can be used for good or ill.

You might, for example, enjoy VIP treatment at a shop you visit often. Or you could find yourself bombarded by irritating ads and wonder what happened to your privacy.

We are currently hard at work on prominent files regarding identity integrity under the private sector privacy law: for example, we have issued a preliminary letter of findings, posted on our website, and are about to issue our final letter of findings in relation to Google Wi-Fi; we are working with Sony to identify the relevant facts and issues in the attack against Sony PlayStation and Sony Entertainment Systems; and we have just started an investigation of Apple's geolocation functions.

Our preliminary findings in Google Wi-Fi were revealing of the general challenges we face:

  • A Google engineer developed a program that was rolled out through Google Street View without any assessment of its privacy implications;
  • As a consequence, in rolling out Street View, Google captured personal information, some of it very sensitive personal information;
  • There was no governance structure at Google to identify that vulnerability and correct it.

What is the general issue it raises? Internet companies are young, dynamic, and do not display the necessary maturity to fully assess the impact of their innovation before they roll it out. But the private sector is not the only vulnerability: as government goes online and interacts electronically more and more with citizens, the protection of identity integrity becomes an increasing challenge in that sphere as well.

We, privacy professionals, must insist upon the creation of a proper framework of privacy safeguards, commensurate with the vulnerabilities of new information platforms.

Conclusion

So, did I make you jittery? I hope not—I hope instead to have spoken to your alertness and to your commitment to develop new privacy safeguards as privacy challenges evolve.

I hope to have shown you how the Office of the Privacy Commissioner is taking on that task. And we are at your service to address these challenges with you.

I hope that this symposium provides you with an opportunity for fruitful discussions and exchanges that will complement your work. I will now be pleased to answer any questions you may have.
