Privacy: It's good business and it's everybody's business


Remarks at the SC Magazine World Congress Canada

November 16, 2010
Toronto, Ontario

Address by Jennifer Stoddart
Privacy Commissioner of Canada

(Check against delivery)


It is a pleasure to join you today at the first World Congress that SC Magazine has held here in Canada.

I am delighted to see so many people, influential in the public and private sectors, gathered in this room. What this tells me is that matters related to privacy and data security are no longer just the stuff of esoteric dialogues between systems engineers and regulators. The level of interest and concern is now widespread, profound and – to my way of thinking – most welcome.

So let me try to make the most of this opportunity!

I propose to share the view from my third-floor Office in Ottawa – of the current privacy landscape, the emerging challenges, and how the regulation of privacy is evolving as a result.

What I hope to leave you with is an appreciation of the scope of the challenges, and the role we all – including you – have to play in safeguarding the privacy and personal information of Canadians, Americans, and anyone else.


Let me set the scene with this question: Are your thoughts really private?

Of course, you say. Unless someone attaches electrodes to your skull to measure brain waves, your thoughts are surely the last refuge of true privacy.

Well, there’s a move afoot to penetrate even that bastion. And it comes from the Pentagon’s research arm, the Defense Advanced Research Projects Agency, or DARPA.

Just last month, DARPA solicited bids for a multi-million-dollar contract to find a mathematical way to sift through billions of other people’s text and e-mail messages for evidence of evil intent.

Not actual evildoing – evil intent.

The research project is called ADAMS, or Anomaly Detection at Multiple Scales. Its ostensible goal is to predict whether a good and trusted employee is suddenly going to turn rogue.

The Pentagon claims the project was triggered by last year’s shooting rampage at a Fort Hood, Texas, military facility, which left 13 dead and 43 wounded. Major Nidal Hasan, a U.S. army psychiatrist, is being court-martialled in connection with the tragedy.

DARPA’s request for proposals notes that Maj. Hasan may have posted some radical comments online before the tragic episode. And it concludes:

“Unfortunately, this investigation is taking place after the fact. The problem ADAMS would address in this instance is that of detecting anomalies in Major Hasan’s alleged behavior in time to alert the proper authorities, who could intervene before the fact.”

This was the stuff of science fiction more than 50 years ago, when Philip K. Dick wrote his short story, “The Minority Report,” which served as the basis for the 2002 Steven Spielberg movie.

But today, ADAMS is just an extreme form of data mining, something that’s going on all around us, all the time.

Privacy challenges

In fact, there are already so many challenges to privacy that we don’t have to peer into the future to be concerned.

With social networking, GPS tracking, RFID tracking, online tracking, and widespread surveillance by governments and corporations, we’re fast forgetting the meaning of words such as “solitude,” “anonymity” and, indeed, “privacy.”

According to one prediction, in 20 years, an estimated 50 billion devices with microprocessors will be connected to networks. Every facet of our lives will be constantly monitored and reported through an Internet of Things, a combination of RFID, wireless sensor technologies and nanotechnologies.

This isn’t farfetched and futuristic. In fact, these kinds of “smart” technologies can already be found in supply chains for consumer products, and smart meters being introduced by utility companies.

Regulatory implications

It should be plain that privacy regulators face daunting challenges.

Consider, for example, that many regulatory regimes for the protection of personal information – including our private-sector law here in Canada – are founded on a series of Fair Information Principles, first articulated internationally in the OECD privacy guidelines of 1980.

These require organizations to, among other things, obtain consent for the collection of personal data, limit the collection of data to what is necessary for an identified purpose, build in safeguards, and make somebody accountable for the data.

  • But with RFID tags silently emitting data, how do you even detect that there is a flow of information – never mind applying concepts such as consent and accountability?
  • With cloud computing and an increasingly complex technological environment, how do you figure out who is collecting the information, who has access to it and who has jurisdiction?
  • And, with data flashing instantly around the world, how can you possibly know where it is being collected, stored, processed or shared?

Regulatory successes

Canada’s Personal Information Protection and Electronic Documents Act, or PIPEDA, is grounded in these Fair Information Principles. Designed to be flexible and technologically neutral, the law has stood us in good stead over the nine years since its passage.

In fact, it has enabled us to stare down some of the world’s dominant technological giants.

In September, as you may recall, we wound up a high-profile investigation of Facebook that actually began in 2008. In the wake of our investigation, coupled with lengthy and intensive discussions, the social networking site has significantly bolstered the privacy protections available worldwide to its half-billion users.

And last month we released our preliminary findings in an investigation into Google’s collection of Wi-Fi data for its geo-location applications.

While data-protection authorities in several countries have looked into this situation, I want to underline that we were the only ones to have travelled to California to examine the collected data at Google’s premises.

In doing so, we turned up important evidence contradicting what Google had previously maintained. Initially they claimed that any payload data they’d collected would be fragmentary and meaningless.

We, however, found complete e-mails, files, passwords, web page requests and so on – data that cannot, by any standard, be dismissed as meaningless.

Google has also consistently claimed that the collection was inadvertent – that they didn’t want or intend to collect payload data. But again, in the course of a detailed examination of their technical and business processes, we learned that the engineer who wrote the code did, in fact, intend to collect it – for reasons of his own – although he didn’t inform his bosses.

Based on our investigation, we concluded that the data collection was careless. Even so, it violated PIPEDA. We recommended that Google destroy the data as soon as possible, and gave the company until February to implement our recommendations for bringing its practices in line with Canadian law.

While we have not yet received a formal response, the company has publicly announced the appointment of a chief privacy officer, as well as plans for new internal controls and procedures and employee training.

The Google Wi-Fi case, in my view, demonstrates that we can act quickly, decisively and with a great deal of expert depth.

Under the Privacy Act

Meanwhile, we have also helped strengthen privacy protections in the federal public sector, using our more limited powers under the Privacy Act.

For example, a privacy audit enabled us recently to highlight gaps in the government’s policies and practices around wireless networks and mobile devices such as BlackBerrys and smart phones.

A second audit drew attention to longstanding problems in the way the government disposes of surplus computers and waste paper documents.

Our Privacy Impact Assessment process has also yielded many gratifying enhancements to the privacy landscape. For example, in the wake of extensive discussions between my Office and the aviation security authority, the full-body scanners you now find at airports incorporate numerous measures to better protect your privacy and personal dignity. 

Changes at the OPC

Despite these successes, we know that our regulatory framework and tools won’t be adequate for tomorrow’s challenges.

On the public-sector side, we continue to advocate for a modernization of the Privacy Act, which was passed in 1982 – just as the Commodore 64 hit the market and E.T. was calling home.

Meantime, we look for other ways to fulfill that side of our mandate. For instance, earlier today in Ottawa my colleague, Assistant Commissioner Chantal Bernier, unveiled our first-ever privacy policy guidance document, this one in the field of national security and public safety.

Aimed at decision-makers as well as the informed lay public, the document sets out a rational and systematic approach for integrating privacy considerations into security initiatives.

We’re also trying to stay on top of some of the most significant developments in technology, and their impact on privacy.

For instance, we are just now wrapping up an in-depth, cross-country consumer consultation process that focused on cloud computing, and the tracking, profiling and targeting of consumers by marketers and other businesses.

Powers of Office

We also commissioned a pair of leading academics to examine the powers of my Office under PIPEDA, and to suggest ways to ensure the regulatory tools at our disposal are suited to evolving needs.

In their analysis, Lorne Sossin, Dean of Osgoode Hall Law School, and France Houle, of the Université de Montréal, suggest that the current ombuds model has worked better with large industry sectors such as banking and insurance than with small- and medium-sized businesses.

One option they put forward proposes that my Office acquire certain targeted order-making powers, including the ability to impose penalties, such as fines. They also propose explicit guideline-making power, to assist with the fair and transparent implementation of new order-making powers.

We’re currently assessing this analysis and mapping it onto our views of the ombuds model and our experience with PIPEDA. The authors’ report will help shape my Office’s input during the next mandated Parliamentary review of PIPEDA, expected to begin next year.

I want to mention that some changes to my powers are already underway.

Parliament is currently considering Bill C-29, which would, among other things, make breach notification compulsory for private-sector organizations.

Bill C-28, another piece of legislation that was introduced at the same time, is designed first and foremost to curb electronic spam. The bill would give my Office specific regulatory responsibilities for combating this scourge of business and consumers.

It also gives us important new powers to control which complaints we investigate.

And it permits the sharing of information for the purposes of enforcing Canadian privacy laws. This is crucial, in light of the cross-jurisdictional data-flow challenges I described earlier.

Global regulatory response

Indeed, privacy protection in this era of big data is very much a global concern.

To be an effective regulator in Canada, we need to be part of a worldwide network of data-protection authorities. And that means we need to share information with our international counterparts.

I am pleased to say that there are plenty of gratifying developments on the world stage.

In Madrid last year, for instance, data-protection commissioners from around the world agreed to work toward international privacy standards. The OECD and the European Union are reviewing and renewing their privacy frameworks, and we’re involved in those efforts.

Canada recently also joined the Global Privacy Enforcement Network and the APEC Cross-Border Privacy Enforcement Arrangement, both of which aim to bolster compliance with privacy laws through better international co-operation.

I should add that some forms of collaboration are more informal in approach – yet just as effective.

For example, when Google disregarded privacy rights in the rollout of its Google Buzz social networking service last February, we led nine other data-protection authorities from around the world in an unprecedented tactic:

We published jointly an open letter that urged Google and other technology titans entrusted with people’s personal information to incorporate fundamental privacy principles directly into the design of new online services.

Privacy by default

Indeed, this is a message I try to underline at every opportunity. A few weeks ago, at an international conference of data-protection and privacy commissioners in Jerusalem, I co-sponsored a resolution, put forward by my Ontario colleague, Dr. Ann Cavoukian, that would see privacy considerations become embedded into the design, operation and management of information technologies.

As data-protection commissioners from four continents told Google’s CEO, Eric Schmidt, in our open letter about Google Buzz:

“It is unacceptable to roll out a product that unilaterally renders personal information public, with the intention of repairing problems later as they arise. Privacy cannot be sidelined in the rush to introduce new technologies to online audiences around the world.”

The letter went on to remind Google that data-protection and privacy laws apply online, just as they do in the physical world. 

And we encouraged Google – and, indeed, all companies developing innovative technologies with implications for people’s personal information – to engage with us whenever they’re developing products or services with significant implications for privacy.

It can certainly be done. The Fair Information Principles I mentioned earlier offer concrete, practical guidance for anybody who cares to implement them.

But today’s bottom line is that we expect enterprises to think of privacy – not as an add-on or an option in a drop-down menu, but as the default.

Technologies, after all, are just tools. They can be modified or redesigned.

The real challenge is getting entire organizations to understand the importance of privacy, and to take the steps necessary to protect it in every aspect of their business.

Not just once in a while, but always.

That kind of thinking goes a long way toward keeping privacy regulators happy. It also helps mollify consumers.  

Indeed, you may recall that the awkward launch of Google Buzz provoked such a backlash among users that Google was forced to retreat, fix the problem, and issue an apology.

And Google’s collection of Wi-Fi data spawned so many lawsuits and regulatory confrontations around the world that some clever blogger recently created a Google map to keep track of them all!


To sum up, I want to underline that, as Canada’s privacy regulator, we do not stand in the way of innovation.

On the contrary: We embrace it.

We understand the advantages of technology, and the desire of Canadians to participate fully in the digital economy.

We also acknowledge the opportunities that technology presents for companies – the inventors as well as the users of electronic innovations.

But the entire construct rests on a delicate foundation – consumer confidence in the online environment, and trust in the organizations that do business there.

Confidence and trust hinge on different factors – some rational and predictable; others less so.

But, in my experience as Privacy Commissioner over the past seven years, one persistent and unshakable belief is that personal information is a valuable asset that deserves to be treated with respect.

Not in a haphazard and incidental way by a handful of organizations, but in a thoughtful, systematic and committed way by all.

Canadians look to people such as you for leadership.

I thank you for your attention.
