The Age of Big Data A new, challenging frontier for privacy and our society
Remarks at the Conference Board of Canada’s Chief Privacy Officer Council
October 17, 2013
Address by Jennifer Stoddart
Privacy Commissioner of Canada
(Check against delivery)
Thank you for inviting me here today to address you about challenges which Big Data poses for commercial organizations such as those you represent as Chief Privacy Officers.
In my time today, I want to discuss the privacy challenges confronting your organizations which are also basic ethical ones confronting both you and society as a whole.
I’ll also address the challenges which Big Data presents for Canada’s privacy protection framework, and how I see this phenomenon as another reason why PIPEDA needs strengthening.
The emergence and challenge of predictive analytics
First let us look at what people mean by the term Big Data. Some definitions emphasize the data itself. For example, the McKinsey Global Institute says that “big data refers to data sets whose size is beyond the ability of typical database software tools to capture, store, manage and analyze.”
Some focus on the “big” aspect. The New Scientist magazine said “big data describes the idea that everything can be digitized and ‘datafied’ thanks to cheaper storage, faster processing and better algorithms.”
This quantitative change in available data has given birth to a qualitative change in how the data is being used. The crux of that change is the desire of decision-makers in business, the public sector and government to know something before it happens – to intuit the future behaviour of individuals.
The reasoning is that if only we can learn as much as possible about people, including their past actions and attributes, then we might be able to understand their predispositions and predict their future actions. This in turn could prove invaluable for retailers in tailoring products and services, and for intelligence and law enforcement in perhaps heading off security threats.
This further refinement of Big Data is known as “predictive analytics.” Defenders of privacy such as myself are very mindful of the increasing use of predictive analytics, and have been considering the implications it could have for privacy. Equally, it lies at the core of the new challenges which you face as CPOs.
Such use can yield benefits for commercial organizations. It may contribute to product innovation, more effective marketing and better targeted research. But equally it could result in discriminatory, invasive and intrusive practices upon the public. Dealing with these potential implications takes us well beyond traditional privacy protection and into the realm of ethical standards in our society.
Facing Big Data as a CPO
Now, it’s very possible that many of you know very well the potential of Big Data to bolster your organization’s bottom line. As CPOs, you may be wondering what price such initiatives would exact, privacy-wise.
And further, how should you, as a CPO, react when it comes to the emergence of such an initiative within your own organization?
Many members of your senior management groups may be advocating for the use of predictive analytics because they’ve read stories exemplifying the potential.
As a CPO, one of the most important things you can do is rather simple: remind your colleagues that most of these stories emanate from the United States, which lacks an overarching private sector privacy law. And of course, as an organization doing business in Canada you are subject to PIPEDA – the Personal Information Protection and Electronic Documents Act – or substantially similar legislation in Alberta, British Columbia and Quebec.
Big data and PIPEDA’s privacy protection framework
Let’s explore now how the challenges posed by Big Data initiatives can be accommodated within a traditional privacy framework such as PIPEDA.
At first blush it may seem like an impossible fit. The protection of personal information has long rested on three of the fundamental principles of fair information practices endorsed by the OECD, which form the bedrock of the privacy protection frameworks of most countries:
- Transparency – providing people with a basic understanding of how their personal information will be used in order to gain informed consent;
- Limiting use plus consent – the use of that information only for the declared purpose for which it was initially collected, or purposes consistent with that use; and,
- Minimization – limiting the personal information collected to what is directly relevant and necessary to accomplish the declared purpose, and discarding the data once the original purpose has been served.
Having enshrined these principles in national privacy laws and in international understandings, many defenders of privacy may fear that predictive analytics and Big Data are literally turning this framework on its head.
Big Data hasn’t simply increased the risk to privacy; it has changed the nature of that risk. And although it raises new challenges and questions about Canada’s existing framework, that is surely not sufficient reason to abandon that framework. In fact, the existing framework can still ably guide an organization’s decision-making when it comes to this new frontier.
Predictive analytics in action
Consider this now iconic use of predictive analytics by the retail giant Target originally revealed by New York Times writer Charles Duhigg. Many of you may have heard this story before, but I think it’s worth repeating.
Target developed a “pregnancy-prediction algorithm” to apply to its mammoth database of what women shoppers bought at Target stores in the United States.
By analyzing purchases of a carefully selected two dozen items, Target’s data scientists could not only assign a “pregnancy prediction” score to each female shopper, but also estimate her due date within a very small window.
This allowed Target to send these women coupons timed to specific stages of their pregnancies.
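The mechanics Duhigg described can be pictured as a simple weighted scoring of indicator purchases. The sketch below is purely illustrative – the product names and weights are invented for this talk, not Target’s actual model – but it shows the general shape of such an algorithm:

```python
# Toy illustration of purchase-based prediction scoring. Each "indicator"
# product carries a weight (in a real system, learned from historical data),
# and a shopper's score is the sum of the weights of indicator items she
# has bought. All product names and weights here are hypothetical.

INDICATOR_WEIGHTS = {
    "unscented lotion": 0.3,
    "large tote bag": 0.1,
    "zinc supplement": 0.25,
    "magnesium supplement": 0.25,
    "cotton balls (bulk)": 0.2,
}

def pregnancy_score(purchases):
    """Sum the weights of any indicator items in a shopper's history."""
    return sum(INDICATOR_WEIGHTS.get(item, 0.0) for item in purchases)

shopper = ["unscented lotion", "zinc supplement", "magnesium supplement"]
print(round(pregnancy_score(shopper), 2))  # 0.8
```

A shopper whose score crosses some threshold would then be flagged for the timed coupon mailings described above – which is precisely where the privacy questions begin.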
The company deemed it a marketing triumph. But the results hit home – and I mean literally – when the irate father of a teenage girl learned, in a roundabout way, that she was pregnant: coupons for maternity items, baby clothes and nursery furniture had been mailed to the family home.
And don’t overlook another important lesson of the Target story. If the teen whose pregnancy was predicted had been here in Canada, the resulting complaint might not have stopped with the store manager. Because of PIPEDA, it might have come to me, as well.
The importance of not just asking “can we?”, but “should we?”
This said, it shouldn’t be fear of “after-the-fact” regulatory consequences alone that gives an organization pause before diving headlong into a Big Data initiative.
Listen to Bryan Pearson, a Canadian who is CEO of the company which created and operates Air Miles and other loyalty programs.
“The explosion in the availability of data, plus the capacity to manipulate it, has created a kind of virtual cookie jar for marketers – and the temptation to keep reaching in for more and more. And therein lies the risk: How many times can we dip in before we’ve gone too far in compromising customer privacy?… Who decides how far is too far? Is there a simple mechanism to keep data greed in check?”
That’s Pearson in a commendably candid book published last year entitled The Loyalty Leap: Turning Customer Information into Customer Intimacy.
He’s saying that just because commercial organizations can collect personal information and run it through the revealing algorithms of predictive analytics, doesn’t mean that they should. I welcome this viewpoint from someone so prominent in a business sector called “measured marketing,” meaning marketing based on the actual measurement of consumer behaviour.
The same “should we” question is raised by PIPEDA in section 5.3, which states: “An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.”
The “reasonable person” standard is well established in Canadian law. But the phrase “appropriate in the circumstances” goes right to the heart of our national values. In some ways, it has echoes of another value-imbued phrase in Section 1 of the Canadian Charter of Rights and Freedoms. That phrase is “demonstrably justified in a free and democratic society.”
Those words are the test which courts apply in determining whether any law can put a reasonable limit on one of the fundamental rights guaranteed to individuals under the Charter.
To grossly over-simplify, both phrases challenge us to debate what is acceptable in our society and what is beyond the pale.
Consider another example in the realm of mobile marketing. We’re increasingly seeing new services that let advertisers pay to know when people with GPS-equipped phones are near one of their stores. The advertisers can then send promotional text messages to entice the smartphone user to shop there.
Some major national retailers have already signed up for such programs to sell things like fast food, beverages and electronics, to name a few examples.
But what if retail outlets equipped with electronic slot machines want to use this service to entice more people into spur-of-the-moment gambling – and target users who have a propensity for visiting gambling sites on their devices? Or aim weight-loss supplements at consumers who, through their search histories, express insecurity with their body image?
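The targeting logic behind such a service can be pictured as two gates: a proximity check on the phone’s location, and a behavioural-profile check chosen by the advertiser. The sketch below is hypothetical – the field names, radius and traits are invented for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_send_promo(user, store, radius_m=200, required_trait=None):
    """Send a promo only if the user is near the store AND (optionally)
    matches an advertiser-chosen behavioural trait."""
    close_enough = haversine_m(user["lat"], user["lon"],
                               store["lat"], store["lon"]) <= radius_m
    trait_match = required_trait is None or required_trait in user["traits"]
    return close_enough and trait_match
```

It is that second, behavioural gate – targeting a propensity inferred from browsing habits – that pushes such a service from convenience into the territory of the “should we?” question.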
Facing the privacy concerns of Canadians
Novel technological developments like these underscore the concerns being increasingly expressed by Canadians about privacy and Big Data, concerns which you and your organizations need to take seriously. In a public opinion survey of more than 1,500 Canadians conducted for my Office a year ago, seven in ten said that their personal information has less protection in their daily lives than it did a decade earlier.
As well, the majority (56 per cent) are not confident that they have enough information to know how new technologies affect their personal privacy. That’s the highest vote of no-confidence in this regard since tracking began in 2000.
Perhaps even more germane to our discussion here was a finding in our 2010 opinion survey. More than eight in ten respondents wanted tough sanctions against organizations that fail to properly protect the privacy of individuals – such as publicly naming the offending organizations, fining them, or taking them to court.
As I noted earlier, it’s quite likely that some decision-makers in your organizations will expound on the supposed competitive edge that might be gained through predictive analytics or other uses of Big Data.
Instead they should heed the advice of Bryan Pearson, the Air Miles founder. What consumers must be told, Pearson writes, is what is being done with the personal data that companies collect and how it benefits them as individuals. He continues:
“At this moment we are on course for a grand reckoning. Either the industry will become accountable or legislation will hold it to account.”
I’m afraid that after a decade as Canada’s Privacy Commissioner, I’m not overly optimistic that self-regulation by business will prove effective in avoiding the excesses of Big Data which we have already seen elsewhere. In today’s competitive marketplace, the increased ability to both collect and connect data will drive organizations to innovate in areas of e-commerce, seeking a commercial advantage.
Invariably some of these innovations will careen across the privacy line and gain public notoriety, as happened to Target with its pregnancy prediction algorithm. While I can’t tell you there was a direct effect on Target’s stock price, the resulting public attention couldn’t have been good for the retailer’s public image.
Furthering the call for PIPEDA reform
That said, the prospect of possible brand damage is not – and will not be – enough to protect the privacy rights of Canadians from the vagaries of Big Data. All told, Big Data brings with it big opportunities for organizations and bigger risks for privacy. And organizations wishing to go down this path need appropriate incentives to accept the greater responsibility that comes with it. As it stands today, PIPEDA lacks mechanisms strong enough to ensure organizations invest appropriately in privacy. As a result, consumer trust in the digital economy is at risk.
In this era of Big Data, of rapid technological advances, of increasing power of data analytics and greater movement towards globalization – it is clear that, without amendments, PIPEDA will be even less up to the task of protecting our privacy in coming years. In a position paper published this past May, my Office offered four key recommendations:
- Stronger enforcement powers: Here the options include statutory damages administered by the Federal Court; providing the Privacy Commissioner with order-making powers and/or the power to impose administrative monetary penalties where circumstances warrant. Many jurisdictions in Europe have such powers.
- Breach notification: Require organizations to report breaches of personal information to the Privacy Commissioner and to notify affected individuals, where warranted. Penalties should be applied in certain cases.
- Increasing transparency: Add public reporting requirements to shed light on the use of an extraordinary exception under PIPEDA. This provision allows law enforcement agencies and government institutions to obtain personal information from companies without consent or a judicial warrant for a wide range of purposes.
- Promoting accountability: Amend PIPEDA to explicitly introduce “enforceable agreements” which will help ensure that organizations meet any commitments to improve their privacy practices following an investigation or audit.
Greater transparency and accountability, and the proverbial “bigger stick” when needed, can ensure that the balance intended by PIPEDA remains: a balance that takes into account individuals’ privacy rights and the legitimate needs of organizations to collect and use personal information for reasonable and appropriate purposes.
In short, this would provide greater incentive to ask not just “can we do it” but “should we do it.”