Privacy as a Selling Point: An Ethical Framework for Marketing in the Digital World

Remarks at the Understanding Privacy 2013 Conference

February 28, 2013
Toronto, Ontario

Address by Chantal Bernier
Assistant Privacy Commissioner of Canada

(Check against delivery)


Introduction

I will set the stage for my presentation today by reminding you of how L.L. Bean got his start in the direct marketing business. In 1912, when Leon Leonwood Bean got the idea of selling his iconic boot — or hunting shoe — through mail-order, he went to the registry office and got a list of names and addresses of non-resident Maine hunting license holders.

He knew these people were hunters: they had a license. He knew how to reach them: through the mail. The only thing he was missing was their address: he got that from the registry office.

I think this fits the definition of what the CMA has taught me to call “interest-based advertising” — and it was a hundred years ago.

So for all the collective excitement we may feel about digital, mobile, apps and whatnot, marketers today are interested in the same things Leon was interested in 100 years ago: Who are my customers, what do they want, and how do I reach them?

The fundamentals of marketing don’t really change from one century to the next — and neither do the fundamentals of privacy.

That being said, we all have to learn to live and do business in the digital age in a way that takes into account these established principles in a completely new context.

With this in mind, to address our current and emerging challenges, I think we must take a step back to get our bearings. So today I will touch upon:

  • Privacy fundamentals;
  • How they translate into the digital world; and
  • How we are striving to develop relevant tools and to support legislative developments in this new context.

In the end, I hope to demonstrate how privacy protection is becoming a key competitive factor in the new digital economy, as consumers are increasingly vocal about the so-called “creep factor”, and the giants of the digital economy are striving to demonstrate they are less creepy than their competitors.

Perhaps the first step is to remind ourselves what we mean by “the right to privacy”.

Privacy is the right to control one’s personal information

Essentially, privacy is the right to control one’s personal information.

In fact, we have a visceral need to maintain control over our personal information, because it protects our personal space, our integrity, our reputation, our belongings, and in certain cases even our personal safety. It is a prerequisite to intimacy and to the exercise of fundamental freedoms. Because it is a fundamental right, the importance of privacy transcends the distinction between different types of data holders and the purposes for which the data is being used. This means that, even though privacy in the private-sector context operates within a legal framework of contractual fairness rather than a human rights regime, it cannot be reduced to a mere issue of contractual fairness; it is always more than that.

To me, this characterizes the ethical framework for privacy in a commercial setting and elevates it above other aspects of contractual fairness.  So — really going back to basics — what is personal information?

Personal information is information that can be traced back to an identifiable individual

According to sec. 2 of PIPEDA, “’personal information’ means information about an identifiable individual”.

The Federal Court has ruled that information will be about an identifiable individual where there is a serious possibility that an individual could be identified through the use of that information, alone or in combination with other available information. [Gordon v. Canada (Health), 2008 FC 258]

Personal information is a commodity we trade for specific purposes

Trading in some of your personal information in order to gain something is not a new phenomenon.

  • We hand over personal information to the state in exchange for governance.
  • We hand over personal information to commercial entities to get goods and services delivered to our homes.
  • We let others into our private sphere in exchange for personal relationships.

In each of these cases, we exercise our right to privacy by choosing what personal information to trade and for what purpose.

Privacy transactions are governed by ethical principles

These privacy transactions are governed by ethical principles of:

  • Informed consent;
  • Proportionate return;
  • Transparency of purpose;
  • Security of information.

We generally refer to them as “fair information principles”. And I know you may be thinking “motherhood and apple pie”.

But the IT revolution is challenging these established notions of motherhood and apple pie, bringing an unprecedented context for their application. I will expand on eight specific impacts of the Internet that we see, in our investigations and research, as forcing a redefinition of compliance with the fair information principles.

New business models

The first impact comes from new business models where, instead of the consumer buying a widget from Company A, the consumer takes part in an ecosystem in which she receives free content in exchange for Company A selling ads about Company B to Company C, often with the help of Company D.

For marketers, the potential to get insight into how consumers think and react today is frankly breathtaking. Combining the power of analytics with the Internet is truly turning the art of marketing into a proper science. Thanks to the “department store knows you’re pregnant” headlines of last year, all this is common knowledge now.

However, for all the benefits we gain from analytics, we need to put parameters around their use to ensure that there is no collection, use or disclosure of personal information beyond the purposes that analytics serve.

This takes me to the second impact to take into account for privacy in the digital age — consumer expectations.

Consumer expectations of privacy

Consumers are talking more and more about the “creep factor” as a way of expressing their visceral opposition to an organization that does not treat their personal information in a fair and ethical way. And consumers tend to avoid doing business with creeps; that’s not new.

Our Office conducts a biennial public opinion survey of Canadians, precisely to gauge the importance they give to privacy and what their concerns are. Our most recent survey, the full results of which will be released later this month, shows that:

  • 66% said they are very concerned about the protection of their privacy; and
  • 71% said protecting personal information will be among Canada’s most important issues for the next decade; that’s a rise from 62% just four years ago.

Further research shows there is a direct relationship between the privacy performance of businesses and their customer loyalty. For example, in a study completed earlier this year by global PR firm Edelman:

  • 46% of individuals reported leaving or avoiding companies that had been affected by data breaches;
  • of those who were breach victims, 39% reported telling a friend about the incident, while 29% shared their experience online;
  • just 32% agreed that their privacy is adequately protected by today’s business practices; and
  • 85% said businesses need to take data security more seriously.

So clearly, consumer privacy needs to be built in as an integral part of customer service, whatever the nature of the business. In the Internet age, that means businesses must provide proper IT security and control the use and disclosure of information online.

Definition of personal information

A third impact comes from the emergence of new forms of personal information that challenge the existing definition of personal information.

Just look at the raging debate over the last year around Bill C-30, the so-called “Internet surveillance bill”. Is an IP address personal information? What about a MAC address, a UDID or device fingerprinting?

In as many as three distinct investigations into the practices of two ISPs and one email service provider, our Office has determined that a customer’s IP address constitutes personal information if it can be linked to an identifiable individual.

As technology gains the potential to drill down much deeper and to capture greater quantities of information through cellphone data or Internet search records, we need to apply privacy principles in a way that responds to the specific challenges this new technology brings.

Informed consent online

A fourth impact of the Internet is on the assurance of meaningful consent within the complexity of the online world.

Schedule 1 of PIPEDA is clear: “The knowledge and consent of the individual are required for the collection, use, or disclosure of personal information, except where inappropriate.” And organizations shall make an effort to ensure that this consent is meaningful by making clear how the information will be used or disclosed.

As our relevant investigations have shown, the Internet brings its own specific challenges on meaningful consent.

Let me explain by borrowing an analogy from Dr. Alessandro Acquisti, an expert in behavioural economics and the economics of privacy at Carnegie Mellon University.

Dr. Acquisti explains the challenges of transposing meaningful consent from the physical world to the online world through what he calls the abstraction of the Internet.

He gives the example of being in a hotel room, preparing to turn in. If we found ourselves standing in front of a window, we would have the reflex of closing the blinds, because our senses can perceive the level of exposure.

To the contrary, sitting in a closed room, alone in front of a computer screen, although our level of exposure may be a million times what it would be in front of a hotel room window, our senses fool us into an impression of intimacy. I’m all alone, I see no one, so no one sees me.

So we still have not fully understood our level of exposure online and that undermines meaningful consent in our sharing of information online.

A case in point is the Facebook Timeline affair of last fall: as users were getting migrated from the “Wall” model to the “Timeline” model, some of their old wall postings were resurfacing.

People were shocked to read these old posts, and several could not accept that they had indeed posted this content on their home page rather than in a personal message. The level of exposure had been the same all along, but it became far more apparent in 2012.

Consent as a condition of service

This brings me to the fifth impact — what about the rule prohibiting consent as a condition of service when the service is free and funded by ads served on the basis of personal information?

In our very first statement on online advertising, in the Facebook investigation of 2009, we made it clear that the legitimacy of online advertising had to be judged in the context of a free service that could only be funded by advertising, and that the consumer should expect this.

However, we draw a line at:

  1. Disclosing personal, individual information to advertisers without consent; and
  2. Tracking consumers across the Web for the purpose of serving ads and making this a condition of service. This is one of the key points of our Office’s guidance and policy position on online behavioural advertising, and it has come up in high profile investigations.

According to PIPEDA (Principle 4.3.3 of Schedule 1), an organization shall not, as a condition of service, require an individual to consent to the collection, use, or disclosure of information beyond that required to fulfill the explicitly specified and legitimate purposes.

In our investigation into the youth-oriented social networking site Nexopia, which was made public last year, we similarly determined that the website’s use of personal information for advertising purposes and its serving of interest-based advertisements were acceptable as a condition of service, provided individuals were made fully aware of how this practice works.

However, Nexopia also allowed third parties such as advertising networks to place cookies in the browsers of users and visitors to its site in order to collect their information. This was not reflected in the site’s privacy policy, and users were not provided with a means to opt out.

We recommended that Nexopia better inform its users of this practice and provide a means to opt out; the site did agree to implement these recommendations.
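
To make the mechanics a little more concrete, here is a minimal, hypothetical sketch of how a website could gate third-party advertising scripts on a recorded opt-out preference. It is an illustration of the general idea only; it does not describe Nexopia’s systems or prescribe a particular implementation, and the cookie name and script URL are invented.

```typescript
// Illustrative sketch only: a hypothetical way a site could honour an
// opt-out preference before loading third-party advertising scripts.
// The cookie name, script URL and function names are invented for
// illustration; they are not drawn from the Nexopia investigation.

const OPT_OUT_COOKIE = "oba_opt_out";

function hasOptedOut(): boolean {
  // Look for an "oba_opt_out=1" cookie set when the user opts out.
  return document.cookie
    .split(";")
    .map(c => c.trim())
    .some(c => c === `${OPT_OUT_COOKIE}=1`);
}

function recordOptOut(): void {
  // Persist the choice for one year, scoped to the whole site.
  document.cookie = `${OPT_OUT_COOKIE}=1; max-age=${60 * 60 * 24 * 365}; path=/`;
}

function loadAdNetworkScript(): void {
  // Only reached when the user has NOT opted out.
  const script = document.createElement("script");
  script.src = "https://adnetwork.example.com/tag.js"; // hypothetical URL
  script.async = true;
  document.head.appendChild(script);
}

// Gate the third-party tracker on the user's recorded choice.
if (!hasOptedOut()) {
  loadAdNetworkScript();
}
```

The point of the sketch is simply that an opt-out is only meaningful if the site actually checks the user’s choice before any third-party collection begins.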

This brings me to the sixth impact — youth as direct consumers and the specific challenge of ensuring informed consent of youth.

Informed consent and youth

In our investigation of the Nexopia website, we also examined consent with regard to its target audience—teenagers.

The complainant in this matter claimed that the language used in Nexopia’s Privacy Policy needed to be understandable and appropriate for youth, if meaningful consent was to be obtained.

And we agreed. In fact, we made the point that the notion of meaningful consent must be interpreted in relation to the audience to be properly implemented.

We recommended:

  • that the site review its privacy policy and other site material to ensure that they are presented in a language and format that is appropriate for its user base;
  • and that the site develop ways to inform users of its personal information handling practices that are appropriate for its young audience, and to require users to actively consent to the purposes of collection at the time of registration.

The seventh concept whose application is affected by the move online, particularly in relation to online advertising, is transparency.

Transparency and third parties

Those of you who are familiar with Schedule 1 of PIPEDA know that openness is one of its 10 principles. Clause 4.8 states that “an organization shall make readily available to individuals specific information about its policies and practices relating to the management of personal information”.

How does this translate into business models where third parties—and often, several layers of third parties—are entrusted with users’ personal information for the purpose of serving ads?

Our Office had an opportunity to reflect upon this recently in the course of our study of personal information leakage on popular Canadian websites.

Our testing found that some sites were disclosing information such as email addresses, user names and location to a number of analytics and marketing firms, in some cases not in keeping with statements made in the organizations’ privacy policies.

Mostly, it boiled down to a matter of transparency: how clearly was the company indicating to the consumer that it was disclosing personal information to third-party advertisers?

If the company was being clear about it, and the disclosure was made with meaningful consent, then we accept that the consumer may even benefit from it. But if not, then personal information was disclosed without consent, in contravention of PIPEDA.

Safeguards

Finally, the move to digital impacts on the notion of safeguards. PIPEDA requires that personal information be protected by security safeguards appropriate to the sensitivity of the information, and that an organization’s security safeguards shall protect personal information against loss or theft, as well as unauthorized access, disclosure, copying, use or modification.

Transmission security was one of the aspects of our recently concluded investigation into WhatsApp, which we conducted in collaboration with the Dutch Data Protection Authority. We looked into public reports that account confirmation messages were being sent by WhatsApp using ordinary web traffic ports, allegedly without encryption or safeguards. Our investigation confirmed that upon becoming aware of the potential for a security breach in registration (prior to our investigation), WhatsApp took measures to correct the problem.

Safeguards are also of particular relevance to the massive data breaches that are coming to light with alarming frequency. In conversation with our technologists on the heels of a breach that occurred in early 2011 (the first year to be dubbed “the year of the breach”…), it became clear to me that data security is an obligation of means, not of results. To borrow from FBI Director Robert S. Mueller, “there are only two types of companies: those that have been hacked and those that will be.”

But breaches are avoidable to a certain extent, and breach response demonstrates an organization’s commitment to privacy.

LinkedIn gave us an example both of how an organization must strive always to meet the highest levels of safeguards and of how an accountable organization responds to breaches.

When the breach occurred, we entered into a dialogue with LinkedIn and identified areas where they could have had higher safeguards. In fact, they proceeded to implement them immediately.

They kept us apprised of their breach response step by step, confidentially but openly, demonstrating how they were doing everything right to close the breach and enhance the safeguards.
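
The specific measures we discussed with LinkedIn are not detailed here, but to illustrate the kind of safeguard at stake in password-related breaches, the sketch below shows salted, deliberately slow password hashing using Node’s built-in crypto module. This is an illustrative example with example parameters; it is not a description of LinkedIn’s systems or of our recommendations.

```typescript
// Illustrative sketch only: salted, slow password hashing with Node's
// built-in crypto module. Not a description of any company's actual
// safeguards; parameters are examples, not recommendations.
import { randomBytes, scryptSync, timingSafeEqual } from "crypto";

// Hash a password with a per-user random salt.
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return `${salt}:${hash}`; // store both parts together
}

// Verify a login attempt against the stored salt:hash pair.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = scryptSync(password, salt, 64);
  return timingSafeEqual(candidate, Buffer.from(hash, "hex"));
}

// Usage: store only the salted hash, never the plaintext password.
const stored = hashPassword("correct horse battery staple");
console.log(verifyPassword("correct horse battery staple", stored)); // true
console.log(verifyPassword("wrong guess", stored));                  // false
```

Storing only a salted, slow hash, rather than the password itself, limits the damage when a credential database is exposed.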

The “light fall-out”, shall we say, of the LinkedIn breach demonstrates that accountability for protecting personal information has much greater value to organizations than simply complying with privacy laws. It’s good business practice too. Customers and clients are likely to develop greater trust in a privacy accountable organization, which provides the organization with a competitive edge.

In fact, that is exactly the underpinning of PIPEDA as stated in section 3: to balance the right to privacy of individuals with the need of organizations to collect personal information for appropriate purposes.

This brings me to a few remarks about the regulatory framework.

The regulatory framework

Embedded in the official title of PIPEDA is the requirement to promote electronic commerce. We are always mindful of this goal.

We endeavour to understand new business models:

  • in order to provide useful guidance to businesses; and
  • in order to provide relevant advice to Parliament.

The strategic outcome is a marketplace that is fair for both businesses and consumers, where privacy is a key competitive factor.

In this third part of my presentation today, I will say a few words about recent guidance we issued on online behavioural advertising (OBA) and app development that may be relevant to you or your clients, and give you an update on the implementation of Canada’s Anti-Spam Legislation (CASL) and possible PIPEDA amendments.

Online behavioural advertising

The OPC published a Policy Position and Guidelines on OBA in June 2012. Both were developed to help businesses ensure their practices are fair, transparent and in compliance with PIPEDA.

User choice:

  • Purpose of collection must be explained to users in a clear and transparent manner;
  • Collection must be done with the knowledge and consent of users;
  • Opt-out can be considered reasonable under certain circumstances.

Restrictions:

  • Techniques that cannot be controlled by the individual (e.g. zombie cookies, super cookies);
  • Tracking youngsters. It is important to recognize that the CMA has taken an early and strong stand against tracking children.

Good privacy practices for developing mobile apps

In October of 2012, we issued Seizing Opportunity: Good Privacy Practices for Developing Mobile Apps, a document developed jointly with our counterparts in Alberta and British Columbia.

We issued this guidance to help app developers in Canada address the unique characteristics of the mobile environment which we believe create special challenges for protecting privacy.

Among the challenges specific to mobile devices are:

  1. the potential for comprehensive surveillance of individuals;
  2. the fact that the already demanding task of conveying meaningful information about privacy choices is compounded by a mobile device’s small screen and intermittent user attention; and
  3. the lightning speed of the app development cycle and the potential to reach hundreds of thousands of users within a very short period of time.

Our guidance document puts forward five key privacy considerations for mobile app developers:

  1. You are accountable for your conduct and your code;
  2. Be open and transparent about your privacy practices;
  3. Collect and keep only what your app needs to function, and secure it;
  4. Obtain meaningful consent despite the small screen—by layering the information, providing a privacy dashboard, or using graphics;
  5. Communicate notices to users and obtain their consent at the right time—when they download the app, and when they’re using it; in other words, both in advance and in real time (a simple sketch of this idea follows the list).
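
By way of illustration only, and not as part of the guidance document itself, here is a minimal sketch of the general shape of points 4 and 5: a short, layered notice presented at the moment a feature actually needs the data, with the user’s decision recorded. The types, function names and URL are hypothetical stand-ins, not a real mobile API.

```typescript
// Illustrative sketch only: the general shape of "just-in-time" consent,
// where the user is asked at the moment a feature actually needs the data.
// The types and prompt function are hypothetical, not a real mobile API.

type ConsentDecision = "granted" | "denied";

interface ConsentRecord {
  purpose: string;          // why the data is collected, in plain language
  decidedAt: Date;          // when the user decided
  decision: ConsentDecision;
}

const consentLog = new Map<string, ConsentRecord>();

// Hypothetical UI hook: show a short, layered notice with a link to the
// full privacy policy, and return the user's choice.
async function promptUser(shortNotice: string, policyUrl: string): Promise<ConsentDecision> {
  console.log(`${shortNotice} (full details: ${policyUrl})`);
  return "granted"; // stand-in for a real dialog result
}

// Ask only when the feature is about to use the data, and remember the answer.
async function requestConsent(purpose: string): Promise<boolean> {
  const existing = consentLog.get(purpose);
  if (existing) return existing.decision === "granted";

  const decision = await promptUser(
    "We use your location to show nearby stores. Allow this?",
    "https://example.com/privacy" // hypothetical URL
  );
  consentLog.set(purpose, { purpose, decidedAt: new Date(), decision });
  return decision === "granted";
}

// Usage: the nearby-stores feature asks at the moment it needs location.
async function showNearbyStores(): Promise<void> {
  if (await requestConsent("location-for-nearby-stores")) {
    // ...fetch and display nearby stores here...
  } else {
    // Degrade gracefully: the feature still works without location.
  }
}
```

The design point is that the request for consent travels with the feature that needs the data, rather than being buried in a policy the user saw once at installation.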

Canada’s Anti-Spam Legislation (CASL)

Moving now to the latest recruit to the privacy legal framework: Canada’s Anti-Spam Legislation (CASL).

I know that advertisers, marketers and public relations professionals have been closely monitoring the progression of CASL and of its regulations. Therefore, you are aware that enforcement duties are shared between the CRTC, the Competition Bureau and the OPC.

While our area of responsibility with regard to enforcing CASL is confined to email address harvesting and spyware, as the guardians of Canadians’ right to privacy, we are of course in favour of eradicating spam. An anti-spam law is essential to combat criminal activity and to address vulnerabilities. In order to be effective, the application of such a law must distinguish between fair use of electronic communication and unwarranted use of electronic communication.

We agree that exemptions are needed in order to make that distinction. However, we cannot agree with specific exemptions that would curtail the fundamental aim of anti-spam legislation, which is to respond to Canadians’ justified demands to be protected from spam.

Possible amendments to PIPEDA

One final point I wish to touch upon with you today is an update on the review of PIPEDA.

Bill C-12, which is currently before Parliament, would bring a form of breach notification requirement across Canada. However, these proposed changes stem from recommendations made to parliamentarians back in 2006. When the government first made these proposals, our Office viewed them as a good first step. A lot has changed over the years — and the data breach reporting proposals of C-12 are simply falling behind the times.

In recent years, we’ve seen — and we continue to see — very serious, large-scale breaches. And C-12 may no longer be sufficient to create the kind of incentives we need to ensure that organizations take data security more seriously.

In light of this reality, serious consideration should be given to putting more teeth into the current proposals. And this is why we are encouraging the federal government to explore real enforcement options to create stronger incentives for organizations to adequately protect personal information.

Privacy as a competitive advantage

In conclusion, I hope to have given you a better understanding of our Office’s expectations with regard to your industry. And I would like to close on the competitive advantage of protecting privacy. When I see things like Microsoft’s online “Gmail Man” campaign, the sole aim of which is to convince people that Microsoft is more mindful of its clients’ privacy than the next giant, I am comforted in my long-standing belief that privacy is a selling point. Consider the significant evolution of Facebook’s public discourse surrounding user privacy over the course of its few years of existence. Consider how the world’s largest social network now makes a selling point out of protecting its users’ privacy.

The continued evolution of the digital economy has had a significant impact on the import of privacy in today’s world. The interconnected nature of our consumer and societal dealings has resulted in ever-increasing challenges for individuals trying to maintain control over their personal information; information that has become increasingly commoditized and counted as a key asset in many contemporary business models. It is an economic axiom that scarcity of an asset increases its value, and that is what we are seeing with privacy—whether in its individual or aggregate form, personal information is more valued than ever by Canadians and businesses alike.

Consequently, more than ever, privacy matters for Canadians.  The care, manner and transparency with which a business treats personal information have become a key consideration for Canadians in deciding who to do business with and where to spend their money.  Rather than a hindrance, this needs to be viewed as a clear opportunity for businesses. 

Those that take care in their collection, use and disclosure of personal information will gain a competitive advantage over those that do not.
