This 9 September 2024 decision by Canada’s Federal Court of Appeal is the latest in a long, international line of decisions resulting from the infamous use of Facebook users’ personal data by the app “thisisyourdigitallife” (TYDL) and Cambridge Analytica (to which TYDL sold the data), and from Facebook’s Privacy practices at the time.
References in square brackets are to paragraphs in the decision.
The decision may not be wholly surprising, but it’s a goldmine of reminders and clarifications about some of the key tenets of Privacy law, and on the legal impact of public statements on your own legal case…
There was, respectfully, considerable probative evidence that bore on the questions before the Court, including: the Terms of Service and Data Policy, the transcript of Facebook’s Chief Executive Officer, Mark Zuckerberg’s testimony that he “imagine[d] that probably most people do not” read or understand the entire Terms of Service or Data Policy, that 46% of app developers had not read the Platform Policy or the Terms of Service since launching their apps, that TYDL’s request for information was beyond what the app required to function, and the decision to allow TYDL to continue accessing installing users’ friends’ data for one year in the face of “red flags” regarding its non-compliance with Facebook’s policies.
Here are our 8 key takeaways!
Consent plays a key role in Canada’s PIPEDA. Consent has to be meaningful, and there’s a ‘double reasonableness test’.
As the Court noted [29]: ‘Meaningful consent is described in clause 4.3 of Schedule 1 of PIPEDA as “Principle 3”.’ [emphasis ours]
PIPEDA 4.3.2: The principle requires “knowledge and consent”. Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used. To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed.
PIPEDA 6.1: For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.
The Court recognised the above ‘double reasonableness’ test in PIPEDA [71]. And it drew the only logical conclusion about which limb of the test is decisive, the individual’s ability to understand the information [72]:
Put more simply, if the reasonable person would not have understood what they consented to, no amount of reasonable efforts on the part of the corporation can change that conclusion.
Data protection laws strike a balance between the rights of businesses to process personal data and the rights of the individuals concerned – it’s in the very title of the EU GDPR. This Canadian decision echoes European decisions holding that, in that balance, the rights of the individual generally win over the rights of the business, and it makes the point crystal clear [62, emphasis ours]:
[PIPEDA] speaks of a corporation’s need for information. It does not speak of a corporation’s right to information. This is critical. The legislation requires a balance, not between competing rights, but between a need and a right.
UK and EEA practitioners are used to considering the reasonable expectations of individuals, particularly in the context of deciding if legitimate interests is an appropriate legal basis, in evaluating transparency and fairness, and in evaluating compatible purposes.
As the Canadian court noted, PIPEDA clause 4.3.5 expressly states that an individual’s reasonable expectations are relevant in judging the boundaries of consent [emphasis ours]:
In obtaining consent, the reasonable expectations of the individual are also relevant. For example, an individual buying a subscription to a magazine should reasonably expect that the organization, in addition to using the individual’s name and address for mailing and billing purposes, would also contact the person to solicit the renewal of the subscription. In this case, the organization can assume that the individual’s request constitutes consent for specific purposes. On the other hand, an individual would not reasonably expect that personal information given to a health-care professional would be given to a company selling health-care products, unless consent were obtained. Consent shall not be obtained through deception.
GDPR does not contain such express language – it only refers to reasonable expectations when talking about legitimate interests in Recital 47 and compatible purposes in Recital 50, but not the related Articles.
Whether, in practice, this wording in PIPEDA extends the theory of reasonable expectations beyond its use in GDPR is a topic for another blog!
In a lesson applicable around the world, the decision [74 to 75] reminds us to consider whether there are different populations of individuals within the same overall activity and, if so, to consider the journey of each population – and therefore the correct Privacy notices to provide to each, and how and when in the activity to provide them [emphasis ours]:
Here, the circumstances of consent differed between two groups of Facebook users whose data was disclosed: users that downloaded third-party apps, and friends of those users. … Only those who installed the third-party apps, and not their friends, were given the opportunity to directly consent to TYDL’s (or other apps’) use of their data upon review of the app’s privacy policy. …
This distinction between users and friends of users is fundamental to the analysis under PIPEDA. The friends of users could not access the GDP process on an app-by-app basis and could not know or understand the purposes for which their data would be used, as required by PIPEDA.
Facebook had therefore failed to collect consent from the ‘friends’ category. But the Court went further: Facebook had failed to obtain meaningful consent from both populations.
Sprawling, global, multi-brand, multi-player platforms such as Facebook naturally struggle to present the right information at the right time, so that users not only can read it but can understand the ways their data may be used. It can be very tempting to present hugely long notices and policies, but length doesn’t mean clarity or accessibility.
Echoing many aspects of the 2021 Irish DPC decision on WhatsApp, the Canadian court’s decision [86] stated:
Terms that are on their face superficially clear do not necessarily translate into meaningful consent. Apparent clarity can be lost or obscured in the length and miasma of the document and the complexity of its terms. At the length of an Alice Munro short story, the Terms of Service and Data Policy—which Mark Zuckerberg, speaking to a U.S. Senate committee, speculated that few people likely ever read—do not amount to meaningful consent to the disclosures at issue in this case.
This has now been settled in the EEA: you can be a compliant controller or processor and still suffer a breach. Canada’s Federal Court of Appeal put it very pithily [109]:
An organization can be perfectly compliant with PIPEDA and still suffer a data breach. However, the unauthorized disclosures here were a direct result of Facebook’s policy and user design choices. Facebook invited millions of apps onto its platform and failed to adequately supervise them.
If you invite a stampede you can’t claim to be overrun [115]:
It may be true that reading all third-party apps’ privacy policies would have been practically impossible. But, this is a problem of Facebook’s own making. It invited the apps onto its website and cannot limit the scope of its responsibilities under section 6.1 and Principle 3 of PIPEDA by a claim of impossibility.
The lesson here is to have careful plans in place that will scale with your success:
[116] ‘… Admittedly, the question before this court is not one of negligence — but it is one of whether Facebook took reasonable steps to protect the data of users that it invited onto its site. This observation has even greater resonance when considered in the context of Facebook’s business model: the more apps, the more users, the more traffic, the more revenue. Having created the opportunity for the data breach, Facebook cannot contract itself out of its statutory obligations.’
In the UK and EEA (despite what the last UK Tory government kept pushing), we’re used to the risk-based nature of GDPR and to using measures that are ‘appropriate’. As the UK ICO notes when discussing data protection by design and by default [emphasis ours]:
You must put in place appropriate technical and organisational measures designed to implement the data protection principles effectively and safeguard individual rights.
There is no ‘one size fits all’ method to do this, and no one set of measures that you should put in place. It depends on your circumstances.
And the Canadian Federal Court of Appeal stressed the same principle [120, emphasis ours]:
This denies the importance of context. While it is true that the normative law applies equally to all, its application varies with the context. Facebook’s business model centres around aggregating information and maintaining users on its platform for the purposes of selling advertising. The raison d’être of Facebook shapes the content and contours of its obligations to safeguard information and to obtain meaningful consent. There are no internal limits or brakes on Facebook’s “need” for information, given what it does with information, the demographic of its clientele, and the direct link between its use of information and its profit as an organization. A car dealership’s “need” for information is entirely different; the nature of the information and its uses are reasonably understandable, predictable and limited. The analogy to a car dealership is inapt.
Those are our 8 key takeaways from this thoughtful decision on PIPEDA. There’s a lot more in the decision, so do have a read!
Take a demo of our award-winning Privacy Management SaaS and see how we help you rapidly implement appropriate Privacy Governance, with instant reporting and transparency, at your organisation.