Much of the 4 July 2023 decision by the European Union’s highest court is worth digging into. Stick with us for a longer read. We’ll start with the key points and work down.
This was about Meta’s ‘off-Facebook’ collection of personal data on Facebook users and the combination of that data with ‘on-Facebook’ personal data (i.e. data about that user within Facebook itself) for personalised advertising and more, all based on the general terms of use of the Facebook service.
‘Off-Facebook’ includes not just collecting personal data on Facebook users when they were using other Meta properties such as Instagram and WhatsApp. It also includes personal data about Facebook users when they visited third party sites and apps through the use of programming interfaces, cookies and similar technologies.
It’s very important to bear this actual processing in mind when you hear or read about the decision. It wasn’t simply about targeted advertising in and of itself. It was about collecting off-Facebook data and combining that with on-Facebook data to analyse for personalised advertising and more.
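To make that ‘off-Facebook’ mechanism concrete, here’s a minimal, purely illustrative sketch in browser-side TypeScript of how tracking code embedded in a third-party site can report a visit back to a platform. The endpoint, cookie name and payload fields below are invented for illustration only; this is not Meta’s actual Pixel or API.

```typescript
// Hypothetical sketch only: a generic third-party tracking call, not Meta's
// actual Pixel or API. Endpoint, cookie name and payload fields are invented.

// Read a (hypothetical) identifier cookie previously set in this browser.
function getCookie(name: string): string | undefined {
  return document.cookie
    .split('; ')
    .find((c) => c.startsWith(`${name}=`))
    ?.split('=')[1];
}

// Report a page view on this third-party site to a (hypothetical) collection
// endpoint, attaching the identifier so the platform can join the visit to
// that person's 'on-platform' profile data.
async function reportPageView(): Promise<void> {
  const userId = getCookie('_example_uid'); // invented cookie name
  await fetch('https://tracker.example.com/events', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      event: 'PageView',
      url: window.location.href, // which third-party page was visited
      userId,                    // the join key back to the platform profile
      timestamp: Date.now(),
    }),
  });
}

reportPageView().catch(console.error);
```

The legally significant step is that join: an identifier set by the platform lets activity on unrelated sites and apps be linked back to the same person’s account and combined with their on-platform data.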
It’s shocking but true: GDPR isn’t the only law, and it doesn’t exist in a vacuum. That’s obvious when you think about it, but it’s worth saying, particularly with current debates about AI and data scraping (intellectual property, anyone?).
Unlike all the Meta cases and decisions we’ve been reading recently (did you know Ireland’s DPC has fined Meta €2.5bn in the last 3 years?), this case didn’t start life as a data protection case. It began as a competition law case before the German Federal Cartel Office, the Bundeskartellamt.
We’ve put a short primer on the relevant competition law at the end of this blog for those interested. All quotations here are from the English version of the CJEU decision.
Because this is a competition law case, GDPR’s one-stop shop doesn’t apply, so the German Federal Cartel Office brought its case against Meta USA, Meta Ireland and Facebook Germany.
If this had been a GDPR case, it would most likely just involve Meta Ireland, as Facebook’s main establishment in the EU. Because this is competition law, the rules are different and the case is against all three entities.
The CJEU gave unsurprising but much needed clarity on special categories. While we’ll just talk about Facebook here to keep it simple, the CJEU made its decision applicable generally to social media companies, not just Facebook.
First, we need to set the scene. Imagine a Facebook user visits a website or app which, by its nature, is all about special categories. The referring court gave examples ‘such as flirting apps, gay dating sites, political party websites or health-related websites‘.
Now, in that context, the CJEU clarified certain aspects for us.
If the Facebook user enters details into, or places an order with, that other website or app and Meta gets data on that interaction through its interfaces, cookies or similar, that data is special category data if the processing allows information falling within Art 9 ‘to be revealed’.
Nothing surprising, but still helpful. See paras 155(2) and 68-73 of the decision for more.
The CJEU also gave us two main examples, with some guidance, on when a user has ‘manifestly made public’ their special category data.
In practice, this means the presumption is against data being manifestly made public unless Facebook could prove that the user had been able, with full knowledge of the facts, to choose settings such as ‘make public’ or ‘friends only’ and had knowingly chosen to make it fully public for all to see.
Again, this all makes sense though it is very helpful to have it set out in a decision, particularly for those working in Marketing. See paras 155(3) and 78-85 of the decision for more.
The CJEU ran through all the other legal bases in Art 6 GDPR, with particular emphasis on contract and legitimate interests given the case. While it did so with its usual hospital pass to the referring court, the CJEU made its position very clear.
You’ll know that there are 6 legal bases in Art 6 GDPR and, apart from consent, the processing needs to be necessary for the legal basis to be available.
There’s a body of case law on the meaning of ‘necessary’, and analysing this decision in the context of that case law is for another day – no doubt much will be written on this topic.
What we will say is that the CJEU seemed to give two different definitions of ‘necessary’ that mix aspects together, that do make sense, and that sit within the scope of that case law.
In practice – which is our focus – there’s not much that’s surprising.
The CJEU defined ‘necessary’ in the context of ‘necessary for the performance of a contract to which the data subjects are party’ in Art 6(1)(b) as meaning that:
the processing is objectively indispensable for a purpose that is integral to the contractual obligation intended for those users, such that the main subject matter of the contract cannot be achieved if that processing does not occur
See paras 155(4) and 98-99 and 125 of the decision for more.
The two elements the referring court asked the CJEU to look at under ‘necessary for contract’ were the personalisation of content and the consistent and seamless use of the Meta group’s own services.
The CJEU rapidly confirmed that Meta’s processing in question for each of these purposes was not necessary for the contract, subject to verification by the referring court, noting on the second point: ‘The various products and services offered by that group can be used independently of each other and the use of each product or service is based on the conclusion of a separate user agreement.’
See paras 101 to 104 of the decision for more.
In the context of ‘necessary for the legitimate interests of the controller’ under Art 6(1)(f), the CJEU defined ‘necessary’ as meaning that:
the legitimate data processing interests pursued cannot reasonably be achieved just as effectively by other means less restrictive of the fundamental rights and freedoms of data subjects, in particular the rights to respect for private life and to the protection of personal data guaranteed by Articles 7 and 8 of the Charter [and the processing has to satisfy the data minimisation principle]
See paras 108 and 109 of the decision for more.
Whether your processing is ‘necessary’ for your legitimate interests depends on which legitimate interest you’re pursuing and on the results of the balancing assessment (the ‘legitimate interests assessment’ or ‘LIA’) to see if your legitimate interest is outweighed by the data subject’s interests.
The CJEU ran through 4 main legitimate interests named by the referring court (the referring court had named 8, but the CJEU didn’t look at two as there was no supporting explanation by the referring court, and the CJEU seemed to forget about two others).
On relying on personalised advertising as a legitimate interest, the CJEU said an emphatic no.
While Recital 47 GDPR says direct marketing may be a legitimate interest, the CJEU decided that such processing for personalised advertising, without the user’s consent, is not within the reasonable expectations of the data subject, so the LIA falls in the data subject’s favour.
From public LinkedIn posts, we understand NOYB has carried out a survey on this, which backs up the CJEU’s statement. It will be interesting to see what other surveys come out of the woodwork.
In this regard, it is important to note that, despite the fact that the services of an online social network such as Facebook are free of charge, the user of that network cannot reasonably expect that the operator of the social network will process that user’s personal data, without his or her consent, for the purposes of personalised advertising.
On top of that, in para 188, the CJEU stated clearly how massive a scope the processing at issue has, calling it ‘particularly extensive since it relates to potentially unlimited data and has a significant impact on the user, a large part – if not almost all – of whose online activities are monitored by Meta Platforms Ireland, which may give rise to the feeling that his or her private life is being continuously monitored’.
Takeaway: Para 117’s drafting is not limited to the processing in question, and doesn’t call for referring court verification, so this looks like a straightforward CJEU statement that personalised advertising needs consent and cannot be based on legitimate interests. This will cause lots of discussion.
On product improvement as a legitimate interest, the CJEU said ‘possible, but doubtful here’.
While product improvement ‘cannot be ruled out’ as a legitimate interest, the CJEU noted the processing must be necessary and, given the scale and intrusion already noted in the processing in question, it’s doubtful Meta could rely on it here, particularly if there’s a child’s data involved.
This makes sense and the reference to a child’s data is a major practical point: if there’s one child’s data in a dataset (or one piece of special categories) and you cannot separate that out, in practice you’ll need to treat the entire dataset and processing activity as being 100% children’s data (or special categories).
See paras 122-123 of the decision for more.
Takeaway: Product improvement is a legitimate interest but whether it’s one that you can hang your processing on depends on passing an LIA including looking at necessity, scale, impact and vulnerable data subjects.
On public task and official authority, the CJEU basically said ‘are you serious?’
The purposes put forward for processing based on public task or official authority were carrying out research for the social good and promoting safety, integrity and security.
While no material had been provided by the referring court here, the CJEU gave these short shrift for the processing in question – remember, this is about collecting off-Facebook data and combining it with on-Facebook data about Facebook users.
The CJEU instructed the referring court that this was a possible legal basis but unlikely to be available given, again, the scale and impact of the processing and whether such processing was ‘strictly necessary’. But the nail in the coffin was that the CJEU instructed the referring court to consider
whether Meta Platforms Ireland was entrusted with a task carried out in the public interest or in the exercise of official authority, in particular with a view of carrying out research for the social good and to promote safety, integrity and security, bearing in mind that, given the type of activity and the essentially economic and commercial nature thereof, it seems unlikely that that private operator was entrusted with such a task.
See paras 127-139 of the decision for more.
Takeaway: It looks like the CJEU was saying that a controller has to be entrusted with a public task or exercising official authority, and this will be interpreted strictly, as with all 5 bases that need necessity.
On sharing data with law enforcement to prevent and prosecute crime, again the CJEU basically said ‘are you serious?’
it must be held that that objective is not capable, in principle, of constituting a legitimate interest pursued by the controller, within the meaning of [Art 6(1)(f)]. A private operator such as Meta Platforms Ireland cannot rely on such a legitimate interest, which is unrelated to its economic and commercial activity.
The CJEU did say that Meta might be able to do this if it were under a legal obligation to do so, but that also seems very unlikely given the court’s comments on that legal basis.
See para 124 of the decision for more.
Takeaway: Nothing surprising here given the processing in question. It simply can’t be right that Meta was, off the bat, collecting and combining off-Facebook data with on-Facebook data to share with law enforcement to prevent crime. That’s a huge over-reach, reminiscent of the telecoms data retention directive that the CJEU held invalid for breaching the EU Charter of Fundamental Rights.
The Federal Cartel Office had decided consent wasn’t an appropriate legal basis and, while the case was mostly about the legal bases of necessary for contract and necessary for legitimate interests, the referring court also had some questions for the CJEU on consent.
The CJEU was clearly sceptical of the applicability of consent here, and had particular concerns over free choice, transparency and notice, separation of consents, and consent when unrelated to contract. But the CJEU dealt with three key areas.
The CJEU clarified that, just because a controller has a dominant position under competition law, that does not automatically invalidate consent as a legal basis under GDPR. However, it is an important factor in determining whether any consent was valid, and particularly whether it was freely given.
See paras 147 & 154 of the decision for more.
The CJEU held that it was ‘appropriate’ to obtain separate consents for the processing of on-Facebook data and off-Facebook data.
And the Court went so far as to hold that, to the extent that consent to the processing of off-Facebook data cannot be given separately, it must be presumed that that consent is not freely given.
See para 151 of the decision for more.
The CJEU had already decided (see paras 102-104) that the processing at issue is not strictly necessary for the performance of the contract between Meta and the Facebook user (subject to referring court verification).
And in the already-famous para 150, the Court said:
Thus, those users must be free to refuse individually, in the context of the contractual process, to give their consent to particular data processing operations not necessary for the performance of the contract, without being obliged to refrain entirely from using the service offered by the online social network operator, which means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations.
On balance, at present, and with all the usual forecasting caveats, we do not think this is groundbreaking. Nor do we think it will open the floodgates to fearsome practices or obligations.
Regulatory guidance going back to the Art 29 Working Party days has consistently said that consent is needed for behavioural advertising (see Opinion 2/2010 on online behavioural advertising). And this case isn’t simply about personalised advertising because of the use of a particular website or app: this is about Meta gathering data about Facebook users from Meta’s other services and from the digital estates of third parties, and combining that with the data on that user from within Facebook.
As the AG and CJEU noted, you can use one of Meta’s services without needing to use the others. And we have to agree: you don’t need to have your personal data from third party sites combined with your Facebook data in order to use Facebook.
The CJEU’s comments on the other legal bases were less newsworthy, but still interesting.
On legal obligation, unsurprisingly, the CJEU said ‘maybe, but …’.
Again, no material had been provided by the referring court here. The CJEU stated that this legal basis is only available when the processing is ‘actually necessary for compliance with a legal obligation’ to which Meta is subject [para 155(6)] and this requires an EU or Member State law to which the controller is subject and which, under Art 6(3) GDPR: ‘shall meet an objective of public interest and be proportionate to the legitimate aim pursued’.
As well as taking into account the scale and impact of the processing, and whether that processing was ‘strictly necessary’, the CJEU asked the referring court to consider ‘whether Meta Platforms Ireland is under a legal obligation to collect and store personal data in a preventive manner in order to be able to respond to any request from a national authority seeking to obtain certain data relating to its users’.
It looks pretty clear that they’re not.
See paras 127-139 of the decision for more.
Takeaway: If you’re going to rely on legal obligation, you want to point to a law that you’re subject to, that meets an objective of public interest and that is proportionate to the legitimate aim pursued.
On vital interests, the CJEU effectively said ‘no, but …’.
While this is of course a legal basis set out in GDPR, the question was whether it applied to the processing in question. The CJEU left it to the referring court with a strong indication of the CJEU’s feelings:
in view of the nature of the services provided by the operator of an online social network, such an operator, whose activity is essentially economic and commercial in nature, cannot rely on the protection of an interest which is essential for the life of its users or of another person in order to justify, absolutely and in a purely abstract and preventive manner, the lawfulness of data processing such as that at issue in the main proceedings
See paras 136-139 of the decision for more.
On 6 February 2019 – over 4 years ago – the Federal Cartel Office in Germany (the Bundeskartellamt) gave its decision against Meta USA, Meta Ireland and Facebook Germany, prohibiting them from making the use of Facebook by private users in Germany subject, under the general terms, to the processing of their off-Facebook data, and from processing that data without their consent on the basis of those terms.
The Federal Cartel Office also made further orders to give effect to that prohibition.
The Federal Cartel Office said such processing ‘constituted an abuse of that company’s dominant position on the market for online social networks for private users in Germany‘ as defined by German law (see below).
In particular, they decided that the ‘general terms’, as a result of that dominant position, constituted an abuse ‘since the processing of the off-Facebook data that they provide for is not consistent with the underlying values of the GDPR and, in particular, it cannot be justified in the light of Article 6(1) and Article 9(2) of that regulation‘.
In EU competition law, abuse of dominant position is set out in Article 102 of the Treaty on the Functioning of the European Union or TFEU (older readers will have first learned this rule as Article 82 but that’s another story):
‘Any abuse by one or more undertakings of a dominant position within the internal market or in a substantial part of it shall be prohibited as incompatible with the internal market in so far as it may affect trade between Member States.
Such abuse may, in particular, consist in:
(a) directly or indirectly imposing unfair purchase or selling prices or other unfair trading conditions;
(b) limiting production, markets or technical development to the prejudice of consumers;
(c) applying dissimilar conditions to equivalent transactions with other trading parties, thereby placing them at a competitive disadvantage;
(d) making the conclusion of contracts subject to acceptance by the other parties of supplementary obligations which, by their nature or according to commercial usage, have no connection with the subject of such contracts.’
The Federal Cartel Office’s decision was based on German competition law and in particular Paragraph 19(1) and Paragraph 32 of the Gesetz gegen Wettbewerbsbeschränkungen (Law against restrictions on competition).
So can a competition authority look at GDPR at all? Unsurprisingly, yes, it can. The competition authority should liaise with the relevant GDPR supervisory authority if GDPR is involved, and must respect any existing decisions by the DPAs, but there’s no reason it can’t look at GDPR as it does other laws.
And the CJEU confirmed that, when the competition authority does look at GDPR, it’s not acting as a supervisory authority for GDPR.
If you’re still with us, you’re a total Privacy geek! So why not take a demo of Keepabl’s award-winning Privacy Management Software?