This article was first published in Thomson Reuters Regulatory Intelligence on 20 September 2023 and reflects the personal views of the author, Robert Baugh.
Biometrics have long been used in financial services, but there are significant risks under the General Data Protection Regulation (GDPR) and other existing laws. In response to the increasing use of biometrics and the explosion of commercial AI solutions, the UK Information Commissioner’s Office (ICO) recently opened a consultation on its draft guidance on the use of biometrics.
Security is perhaps the main driver of the use of biometrics for identification in financial services. Typical use cases revolve around authentication and authorisation, leading to correct levels of secure, audited access to: financial transactions and account information; devices such as corporate laptops and mobiles; premises such as offices and, within offices, particular data rooms; and data within corporate networks, and subsets of that data.
Many jurisdictions have brought in, or are bringing in, laws on biometrics and artificial intelligence (AI), which is usually used to process biometrics, so firms should check applicable laws. There are already clear examples in Europe and the United States.
In March 2023, a journalist used AI voice tools to breach Centrelink’s voice-identification service in Australia (which did require a reference number, but that number is, for example, printed on correspondence to customers). The point here is that businesses must be confident about the actual security of the solution they implement.
There are plenty of articles on bias in AI, and firms will need to check that their chosen biometrics solution (which will most likely use AI) is not biased, whether because of the training data used, because sections of the population find the chosen solution difficult to use, or on other grounds.
As the UK’s National Cyber Security Centre notes: ‘With fingerprint systems in particular, there is a great deal of operational and experimental evidence to show that older members of the population, and those who work in some industries, tend to produce lower quality fingerprint readings. For some people, the problem can be so severe that they simply cannot use the biometric system.’
There are plenty of reports and studies warning of inaccuracy and bias across a range of biometrics, including difficulties for those with voice disorders.
Compliance teams will know it is crucial to confirm the exact definitions in relevant laws to judge whether those laws are engaged by their chosen use case.
The draft guidance breaks down GDPR’s definition of biometric data into three elements, namely that biometric data:

- relates to a person’s physical, physiological or behavioural characteristics;
- results from specific technical processing relating to those characteristics; and
- allows or confirms that person’s unique identification.
Examples of physical or physiological biometrics from the draft guidance include: ‘a person’s facial features, fingerprints, iris, voice and even their ear shape’.
Examples of behavioural characteristics from the draft guidance include: a person’s handwriting or method of typing, their gait when walking or running, or their eye movements.
‘Specific technical processing’ is clearly met by using biometric software security tools, as they analyse or digitally represent biometric data to compare it to stored values.
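To make that ‘specific technical processing’ concrete, here is a minimal, illustrative Python sketch of the pattern such tools follow: digitally represent a sample as a numeric template, then compare it to a stored value. The extract_template function and the 0.9 similarity threshold are placeholders for illustration only, not any vendor’s actual method.

```python
import numpy as np

# Illustrative sketch only: real products use trained models (face, voice or
# fingerprint encoders). This placeholder stands in for that "specific
# technical processing" step.
def extract_template(raw_sample: np.ndarray) -> np.ndarray:
    """Digitally represent a raw biometric sample as a fixed-length vector."""
    v = raw_sample.astype(float)
    return v / np.linalg.norm(v)  # placeholder: unit-normalise the sample

def verify(raw_sample: np.ndarray, stored_template: np.ndarray,
           threshold: float = 0.9) -> bool:
    """Compare a fresh sample against an enrolled template.

    Cosine similarity against a tunable threshold is a common pattern;
    the 0.9 value here is arbitrary, chosen for illustration.
    """
    candidate = extract_template(raw_sample)
    similarity = float(np.dot(candidate, stored_template))
    return similarity >= threshold

# Enrolment: this produces the "stored value" the guidance refers to.
enrolled = extract_template(np.array([0.2, 0.9, 0.4, 0.1]))

# Authentication attempt with a slightly noisy re-capture of the same trait.
attempt = np.array([0.22, 0.88, 0.41, 0.12])
print(verify(attempt, enrolled))  # True: similarity is above the threshold
```

Because the comparison is made against a threshold, tuning that value trades false accepts against false rejects, which is exactly where the accuracy and bias concerns noted above come into play.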
What is interesting is that, while ‘biometric data’ already has identification baked into its definition, that data is still not yet ‘special categories biometric data’ covered by Article 9 of GDPR — that requires purpose.
Special categories of personal data are set out in Article 9(1) and include (author’s emphasis): ‘biometric data for the purpose of uniquely identifying a natural person’.
The ICO confirms that what turns biometric data into special categories biometric data is a person’s purpose for processing that data, which makes biometrics unique among special categories.
If a firm does wish to process special categories biometric data, it needs an exemption under Article 9(2). The ICO notes that, ‘In most cases, explicit consent is likely to be the only valid condition for processing special category biometric data.’ However, consent is hard to use, particularly when there is a power imbalance such as in an employment relationship.
The ICO’s draft guidance highlights other bases that financial services firms should consider with their legal advisers, such as processing necessary for the ‘prevention and detection of unlawful acts’. The draft guidance states that this condition applies if:

- the processing is necessary to prevent or detect an unlawful act;
- asking for consent would prejudice that purpose; and
- the processing is necessary for reasons of substantial public interest.
Consent can always be withheld or later withdrawn, making it an inappropriate legal basis for strategic use.
Robert Baugh is the founder and CEO of Keepabl, a privacy management SaaS based in London. Prior to Keepabl, Robert was general counsel of technology growth companies for more than a decade.