Biometric data in financial services: existing law, deepfakes, and the UK ICO's draft guidance

Biometrics have long been used in financial services but there are significant risks under the General Data Protection Regulation (GDPR) and other existing laws. In response to the increasing use of biometrics and the explosion of commercial AI solutions, the UK Information Commissioner's Office (ICO) recently opened consultation on its draft guidance on the use of biometrics.

This article was first published in Thomson Reuters Regulatory Intelligence on 20 September 2023 and reflects the personal views of the author, Robert Baugh.



Biometrics in financial services

Security is perhaps the main driver of the use of biometrics for identification in financial services. Typical use cases revolve around authentication and authorisation, leading to the correct levels of secure, audited access to:

  • financial transactions and account information;
  • devices such as corporate laptops and mobiles;
  • premises such as offices and, within offices, particular data rooms; and
  • data within corporate networks, and subsets within that data.
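
To make those use cases concrete, here is a minimal Python sketch of biometric authentication feeding tiered, audited authorisation. It is purely illustrative: the identities, scopes and matching stub are all invented, and a real deployment would rely on a vendor’s matching engine.

```python
# Purely illustrative: identities, scopes and the matcher are invented.
from datetime import datetime, timezone

# Hypothetical enrolled biometric templates and per-person access scopes.
ENROLLED_TEMPLATES = {"alice": b"\x01\x02\x03"}
ACCESS_SCOPES = {"alice": {"transactions", "corporate-laptop", "data-room-3"}}

AUDIT_LOG: list[str] = []

def verify_biometric(user_id: str, probe_template: bytes) -> bool:
    """Stand-in for a vendor matcher; real systems compare feature
    templates against a similarity threshold, not raw bytes."""
    return ENROLLED_TEMPLATES.get(user_id) == probe_template

def authorise(user_id: str, probe_template: bytes, resource: str) -> bool:
    """Authenticate with a biometric, check the resource is in scope,
    and write an audit record either way."""
    allowed = (verify_biometric(user_id, probe_template)
               and resource in ACCESS_SCOPES.get(user_id, set()))
    AUDIT_LOG.append(f"{datetime.now(timezone.utc).isoformat()} "
                     f"user={user_id} resource={resource} allowed={allowed}")
    return allowed

assert authorise("alice", b"\x01\x02\x03", "data-room-3")
assert not authorise("alice", b"\x09\x09\x09", "transactions")
```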


Risks


Regulatory

Many jurisdictions have, or are bringing in, laws on biometrics and on artificial intelligence (AI), which is usually used to process biometrics, so firms should check the laws that apply to them. There are already clear examples in Europe and the United States.

  • Using biometrics in breach of EU/UK GDPR can lead to the highest tier of fines: up to 4% of annual global turnover or 20 million euros (£17.5 million under UK GDPR), whichever is higher (see the arithmetic sketch after this list).
  • If a business is based in the United States, it will need to consider biometric laws in states such as Illinois and Texas. The Illinois Biometric Information Privacy Act (BIPA) allows liquidated damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation (or actual damages if greater), as well as reasonable attorneys’ fees and costs. A recent $228 million BIPA award was overturned in July 2023 and sent back for jury determination.
  • New York City has recently introduced a law on the use of AI in hiring.
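
Because the two regimes scale so differently, a little arithmetic helps. The sketch below encodes only the statutory figures quoted above; the turnover and violation counts are invented for illustration.

```python
# Statutory figures as cited above; inputs are invented examples.

def gdpr_max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Top-tier EU GDPR cap: the higher of 4% of annual global turnover
    or EUR 20 million (GBP 17.5 million under UK GDPR)."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

def bipa_liquidated_damages_usd(negligent: int, intentional_or_reckless: int) -> int:
    """BIPA liquidated damages: $1,000 per negligent violation and $5,000
    per intentional or reckless violation (actual damages, if greater, may
    be claimed instead), before attorneys' fees and costs."""
    return 1_000 * negligent + 5_000 * intentional_or_reckless

# A firm with EUR 2bn turnover faces a cap of EUR 80m, not EUR 20m.
assert gdpr_max_fine_eur(2_000_000_000) == 80_000_000
# BIPA exposure compounds per violation, which is how claims reach the
# hundreds of millions.
assert bipa_liquidated_damages_usd(negligent=10_000, intentional_or_reckless=0) == 10_000_000
```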


Security and deepfakes

In March 2023, a journalist used AI voice tools to breach voice identification at Centrelink in Australia (the service did also need a reference number, but that number is, for example, printed on correspondence to customers). The point here is that businesses should be confident about the actual security of the solution they implement.


Bias

There are plenty of articles on bias in AI, and firms will need to check that their chosen biometrics solution (which will most likely use AI) is not biased, whether because of the training data used, because sections of the population find the chosen solution difficult to use, or on other grounds.

As the UK’s National Cyber Security Centre notes: ‘With fingerprint systems in particular, there is a great deal of operational and experimental evidence to show that older members of the population, and those who work in some industries, tend to produce lower quality fingerprint readings. For some people, the problem can be so severe that they simply cannot use the biometric system.’

There are plenty of reports and studies warning of inaccuracy and bias across a range of biometrics, including difficulties for those with voice disorders.


Legal definitions

Compliance teams will know it is crucial to confirm the exact definitions in relevant laws to judge whether those laws are engaged by their chosen use case.

  • GDPR defines ‘biometric data’ as ‘personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic [fingerprint] data’.
  • For comparison, Illinois’ BIPA defines ‘biometric identifier’ as ‘a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry’ before listing items that are not biometric identifiers. This is a very different definition to GDPR’s.


UK ICO’s draft guidance

The draft guidance breaks GDPR’s definition of biometric data down into three elements (expressed as a simple checklist in the sketch after this list), namely that biometric data:

  1. ‘relates to someone’s behaviour, appearance, or observable characteristics (e.g., the way someone types, a person’s voice, fingerprints, or face);
  2. has been extracted or further analysed using technology (e.g., an audio recording of someone talking is analysed with specific software to detect things like tone, pitch, accents and inflections); and
  3. can uniquely identify (recognise) the person it relates to’.
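
For illustration only, the three elements can be read as a simple conjunctive checklist, as in the hypothetical sketch below. The field names are invented; the legal test is, of course, applied by lawyers rather than code.

```python
# All three ICO elements must be present; field names are invented.
from dataclasses import dataclass

@dataclass
class DataItem:
    relates_to_characteristics: bool  # element 1: behaviour, appearance or observable traits
    technically_processed: bool       # element 2: extracted or analysed using technology
    can_uniquely_identify: bool       # element 3: can recognise the person it relates to

def is_biometric_data(item: DataItem) -> bool:
    """On the ICO's reading, biometric data requires all three elements."""
    return (item.relates_to_characteristics
            and item.technically_processed
            and item.can_uniquely_identify)

# An unanalysed audio recording relates to someone's voice but has not had
# specific technical processing, so it is not yet biometric data.
assert not is_biometric_data(DataItem(True, False, False))
# The same recording, analysed into a voiceprint that can recognise the
# speaker, meets all three elements.
assert is_biometric_data(DataItem(True, True, True))
```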

Examples of physical or physiological biometrics from the draft guidance include: ‘a person’s facial features, fingerprints, iris, voice and even their ear shape’.

Examples of behavioural characteristics from the draft guidance include: a person’s handwriting or method of typing, their gait when walking or running, or their eye movements.

‘Specific technical processing’ is clearly met by using biometric software security tools, as they analyse or digitally represent biometric data to compare it to stored values.
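
As a rough, hypothetical illustration of that comparison step: a probe sample is reduced to a numerical template and compared with the stored enrolment template against a threshold. The templates and threshold below are invented; real feature extraction is vendor-specific.

```python
# Illustrative template matching; vectors and threshold are invented.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two equal-length templates, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.hypot(*a) * math.hypot(*b)
    return dot / norm if norm else 0.0

def matches(probe: list[float], enrolled: list[float],
            threshold: float = 0.95) -> bool:
    """Accept the claimed identity if the templates are similar enough;
    the threshold trades false accepts against false rejects."""
    return cosine_similarity(probe, enrolled) >= threshold

# A probe close to the enrolled template matches; a distant one does not.
assert matches([0.10, 0.90, 0.40], [0.12, 0.88, 0.41])
assert not matches([0.90, 0.10, 0.40], [0.12, 0.88, 0.41])
```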


The importance of purpose

What is interesting is that, while ‘biometric data’ already has identification baked into its definition, that data is still not yet ‘special category biometric data’ covered by Article 9 of GDPR: that requires purpose.

Special categories of personal data are set out in Article 9(1) and include (author’s emphasis): ‘biometric data for the purpose of uniquely identifying a natural person’.

The ICO confirms that what turns biometric data into special category biometric data is a person’s purpose for processing that data, which makes biometrics unique among the special categories.
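
In hypothetical code terms, purpose acts as a switch. The sketch below simply restates Article 9(1); the function and argument names are invented.

```python
# Purpose is the switch; names are invented, restating Article 9(1) only.

def is_special_category_biometric(is_biometric_data: bool,
                                  purpose_is_unique_identification: bool) -> bool:
    """Article 9 is engaged only where biometric data is processed
    'for the purpose of uniquely identifying a natural person'."""
    return is_biometric_data and purpose_is_unique_identification

# The same voiceprint, two purposes: identification engages Article 9;
# on the ICO's reading, a non-identification purpose does not.
assert is_special_category_biometric(True, True)
assert not is_special_category_biometric(True, False)
```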


Help on legal basis

If a firm does wish to process special category biometric data, it needs an exemption under Article 9(2). The ICO notes that, ‘In most cases, explicit consent is likely to be the only valid condition for processing special category biometric data.’ However, consent is hard to use, particularly when there is a power imbalance such as in an employment relationship.

The ICO’s draft guidance highlights other bases that financial services firms should consider with their legal advisers, such as processing that is necessary for the ‘prevention and detection of unlawful acts’. The draft guidance states that this condition applies if:

  • a firm needs to use biometric data for crime prevention or detection purposes; and
  • asking for people’s consent means the firm would not achieve those purposes.

Consent can always be withheld or later withdrawn, making it an inappropriate legal basis for strategic use.


Eight action points

  1. Audit where the business wishes to use, and already uses, biometric data in the organisation anywhere in the world. Identify the purpose of each processing activity and all relevant context (a minimal record sketch follows this list).
  2. Identify relevant, applicable laws.
  3. If the business is covered by UK or EU GDPR, or a modern comprehensive privacy law in a U.S. state, carry out a data protection impact assessment or risk assessment.
  4. Decide on the appropriate legal basis for each of those processing activities:
    1. For example, consent may be the only option in some cases (in which case a different option must be provided to individuals who do not consent) and prevention and detection of unlawful acts may be appropriate in others.
    2. Where the firm does not use consent, document why the processing of special category biometric data is ‘necessary’ for its chosen legal basis (such as the prevention and detection of crime and for reasons of substantial public interest).
  5. Document how the firm’s use of biometric data is targeted and proportionate ‘to deliver the specific purposes set out in the condition, and that you cannot achieve them in a less intrusive way’.
  6. Document how the firm has investigated how the solution avoids bias in this particular situation with these particular individuals.
  7. Make sure, as required, that the firm has an appropriate policy document for UK GDPR and an appropriate policy for BIPA.
  8. Provide prior information to individuals to meet the firm’s transparency obligations.
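
Purely as an illustration of action point 1, here is one shape a per-activity audit record might take. All field names are invented; a firm would adapt them to its existing record of processing activities.

```python
# Invented record structure for the action-point-1 audit.
from dataclasses import dataclass, field

@dataclass
class BiometricProcessingRecord:
    activity: str             # what the biometric data is used for
    jurisdictions: list[str]  # where processing happens, per action point 2
    purpose: str              # the specific purpose, per action point 1
    legal_basis: str          # per action point 4
    dpia_done: bool           # per action point 3
    bias_checks: str          # per action point 6
    notes: list[str] = field(default_factory=list)

record = BiometricProcessingRecord(
    activity="voice authentication for telephone banking",
    jurisdictions=["UK"],
    purpose="uniquely identifying customers before giving account access",
    legal_basis="explicit consent, with a non-biometric alternative offered",
    dpia_done=True,
    bias_checks="tested usability for customers with voice disorders",
)
```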


Robert Baugh is the founder and CEO of Keepabl, a privacy management SaaS based in London. Prior to Keepabl, Robert was general counsel of technology growth companies for more than a decade.
