Biometric data – why organisations keep getting it wrong

The ICO has recently issued a reprimand to a secondary school for using children’s biometric data to take cashless payments from students, in breach of the Data Protection Act 2018 and UK GDPR.

Chelmer Valley High School was issued with a reprimand for failing to undertake a Data Protection Impact Assessment (DPIA) prior to its use of facial recognition technology (FRT) in 2023. The school did not obtain opt-in “explicit consent” to the use of the technology, as required under Article 9 of the UK GDPR for biometric data, nor did it seek consent directly from its students, most of whom the ICO considered old enough to provide it themselves.

The ICO issued a reprimand in line with its current approach to data protection breaches in the public sector, which is not to issue fines to public sector bodies. Instead, it offered recommendations to the school on the steps it should consider taking to improve its compliance with the UK GDPR. A copy of the ICO’s reprimand is available here: https://ico.org.uk/action-weve-taken/enforcement/chelmer-valley-high-school/

Processing biometric data: the importance of explicit consent

Failures to obtain explicit consent for the processing of biometric data feature repeatedly in ICO enforcement action. The Chelmer Valley case is the latest in a line of cases in which organisations have neglected to consider data protection law (in particular, the need for a DPIA and consultation with a DPO) before fully deploying biometric systems:

  • In 2018, the ICO undertook regulatory action against HMRC in respect of their use of the Voice ID service for customer verification. The ICO found that HMRC had (i) failed to give customers sufficient information about how their biometric data would be processed; and (ii) failed to give them the chance to give or withhold consent. HMRC were required to delete all biometric data held under the Voice ID service for which they did not have explicit consent.
  • In January 2023, the ICO issued a letter to North Ayrshire Council following its use of biometric data to manage “cashless catering” in school canteens. The Council had introduced FRT into nine of its schools, a deployment which was subsequently reported to the ICO. The ICO issued the Council with a letter advising that (i) the most appropriate lawful basis for using FRT for cashless catering would be explicit consent; and (ii) it should have undertaken a DPIA before the processing commenced. Following the ICO’s intervention, the Council advised its schools to cease using FRT to take payments from children.
  • Earlier this year, the use of biometric technology in the context of adult consent arose in the case of Serco Leisure (“Serco”), which operates a number of Community Leisure Trust centres across England. Serco, along with other Community Leisure Trusts, was using fingerprint technology to monitor the attendance of staff at work. Fingerprint-reading technology is similar to facial recognition technology insofar as it converts the dactyloscopic image into a numerical identifier. That identifier is then compared with the reference identifiers held in a database for authentication / verification purposes (a simplified sketch of this process follows this list).
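
By way of illustration, the following is a simplified Python sketch of that enrolment-and-matching flow. It is not any vendor’s actual algorithm: the feature extraction, the reference database and the similarity threshold below are invented stand-ins. What it does show is the data flow these cases turn on: an image goes in, a numerical template comes out and is retained, and later captures are matched against the stored templates.

    import math

    # Stand-in for the reference database. In a real deployment each template
    # is linked to a named account (e.g. a pupil's canteen account), which is
    # why the template itself constitutes personal data.
    REFERENCE_DB: dict[str, list[float]] = {}

    def extract_template(image_pixels: list[float]) -> list[float]:
        """Convert a captured image into a fixed-length numerical identifier.
        Placeholder maths: real systems use trained feature extractors."""
        buckets = 8
        template = [0.0] * buckets
        for i, pixel in enumerate(image_pixels):
            template[i % buckets] += pixel
        norm = math.sqrt(sum(v * v for v in template)) or 1.0
        return [v / norm for v in template]  # normalised feature vector

    def enrol(person_id: str, image_pixels: list[float]) -> None:
        """Enrolment: the image itself can be discarded immediately, but the
        numerical template is retained in the database."""
        REFERENCE_DB[person_id] = extract_template(image_pixels)

    def verify(image_pixels: list[float], threshold: float = 0.95) -> str | None:
        """Verification: compare a fresh capture against each stored template.
        Matching is fuzzy (a similarity threshold), not an exact lookup."""
        probe = extract_template(image_pixels)
        for person_id, reference in REFERENCE_DB.items():
            similarity = sum(a * b for a, b in zip(probe, reference))  # cosine
            if similarity >= threshold:
                return person_id  # the retained template identified them
        return None

Note that in this flow the captured image never needs to be retained at all: the stored templates alone are sufficient to single individuals out.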

Serco and the other Leisure Trusts were using this technology to monitor their staff, which the ICO deemed too intrusive and not necessary for the purpose for which it was being used (monitoring whether staff were attending work). Serco and the Trust centres had not obtained explicit consent, nor was any consent relied upon freely given, bearing in mind the employment context in which it was obtained.

The ICO issued an Enforcement Notice requiring Serco and the Trust centres to cease all biometric processing and destroy all such data within three months. It also recommended that they introduce an alternative method of monitoring to which staff could sign up without duress or penalty.

Comment

In our experience, vendors of biometric data systems regularly misunderstand the data protection principles which lie at the heart of their systems: principally, that the initial capture of an image and its conversion into a numerical identifier retained in a database still amount to the processing of personal data, even where the facial image or fingerprint is then immediately deleted.

The identifier itself comes within the definition of personal data: it is information which could indirectly identify someone when combined with other information. The education sector appears to be coming in for frequent ICO scrutiny in this area, given the sensitivity of children’s personal data.
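
A loose, standalone illustration of that definitional point, using invented data: on its own the identifier looks like meaningless numbers, but combined with the account records held in the same system it resolves directly to an individual.

    # Invented data for illustration only. Real matching is fuzzy rather than
    # an exact dictionary lookup, but the identifiability point is the same.
    template = (0.12, 0.88, 0.34, 0.51)    # the "anonymous-looking" identifier
    accounts = {                           # held alongside it in the system
        (0.12, 0.88, 0.34, 0.51): "Jane Pupil, Year 10, canteen account 1234",
    }
    print(accounts[template])              # resolves to a living individual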

Considering these aspects of biometric data processing within a DPIA should enable the data protection implications of such systems to be identified at the design phase. We strongly recommend that organisations engage a DPO at the planning stage of any potential use of such applications, and that they consider carefully the wording of any consent form and privacy notice, so that the risk of future regulatory action is duly minimised.

Kennedys has a global data privacy and cyber risk team able to advise on a full range of privacy compliance issues faced by your organisation in the UK and further afield. Please contact us should you require assistance in this area.