In 2021, North Ayrshire Council (NAC) decided to introduce Facial Recognition Technology (FRT) in nine schools across North Ayrshire in Scotland. The education authority adopted the technology to provide a contactless meal service to its students. When a student reached the cash desk, the FRT system would convert the child’s physical characteristics into a unique digital signature, allowing students to pay for their school meals simply by giving the software permission to access their catering accounts, without needing to carry cash or a card at school.
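To make the mechanism concrete, the sketch below shows the general pattern such systems follow: a stored facial template (an embedding vector) per catering account, compared against the embedding of the face presented at the till. The embedding size, similarity threshold and account identifiers are assumptions for illustration, not details of the NAC’s actual system.

```python
import numpy as np

# Hypothetical enrolment store: catering account -> facial template.
# A real FRT system derives these vectors from a trained face model.
enrolled_templates = {
    "account-001": np.random.rand(128),
    "account-002": np.random.rand(128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, threshold: float = 0.9) -> str | None:
    """Return the catering account whose template best matches the probe
    embedding, or None if no comparison clears the threshold."""
    best_account, best_score = None, threshold
    for account, template in enrolled_templates.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_account, best_score = account, score
    return best_account
```

Even this sketch makes the key data protection point visible: the stored template is biometric data that uniquely identifies a child, which is what triggers the obligations discussed below.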
Following a letter issued by the ICO to the NAC detailing its data protection concerns (outlined below), the NAC paused the use of FRT in schools and deleted all facial templates collected and stored in its databases. Below, we provide an overview of the ICO’s concerns and then explain what an organisation needs to consider when introducing new technologies.
Lawfulness of processing
Following the ICO’s enquiries, the NAC was unable to demonstrate that it relied on a valid lawful basis for processing special category data, more specifically biometric data. In the Data Protection Impact Assessment (DPIA) conducted by the NAC and provided to the ICO, the NAC argued that ‘the processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’ (Article 6(1)(e) GDPR). However, the NAC later stated that it relied on consent as its basis for processing children’s biometrics. The ICO’s view was that, specifically for the processing of children’s biometric data, the NAC as a Data Controller needed to rely on consent (Article 6 GDPR) and explicit consent (Article 9 GDPR), and be able to demonstrate this. In a similar case in Ireland in 2020, the DPC considered that ‘exposure to intrusive methods of surveillance without sufficient legal basis or justification can desensitise students at a young age to such technology and lead to them ceding their data protection rights in other contexts also’.
Right to be informed
Information addressed to children about the processing of their data should be in ‘clear and plain language that the child can easily understand’ (Recital 58 GDPR), accompanied by ‘clear privacy notices so that children are able to understand what will happen to their data and their rights’. Although the NAC made efforts to meet this requirement by providing FAQs and an information flyer to parents, the ICO concluded that the information conveyed to children was not in a concise, transparent, intelligible and easily accessible form, using clear and plain language. Particular attention needs to be paid to informing children about the processing of their data, as they may not be in a position to fully appreciate the risks associated with the processing of their special category data.
Retention
Although the GDPR does not prescribe specific data retention periods, setting and justifying them falls within the Data Controller’s (in this case, the NAC’s) remit. In relation to children specifically, it is particularly important to keep retention periods under review. The ICO was not satisfied with the NAC’s retention period of ‘5 years after leaving date or the date they reach their 23rd birthday, whichever is later’, and the NAC provided no further justification as to why this period was chosen.
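Expressed as a simple date calculation, the NAC’s stated rule looks like this (a sketch only; the function and field names are assumed for illustration):

```python
from datetime import date

def deletion_due_date(leaving_date: date, date_of_birth: date) -> date:
    """Retain until 5 years after the leaving date or the pupil's
    23rd birthday, whichever is later (leap-day birthdays ignored)."""
    five_years_after_leaving = leaving_date.replace(year=leaving_date.year + 5)
    twenty_third_birthday = date_of_birth.replace(year=date_of_birth.year + 23)
    return max(five_years_after_leaving, twenty_third_birthday)

# A pupil born in March 2006 who leaves school in June 2022 would have
# their biometric data retained until 2029, not 2027.
print(deletion_due_date(date(2022, 6, 30), date(2006, 3, 15)))  # 2029-03-15
```

Even written out this plainly, the rule shows how long a child’s biometric data could persist; the ICO’s point was that such a period must be justified, not merely stated.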
Data minimisation and accuracy
FRT systems have been shown to produce less accurate results for particular gender and ethnic groups, as identified in the ICO’s guidance on human bias and discrimination in AI systems. The NAC should therefore have documented in its DPIA the risks of bias and discrimination to the children subjected to this technology, and considered how these could be mitigated, to ensure the accuracy of the algorithm’s outputs.
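As a loose illustration of what such a documented check might look like, the sketch below computes a per-group false non-match rate from a hypothetical evaluation log; the data, group labels and column names are invented for the example.

```python
import pandas as pd

# Hypothetical FRT evaluation log: one row per genuine verification
# attempt, recording the pupil's demographic group and the outcome.
results = pd.DataFrame({
    "group":   ["A", "A", "A", "B", "B", "B", "B", "B"],
    "matched": [True, True, False, True, False, False, True, False],
})

# False non-match rate per group: the share of genuine attempts the
# system failed to recognise. A large gap between groups is exactly
# the kind of disparity a DPIA should document and mitigate.
fnmr_by_group = 1 - results.groupby("group")["matched"].mean()
print(fnmr_by_group)  # e.g. group A: 0.33, group B: 0.60
```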
Data Protection Impact Assessment (DPIA)
Although the NAC had conducted a DPIA, the ICO’s review concluded that it failed to comply with the requirements of Article 35 GDPR. More specifically, the ICO found that:
- When the processing commenced, there was a residual high risk of unlawful access at the point the first facial templates were captured. Although the NAC stated that encryption had been applied as a mitigation measure (see the illustrative sketch after this list), the DPIA had not been updated to reflect this.
- The risk assessment did not address risks to individuals’ rights and freedoms and specifically did not consider risks related to bias and discrimination.
- No prior consultation was undertaken with pupils and/or parents.
- The DPIA was not signed off and was provided to the ICO in draft form.
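On the encryption point in the first finding, the following is a minimal sketch of what encrypting facial templates at rest can look like, using the symmetric Fernet scheme from Python’s cryptography library as an assumed example; the ICO’s letter does not describe the NAC’s actual implementation.

```python
from cryptography.fernet import Fernet

# Illustrative only: encrypt a serialised facial template before it is
# written to storage, so a database compromise alone does not expose it.
key = Fernet.generate_key()  # in practice, held in a key management service
fernet = Fernet(key)

template_bytes = b"\x01\x02\x03"  # placeholder for a serialised embedding
stored = fernet.encrypt(template_bytes)  # what the database would hold
recovered = fernet.decrypt(stored)       # recoverable only with the key
assert recovered == template_bytes
```

Whatever the measure actually was, the ICO’s criticism was that the DPIA was never updated to record it, leaving the documented residual risk high.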
As a result, an updated DPIA reflecting accurate data risks and mitigation measures would have highlighted the issues raised by the ICO and potentially helped the NAC address them before implementing the FRT.
The ICO further stressed that it is necessary for a Data Controller to demonstrate compliance with the data protection framework as part of its accountability obligations under the GDPR. This matters all the more here because, as the ICO noted, biometric data ‘is more permanent and less alterable than other personal data; it cannot be changed easily. Biometric data extracted from a facial image can be used to uniquely identify an individual in a range of different contexts. It can also be used to estimate or infer other characteristics, such as their age, sex, gender or ethnicity’.
How can you be proactive and ensure that your business or service is complying with the data protection framework?
Trilateral’s Data Protection and Cyber-Risk Team has significant experience conducting Data Protection Impact Assessments (DPIAs) tailored to the novel use of technologies and compliant with the GDPR. Moreover, Trilateral Research develops practical solutions to complex problems, helping organisations and other entities address ethics requirements and implement ethics best practice by conducting AI bias assessments. For more information, please feel free to contact our advisers, who would be more than happy to help.