The use of CCTV in private homes and public spaces has become increasingly commonplace, and such systems often offer enhanced functionality, including Facial Recognition Technology (“FRT”). FRT enables people to be identified through the system’s analysis of their geometric facial features and the comparison of that data against other available images. In response, the Dutch DPA has warned that this technology risks making us all ‘walking bar codes’.
Of course, such functionality can be beneficial in a myriad of circumstances, from unlocking phones and doors to assisting law enforcement bodies in policing crowded areas. From a data protection perspective, however, FRT within CCTV systems elevates the ordinary images being captured to biometric data. The use of such technology, particularly in workplaces, can be controversial. This article considers the challenges for data controllers in lawfully processing such data in public spaces and in the workplace.
Lawful Processing of Biometric Data
The processing of biometric data for the purpose of uniquely identifying a person is generally prohibited under Article 9(1) GDPR, subject to certain limited exceptions under Article 9(2), including where:
- the data subject explicitly consents,
- the data has manifestly been made public by the data subject, or
- processing is necessary for reasons of substantial public interest.
Identifying the categories of data subjects concerned is critical in any assessment of the necessity or proportionality of using FRT. Due consideration should be given to evaluating a controller’s legitimate interest in using such potentially intrusive technologies against the rights and freedoms of data subjects, particularly vulnerable data subjects such as employees.
Supervisory authorities across Europe have been clear that data controllers will generally be required to rely on explicit consent for the use of FRT, as the threshold for processing such data in the ‘public interest’ is exceptionally high.
For areas accessed by the general public, the EDPB has stated that a data subject simply entering a monitored area, even with signage, “does not constitute a statement or a clear affirmative action needed for consent”. The EDPB has also rejected the possibility that data controllers could rely on Article 9(2)(e) (‘processing of data manifestly made public’) to process biometric data using video surveillance in a public place, maintaining that the “mere fact of entering into the range of the camera does not imply that the data subject intends to make public special categories of data relating to him or her”.
It is clearly not practical, therefore, to rely on consent when using FRT in public areas. Furthermore, while it may seem possible to obtain consent from employees to use FRT in the workplace, there are significant difficulties in doing so.
For consent to be considered valid and freely given, the data subject should not be contractually required to consent to processing of their biometric data (e.g. via an employment contract), unless it is actually necessary for the performance of that contract. In any event, consent in the context of the employment relationship will not generally be deemed freely given due to the clear imbalance of power.
Approach of Supervisory Authorities
The Irish Data Protection Commission has previously found the processing of employee biometric data in the Irish Prison Service to be unlawful, finding that employees could not be compelled to use ‘thumbprints’ to enter a secure area of a prison. Similarly, the Dutch SA imposed a fine of €725,000 on an employer for a comparable breach, where fingerprint scanning was being used to track employee attendance. Employee consent was deemed invalid because it was found to be neither informed nor freely given. Furthermore, the use of biometric data was not considered necessary to achieve the identified objective (i.e., security).
Considerations for Data Controllers
There is a very high threshold for establishing necessity and proportionality, and the EDPB has advised data controllers to carefully consider the general principles of the GDPR (Article 5) when examining the use of FRT through video surveillance. This reflects the sheer volume of data such systems generate and the risk that personal data could be misused or processed for a secondary purpose.
Data controllers must also consider prevailing attitudes to FRT and the potential reputational impact of deploying it. Citizens increasingly view the use of FRT through video surveillance, particularly outside of law enforcement, with suspicion, as concern grows about the risks technology poses to their privacy and the security of their personal data. Experts have also raised concerns about the accuracy of FRT systems and the potential for inherent bias in the algorithms used. This increases the risk that inaccurate data about data subjects is processed and/or that they are subjected to discrimination (based on ethnicity, age, gender, etc.).
The Trilateral Research Data Protection and Cyber-risk team can help your organisation introduce new and emerging technologies while ensuring ongoing compliance. Please feel free to contact our advisors, who would be more than happy to help with DPIAs and policy development.