Google is the latest tech company in the spotlight over large-scale processing of sensitive data.
Law firm Mishcon de Reya is bringing a representative action on behalf of over 1.6 million patients of the Royal Free London NHS Foundation Trust whose information was processed by Google’s AI firm DeepMind. The action follows an investigation by the Information Commissioner’s Office (ICO), which found that the processing of sensitive patient data breached the UK Data Protection Act 1998.
An agreement between the Trust and DeepMind was reached in 2015 to develop a clinician support app, later formalised under the name ‘Streams’. Its development included the processing of partial patient records to carry out clinical safety testing.
This article outlines the steps all organisations should take when developing or implementing a new tool or programme. It also highlights that enforcement action by supervisory authorities is not the only vehicle through which organisations may face financial consequences for data protection violations.
The ICO investigation
The ICO found that, when clinical safety testing of Streams was conducted, patients were not adequately informed that their data would be used as part of the testing and would not have reasonably expected it to be used in that way. This raised concerns regarding fairness and transparency. In the absence of sufficient transparency, the ICO determined, patients were unable to exercise their right to opt out of the processing of their personal data.
The ICO also questioned why such a significant number of patient records were required for clinical safety testing, and whether this was necessary and proportionate. Additionally, it found that a full privacy impact assessment had not been carried out prior to processing.
Representative action
While the investigation by the ICO did not lead to a fine, this case could result in pay-outs to those affected. The law firm bringing the case, Mishcon de Reya, describes the representative action as a step towards addressing public concerns about technology companies’ large-scale access to and use of health data.
The form of collective action brought by the firm requires only one individual claimant to represent all affected parties, provided all parties have the “same interest” in the proceedings. Here, a single claimant is representing 1.6 million Trust patients. The action is one of several similar ongoing cases in the UK, including one brought against TikTok, in which the claimants allege that TikTok failed to give both children and their parents sufficient transparency about the extent of the children’s data it processes and the purposes for which that data is collected.
A win for those bringing the action could trigger an influx of similar cases and provide a model for holding organisations publicly to account for breaches of data protection law, through an avenue separate from enforcement by data protection authorities. One thing is certain, regardless of the outcome: scrutiny of the scale and categories of personal data organisations process, of third-party access to that data, and of whether data subjects are adequately informed of the processing will continue.
Google has recently confirmed its intention to decommission Streams following the shutdown of its health division, although this will have no bearing on the case moving forward.
Recommendations
Taking proactive steps at the initial project-planning stage and throughout a project’s lifecycle can help an organisation avoid future headaches, comply with data protection requirements from the outset, and minimise the risk of delays or enforced holds due to non-compliance. With this in mind, organisations considering new projects should:
- Clearly determine the intended purpose(s) for processing.
- Apply the principle of data minimisation to those purpose(s), limiting both the categories and the volume of personal data collected (a minimal code sketch follows this list).
- Establish a valid legal basis for the processing proposed.
- Undertake an initial screening to determine whether a Data Protection Impact Assessment (DPIA) is required. Even where a DPIA is not mandatory, carrying one out can help identify and minimise risk.
- Put robust data-sharing agreements in place with any processors involved.
- Provide transparency by informing data subjects, via a privacy notice, how their data will be processed and, where applicable, offering the opportunity to opt out.
- Conduct regular reviews to ensure continued compliance.
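For technical teams, the sketch below illustrates the data-minimisation point in Python. It is a minimal, hypothetical example: the field names, records, and purpose are invented for illustration and are not drawn from the Streams project or any real dataset. The same principle applies to volume, for example by processing only as many records as the documented purpose genuinely requires.

```python
# Illustrative sketch only: all field names and records are hypothetical.
from typing import Any, Dict

# Fields actually needed for the documented processing purpose
# (e.g. a specific piece of safety testing).
REQUIRED_FIELDS = {"record_id", "test_result", "test_date"}

def minimise(record: Dict[str, Any]) -> Dict[str, Any]:
    """Return a copy of the record containing only the fields
    required for the documented purpose."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

full_record = {
    "record_id": "R-0001",
    "name": "Jane Doe",           # not needed for the purpose: dropped
    "address": "1 Example Road",  # not needed for the purpose: dropped
    "test_result": 4.2,
    "test_date": "2021-06-01",
}

print(minimise(full_record))
# {'record_id': 'R-0001', 'test_result': 4.2, 'test_date': '2021-06-01'}
```

Stripping records down in this way before they leave an organisation’s control narrows both what a processor can see and what a data-sharing agreement needs to cover.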
Trilateral’s Data Governance and Cyber Risk Team have extensive experience supporting organisations undertaking complex projects to comply with their data protection obligations. We offer a range of data governance services, including compliance support. Please feel free to contact our advisors, who would be more than happy to help.