Artificial Intelligence (AI) and machine learning are generating increasing hype as they are integrated into ever more domains of our personal, social and professional lives: dating applications “choose” the “right” match, and computational tools are employed to increase productivity.
At the same time, recent scandals, such as Cambridge Analytica’s collection and processing of the personal data of millions of Facebook users for political purposes and Amazon’s retraction of an AI recruitment tool that showed bias against women, have given rise to public debate over technology’s flaws.
Employing a Privacy Impact Assessment (PIA) to ensure a privacy-by-design approach is an essential process for preventing, and effectively responding to, such flaws and concerns.
A PIA not only supports compliance with the EU General Data Protection Regulation (GDPR) but also allows privacy-preserving measures to be built into the system, so that privacy is not limited to data protection but also covers further potential harms at a personal and societal level.
A PIA is considered essential, if not mandatory, for research projects and services that deal with personal data. The introduction of the GDPR in May 2018 continues to be a strong incentive for companies, agencies and institutions to consider their obligations regarding the protection of personal data and privacy.
At Trilateral, we have carried out pioneering work in safeguarding privacy in the private and public sectors; one of our current focuses is the European Union-funded EUNOMIA (user-oriented, secure, trustful & decentralised social media) project. This three-year project brings together ten partners who will develop a decentralised, open-source solution to assist social media users (traditional media journalists, social journalists and citizen users) in determining the trustworthiness of information.
As the collection and processing of personal data are necessary for the development and operation of the EUNOMIA solution, we are leading a task to undertake a PIA+. The PIA+ is being undertaken from the very early stages of the project to safeguard privacy and data protection and to minimise potential risks, considering societal and legal as well as ethical issues.
What is a PIA+?
A PIA+ is not another “tick box” exercise to prove compliance with relevant laws and regulations. It is a collaborative, continuous process that spans the lifecycle of a project, from the early design stage to the deployment of the product or service. The PIA+ process is not one-size-fits-all; it is adjusted to the specific needs of each project as it evolves.
It analyses the system architecture and intended information flows to identify potential privacy, social and ethical risks. Based on this analysis, fictional scenarios that anticipate the identified risks are developed and used as stimuli in consultations with relevant stakeholders (e.g., technical experts, citizens and lawyers).
The PIA+ results in specific organisational and technical recommendations and measures, anticipating future consequences at an individual, organisational and societal level.
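To make this process more concrete, the sketch below shows how a single entry in a PIA+ risk register might be modelled. It is a minimal illustration of the general approach, not an artefact of the EUNOMIA project; the field names and the simple likelihood-times-impact scoring are our assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one entry in a PIA+ risk register.
# Field names and scoring are illustrative assumptions, not EUNOMIA's design.
@dataclass
class RiskEntry:
    description: str            # the potential privacy, social or ethical harm
    category: str               # "ethical" | "social" | "legal"
    stakeholders: list[str]     # who is affected (users, partners, society)
    likelihood: int             # 1 (rare) .. 5 (almost certain)
    impact: int                 # 1 (negligible) .. 5 (severe)
    mitigations: list[str] = field(default_factory=list)

    @property
    def severity(self) -> int:
        """Simple likelihood x impact score used to prioritise risks."""
        return self.likelihood * self.impact

risk = RiskEntry(
    description="Trustworthiness scores reveal information users did not intend to share",
    category="ethical",
    stakeholders=["citizen users"],
    likelihood=3,
    impact=4,
    mitigations=["pseudonymisation", "informed consent"],
)
print(risk.severity)  # 12 -> a candidate for discussion with stakeholders
```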
A preliminary PIA+ on EUNOMIA
The PIA+ that we are developing for EUNOMIA runs from the beginning of the project, with the technical partners being consulted on the development of the tools. The preliminary analysis examined the envisioned system and tools as described in the proposal, exploring emerging ethical, social and legal issues. The considerations that emerged from this analysis include:
Ethical Considerations
- Autonomy (e.g., Do EUNOMIA users freely decide upon their participation, and do they have the choice of withdrawing?)
- Dignity (e.g., How could users’ trustworthiness scores affect their reputation and thus their relationships with society, their organisation, future employers, etc.?)
- Privacy (e.g., How could users’ trustworthiness scores lead to the disclosure of information that users do not wish to share and that is linked to their identity?)
Social Considerations
- Discrimination (e.g., Could the system be misused, resulting in EUNOMIA’s users being discriminated against?)
- Bias (e.g., Do the input data and/or the data processing introduce or reinforce any form of bias?)
Legal Considerations
- Compliance with the GDPR and relevant ISO standards (e.g., Are data collected, processed and stored lawfully? Can data subjects exercise their rights?)
Recommendations for EUNOMIA
The findings of the preliminary analysis resulted in detailed recommendations for the technical partners to minimise the identified risks. The recommendations suggest measures for data collection, processing and storage, including, amongst others, informed consent and pseudonymisation (sketched below). We have also proposed design measures for the EUNOMIA solution to address the identified ethical, social and legal concerns.
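As an illustration of one such measure, the sketch below shows a common pseudonymisation technique: replacing a user identifier with a keyed hash (HMAC-SHA256), so that records can only be re-linked to individuals by whoever holds the key. This is a minimal, hypothetical example of the technique in general, not EUNOMIA’s actual implementation; the key handling shown is an assumption.

```python
import hmac
import hashlib

# Hypothetical sketch of keyed pseudonymisation; not EUNOMIA's implementation.
# The secret key must be stored separately from the pseudonymised records, so
# the data cannot be attributed to an individual without access to the key.
SECRET_KEY = b"example-key-loaded-from-a-secure-vault"  # assumption: managed out of band

def pseudonymise(user_id: str) -> str:
    """Return a stable pseudonym for a user identifier using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so analyses can still link
# a user's records together without exposing the real identifier.
record = {"user": pseudonymise("alice@example.org"), "trust_score": 0.87}
print(record)
```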
“Thanks to Trilateral’s Privacy Impact Assessment (PIA+) workshop, technical project partners were able to take a measured view on how data collection, workflows, analysis and security would be managed within the project. The workshop improved our thinking on a very important aspect of the project, enabling the technical teams to think more carefully about the wider system of interest for user and technical requirements.”
Research Partner, EUNOMIA project, May 2019
Next steps in the PIA+ process
The preliminary PIA+ analysis of the EUNOMIA solution and the resulting findings will be discussed with all the project partners during the next consortium meeting. To engage all partners in the PIA+ process and to raise awareness of the ethical, social and legal concerns, a hands-on session will be designed. Additionally, as the PIA+ is a continuous process, one-to-one interviews and/or focus groups with different partners and stakeholders will be scheduled as needed, resulting in further reports and consultation at key stages of the project.
For more information, contact our team.