Security, privacy and trust in software – assessing the ethical impact of technology



Tim Jacquemard | Senior Research Analyst

Date: 8 July 2022

Protecting privacy and digital security is becoming increasingly important in everyday life. A lack of trust in software, and in how it handles or exposes personal information, could negatively impact consumers and the wider EU digital market. The TRUST aWARE project promotes privacy, security and trust in software. These are also, in themselves, philosophical and ethical concepts, but what do they mean in the context of TRUST aWARE?

To answer this question, Trilateral Research are conducting an Ethical Impact Assessment (EIA). An EIA is a systematic approach to maximising the positive ethical impacts of a technology or project by mapping information flows, identifying and assessing challenges and opportunities, and providing a set of recommendations to consider during the design, development and deployment of a system. The EIA helps developers and end-users identify and address ethical implications associated with the development, implementation and use of the technology. As part of the EIA, we focus on three values central to TRUST aWARE with strong ethical connotations: security, privacy and trust. 

‘Security’ is an umbrella term that covers different types of security, e.g., personal security, home security, food security, cybersecurity and national security. TRUST aWARE promotes cybersecurity and personal security. Broadly, cybersecurity is the collection of processes and tools a person or organisation puts in place to protect the confidentiality, integrity and availability of data and other digital assets. TRUST aWARE promotes cybersecurity both by identifying security concerns in software and by building secure technology. In TRUST aWARE, we consider someone to enjoy personal security when they do not face danger or threats against their interests or rights. For example, TRUST aWARE partners identify inappropriate advertisements targeting minors.

Privacy is notoriously difficult to define, but many interpretations associate digital privacy with norms around access to, or control over, personal information. These privacy norms differ depending on the situation. A request for certain data may be appropriate in the context of one app but not another: proof of identity, for example, is appropriate for a banking app but not for a torch app. Privacy is relevant to the TRUST aWARE project in two ways. First, the project aims to promote privacy digitally: for example, TRUST aWARE checks whether the privacy policies of apps are clearly formulated, as an unclear policy means a person cannot make an informed choice about whether or not to share their personal information. Second, the project itself needs to address privacy issues raised by the tools it develops: to build security- and privacy-enhancing tools, TRUST aWARE must monitor and analyse the online behaviour of individual users, and therefore requires access to a significant amount of personal information.

Trusting means relying on someone or something to act in a desirable way. When we trust, we accept the risk that our trust could be violated and that we might suffer negative consequences. Trust matters to the TRUST aWARE project in two ways. First, consumers and organisations need to trust the products and insights generated by the TRUST aWARE consortium for them to be used in the future. Second, TRUST aWARE aims to enhance trust in software. Strictly speaking, TRUST aWARE's scanning of software for malware, of social-network advertisements for harmful content, and of privacy policies for accuracy is not a sign of trust in software; on the contrary, even though prudent and warranted, such scanning is a sign of distrust. TRUST aWARE's protections aim to reduce the dangers associated with software and to increase transparency, helping to distinguish the trustworthy from the untrustworthy.

How do cybersecurity, personal security and privacy relate to each other? In some cases, actions that protect or promote one of these values also help protect or promote another. For example, protecting cybersecurity can also protect personal security: preventing hackers from accessing personal information protects people against blackmail or identity theft. In other contexts, these values can conflict and give conflicting guidance on how to act. For instance, encrypted devices protect cybersecurity but could facilitate criminal activities that harm personal security. Where values conflict, we must look at the context: none of cybersecurity, personal security or privacy trumps the others in every situation. Instead, we should try to design the technology so that all of them are promoted as far as possible. One of the objectives of the EIA is to mitigate possible conflicts between values and ensure the technology is developed in a way that facilitates all of them.

For more information, please contact our team.
