Using digital surveillance tools – what are the privacy and ethical issues to take into account?

Reading Time: 3 minutes

Author:

Panagiotis Loukinas
- Research Analyst

Date: 6 May 2020

A variety of digital surveillance solutions, built on advanced algorithms, network analysis, data mining and machine learning, are ready for use by law enforcement agencies (LEAs). Although the gathering and processing of information by LEAs is not a new phenomenon, the enormous volume of data now available enables far more intensive surveillance.

The RAMSES project designed an intelligence-gathering platform with different integrated tools to help police authorities combat financially motivated malware, including banking Trojans.

Within the project, Trilateral Research focused on the privacy and ethical aspects of RAMSES and on the broader legal and social implications of digital surveillance used for law enforcement purposes. Building on insights gained during the project but going beyond the framework of RAMSES, we note here some of the legal, social and ethical issues that should be considered during the development and use of digital surveillance tools.

Human Dignity and Non-discrimination

Any negative impact of digital surveillance on human dignity must be prevented: being, or merely feeling, under surveillance can affect people's lives so significantly that it undermines their capacity to live a dignified life. Similarly, the principle of non-discrimination must be respected. Surveillance practices do not fall equally on everyone, and the deployment of sophisticated policing tools should not reinforce existing social divisions along the lines of age, ethnicity, gender and class.

As we observed during the RAMSES project, a large proportion of end-users trust the intelligence provided by algorithmic tools. This means that the developers of such technologies must enhance the accuracy of their products in order to avoid misleading results. Accuracy in such cases depends on the quality of the data used both when a tool is built and when it is deployed in practice. It is also necessary to guard against data biases that can enter a surveillance system, even unintentionally, at different stages of its development or use.
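As an illustration only (not part of the RAMSES platform), the hypothetical sketch below shows one way a development team might check whether a tool's error rates differ across demographic groups in a labelled evaluation set; the column names, example data and grouping variable are assumptions made for this example.

```python
# Hypothetical bias check: compare false-positive rates across groups
# in a labelled evaluation set. All column names are illustrative only.
import pandas as pd

def false_positive_rates(df: pd.DataFrame, group_col: str) -> pd.Series:
    """Per-group false-positive rate: cases flagged by the tool that were not actually illicit."""
    negatives = df[df["actual_illicit"] == 0]
    return negatives.groupby(group_col)["flagged_by_tool"].mean()

# Example evaluation data (in practice this would come from a held-out test set).
eval_df = pd.DataFrame({
    "group": ["A", "A", "B", "B", "B", "A"],
    "actual_illicit": [0, 0, 0, 0, 1, 1],
    "flagged_by_tool": [1, 0, 0, 0, 1, 1],
})

print(false_positive_rates(eval_df, "group"))
# A large gap between groups would warrant further investigation
# before the tool's outputs are relied on operationally.
```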

OSINT, SOCMINT and the Chilling Effect

Innovative surveillance tools can scrape the internet and collect various types of data. It is essential to distinguish Open Source Intelligence (OSINT), which gathers publicly available content published on the open internet and clearly intended for everyone to read, from Social Media Intelligence (SOCMINT), which can also be deployed on private content.

SOCMINT is better characterized as open when the information is accessible to all, and closed when access to it is restricted in various ways.

In any case, individuals should be able to enjoy their privacy in online public spaces, such as social media platforms, and surveillance technologies should not be deployed at any cost: the potential benefits of surveillance techniques must be carefully balanced against citizens’ rights.

Respecting citizens’ rights online matters because intensive digital surveillance can create a chilling effect on surveilled populations, deterring them from exercising rights and liberties such as freedom of opinion and expression and freedom of assembly and association online.

Privacy and Data Protection

Developers and LEAs should identify the potential risks to privacy and data protection, the fundamental rights most likely to be affected by big data surveillance. Relevant measures to mitigate these risks should be integrated during the design of a surveillance tool. Whenever a surveillance system is deployed, adequate and effective guarantees for these rights must be in place, and its use is justified only if strictly necessary to enhance public security.

LEAs must also comply with their data protection obligations whenever personal data are collected and processed. In particular, as enshrined in the Law Enforcement Directive, the collection and processing of personal data in the investigation and detection of criminal offences must be lawful and fair and must comply with the principles of purpose limitation, data minimization, data accuracy, storage limitation and data security.
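Purely as a hypothetical sketch of how two of these principles might look in practice, the example below applies data minimization (keeping only the fields needed for a stated purpose) and storage limitation (discarding records after a retention period). The field names and the 90-day period are assumptions for illustration, not requirements drawn from the Directive or the RAMSES platform.

```python
# Hypothetical illustration of data minimization and storage limitation.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)                               # assumed retention period
REQUIRED_FIELDS = {"case_id", "indicator", "collected_at"}   # purpose-limited subset

def minimize(record: dict) -> dict:
    """Data minimization: drop any fields not needed for the specified purpose."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

def within_retention(record: dict, now: datetime) -> bool:
    """Storage limitation: keep only records younger than the retention period."""
    return now - record["collected_at"] <= RETENTION

now = datetime.now(timezone.utc)
raw = [
    {"case_id": 1, "indicator": "wallet-x", "collected_at": now - timedelta(days=10),
     "full_browsing_history": ["..."]},   # not needed for the purpose, so dropped
    {"case_id": 2, "indicator": "domain-y", "collected_at": now - timedelta(days=400)},
]

stored = [minimize(r) for r in raw if within_retention(r, now)]
print(stored)   # only the recent, minimized record remains
```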

Finally, to mitigate the risks to privacy and the potential chilling effect, LEAs should follow the principle of transparency, which allows for democratic control and emphasizes the importance of openness in policing. A certain level of secrecy may be maintained so that specific measures and policies remain effective, but the reasons and justifications for using particular tools should be explained to the general public.

To support LEAs, Trilateral’s impact assessment ensures transparency and scrutiny of the solutions developed. For further detail on this approach, read the Ada Lovelace Institute’s report on algorithmic impact assessments.

For more information, please contact our team.
