Trilateral Insights – June 2024

Reading Time: 8 minutes

Authors: Trilateral Research

Date: 1 July 2024

Get the latest insights from Trilateral in our monthly article, featuring developments from across our innovation and research teams.

Insights | Law Enforcement and Community Safeguarding

A new resource to support the victims of trafficking and exploitation

The HEROES project investigates how new technologies and strategies can be used to reduce and prevent child sexual abuse and exploitation (CSA/CSE) and trafficking in human beings (THB). Frontline responders frequently lack the resources and expertise to support those affected, and Trilateral Research’s Jana-Isabelle Dilger and colleagues are proud to share a practical guide to assist practitioners and victims. The Manual for Early Identification of Potential Victims of Trafficking in Human Beings, Child Sexual Abuse and Child Sexual Exploitation, which we developed, will assist responders across this field, from law enforcement and health services to NGOs and educators. It provides concrete guidance on the indicators of victimisation, showing how support services and agencies can help potential victims.

The manual sets out the key legal definitions of CSA/CSE and THB, along with tools to help identify victims at national and international levels. It provides detailed case studies from Bangladesh, Colombia, Spain and the UK, describing the particular dimensions of victimisation in these territories. Input from the Trilateral team included outlining the UK case study, which explores aspects of modern slavery, the county lines drug trade and so-called ‘cuckooing’ (the appropriation of the homes of vulnerable people), which have been little examined from a trafficking perspective.

The manual adds to existing work from authorities, NGOs and charities on the signs that an individual is being exploited by offenders, producing a more comprehensive list of hallmarks, such as limitations on personal freedoms or a subdued demeanour, that a professional may be best placed to identify. By collaborating on the production of this manual, Trilateral adds to its track record of impact on new areas of exploitation, epitomised by its CESIUM tool.

Read the full manual here.

Insights | Data Science, Research and Sociotech Innovation (DARSI)

Trilateral’s data scientists win second prize in exclusive AI competition to support UK security

Data scientists at Trilateral Research were invited to participate in Challenge AI, a competition organised by the Defence and Security Community, represented by the Ministry of Defence, GCHQ and MI5. The competition involved over 200 participants from more than 50 organisations, who entered more than 200 solutions into the challenge; Trilateral’s solution won second place.

Our team chose to compete in ‘Mission AuthorID’, a digital forensics challenge that required developing Natural Language Processing (NLP) models to predict an author’s identity from conversational-style text messages. To tackle this challenge, the team applied several Large Language Models (LLMs) to predict the author of each message.
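
The article does not detail the team’s models, but for readers curious about the general technique, here is a minimal, illustrative sketch of one common authorship-attribution approach: embed each message with a pretrained sentence encoder and train a lightweight classifier over the embeddings. It assumes the sentence-transformers and scikit-learn libraries; the model name, example messages and author labels are placeholders, not material from the challenge.

```python
# Hypothetical authorship-attribution baseline: transformer embeddings + linear classifier.
# This is a sketch of one generic approach, not the solution entered in Challenge AI.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Placeholder data: short conversational messages labelled with an author ID.
messages = [
    "omg did u see that?? lol",
    "I shall be there at six o'clock sharp.",
    "brb grabbing coffee",
    "Kindly confirm receipt of the attached file.",
] * 25  # repeated only so this toy example has enough samples to split
authors = ["author_a", "author_b", "author_a", "author_b"] * 25

# Encode each message into a fixed-size vector with a pretrained sentence encoder.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
X = encoder.encode(messages)

X_train, X_test, y_train, y_test = train_test_split(
    X, authors, test_size=0.2, random_state=42, stratify=authors
)

# A simple linear classifier over the embeddings predicts the most likely author.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
print("predicted author:", clf.predict(encoder.encode(["c u at 6 lol"]))[0])
```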

The team are thrilled with the outcome, and the experience was invaluable for exploring new techniques at the forefront of LLM development. The solutions generated in this competition will help inform UK defence strategies, particularly in digital forensics, ensuring that the UK defence sector stays at the forefront of technological advances.

Our data scientists are applying their advanced NLP expertise in other projects too. For example, in the EU-funded CYBERSPACE project, the team are developing NLP solutions to enhance Law Enforcement Agencies’ digital forensics capabilities for tackling cybercrime.

If you’d like to read more about the project, click here.

Insights | Cybersecurity

New guidelines and recommendations for policymakers and stakeholders in the context of Security & Privacy (S&P) tools

Lack of regulatory enforcement and certification methods, inappropriate software practices, and bad user habits all combine to increase a user’s exposure to security and privacy (S&P) threats when using digital services. The TRUSTaWARE project aimed to address this situation by providing actionable intelligence and tools to minimise the impact of cyberthreats, encourage trustworthy S&P digital products in compliance with regulation, and empower users to exercise control over their own security and privacy online.

Trilateral’s role in the project was to analyse its socio-economic, ethical, legal and standardisation aspects to maximise the positive impact of the TRUSTaWARE cyber S&P protection tools. As a result of this work, Trilateral, in collaboration with all TRUSTaWARE partners, has released a set of guidelines and best practices for developers, IT organisations, policymakers and other relevant stakeholders to support the development, implementation and regulation of user-friendly S&P tools.

Some recommendations for policymakers include the need for:

  • Prioritising “by-design” methodologies: ensuring that “by-design” methodologies such as privacy by design, ethics by design and security by design are standardised parts of technology development policy. This guarantees the early and consistent integration of ethical principles into the design, development and eventual deployment of new technologies, which is key to maximising the overall social benefits while minimising the potential harms.
  • Defining parameters for measuring the impact of S&P tools: defining clear parameters for measuring the impact of developed S&P tools both during and after the development phase.
  • Creating universal usability standards for S&P tools: creating usability standards for S&P tools tailored to vulnerable persons and those with reduced digital literacy. These standards should be universally applied to S&P tools, even where vulnerable people are not the target audience.

Some practical guidelines for stakeholders developing and promoting user-focused S&P tools include the need for:

  • Addressing the impact of S&P tools: addressing the legal, ethical, security and privacy concerns that arise with digital tools promoting S&P.
  • Monitoring impact through the research and design (R&D) process: monitoring the impact of S&P tools throughout the research and design process and iterating when an S&P tool has been shown to have adverse effects.
  • Being understandable and accessible: when providing S&P advice or guidance, or alerting users to S&P threats, information should be explained in a transparent, understandable and accessible manner that holds those providing the information accountable.

See our infographics for more guidance and recommendations.

Insights | Crisis and Security

Improving ethical assurance for non-university researchers in crisis settings

Social science and humanities research that involves engaging individuals experiencing crisis runs the risk of deviating from democratic norms of equality and inclusivity, and of exposing participants to retraumatisation. Trilateral’s Dr Leanne Cochrane and Dr Orla Drummond have recently co-authored an article in which they assert that such research must always have adequate ethical oversight, arguing for a new model of ethical assurance for research conducted outside the university setting.

Much social science and humanities research takes place outside academia, such as in international organisations, public bodies, non-governmental organisations (NGOs) and private companies. Many of these actors commission, conduct or sub-contract research activities involving human participants, especially during times of crisis, when researchers scramble to produce the new knowledge needed to respond to and prevent further harm.

In their article, Leanne and Orla identify a gap in ethical assurance for research taking place in crisis and emergency settings, such as pandemics, humanitarian or other fragile contexts, especially where non-university researchers engage with survivors and affected individuals. Ethical assurance is the independent examination of the ethical practices that guide research and that guarantee participants’ privacy, their voluntary and informed choice to participate, and the option to opt out without consequence at any point in the research.

The article also looks at some of the concerns of university Research Ethics Committees related to ethical oversight for non-university research in crisis settings and argues for a new model of ethical assurance which draws upon democratic norms of equality and inclusivity.

Read the full article here.

Insights | Climate, Energy and Environment

Defining the conditions for responsible research into Solar Radiation Modification (SRM)

With climate impacts becoming ever more present and tangible, there is a growing debate around additional ways to alleviate them, including the use of technologies that act on the Earth’s climate system, such as Solar Radiation Modification (SRM).

Trilateral is leading an analysis of legal and governance aspects of SRM research, as part of Co-CREATE, an EU-funded project that is examining the principles and guidelines for a possible governance framework for responsible SRM research.

SRM refers to a set of techniques that aim to limit global warming by reducing the amount of solar radiation reaching the Earth’s surface (i.e. reflecting sunlight or heat back into space). Example techniques include injecting reflective aerosols into the Earth’s stratosphere or into marine clouds. Whilst intended to alleviate some of the impacts of climate change, SRM comes with its own risks, such as unintended changes to climate and ecosystems. A comprehensive scientific basis for, and understanding of, the impacts of large-scale SRM deployment is currently lacking.
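
To illustrate the basic principle only, the sketch below uses the standard zero-dimensional planetary energy-balance model to show how a small increase in the fraction of reflected sunlight (albedo) lowers the Earth’s effective emission temperature. This is a simplified textbook illustration with round-number inputs, not an analysis or result from the Co-CREATE project.

```python
# Toy zero-dimensional energy-balance illustration of the SRM principle:
# reflecting more sunlight (higher albedo) lowers the effective emission temperature.
# Simplified textbook model for illustration only, not a Co-CREATE result.

SOLAR_CONSTANT = 1361.0   # incoming solar irradiance, W/m^2 (approximate)
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

def effective_temperature(albedo: float) -> float:
    """Equilibrium emission temperature from (1 - albedo) * S / 4 = sigma * T^4."""
    absorbed = (1.0 - albedo) * SOLAR_CONSTANT / 4.0
    return (absorbed / SIGMA) ** 0.25

baseline = effective_temperature(0.30)   # roughly Earth's present planetary albedo
brighter = effective_temperature(0.31)   # a one-percentage-point increase in reflectivity

print(f"baseline effective temperature: {baseline:.1f} K")
print(f"with slightly higher albedo:    {brighter:.1f} K")
print(f"difference:                     {baseline - brighter:.2f} K")
```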

From international environmental law and the law of the sea to space law, intellectual property law and indigenous rights law, researchers in Trilateral’s Climate cluster are investigating how existing legal frameworks apply to SRM research. This in-depth legal analysis will feed into the broader research question of defining the conditions for responsible SRM research. This work is crucial to facilitating informed discussions about SRM and preventing hasty or unilateral deployment of a technology that is not fully understood. Ultimately, responsible SRM research may help weigh the risks of SRM deployment against the risks of harm caused by anthropogenic climate change.

Read the latest news from the Co-CREATE project here.

Insights | Ethics, Human Rights and Emerging Technologies

Re-defining the ecosystem of trust in science: the role of “Stewards of Trust” in changing research environments

This month, our team members Dr Agata Gurzawska and Dr Evren Yalaz represented the EU-funded VERITY project at the World Conference on Research Integrity (WCRI) 2024 in Athens, Greece. VERITY aims to enhance trust in science by rebuilding the relationship between science and society.

At the conference, Dr Agata Gurzawska participated in a symposium titled “On the implications of research integrity for public trust in academic research.” This session, held in collaboration with the IANUS and POIESIS projects, explored the relationship between public trust in research and key factors like research integrity and ethics, open science and co-creation, and science communication. Dr Gurzawska’s presentation focused on “Re-defining the ecosystem of trust in science” and the role of “Stewards of Trust” in changing research environments. Her presentation featured preliminary findings from the VERITY project on addressing trust in science and its challenges for different stakeholders. A discussion with the audience followed to expand on the implications of these findings for policy and global contexts.

At the event, Dr Gurzawska and Dr Yalaz also presented a poster, showcasing key findings from the VERITY project on how to build trust in science. Some key takeaways from their poster presentation include the need for:

  • Forming Trusted Partnerships
    • This includes amplifying the messages of experts in the field and using their expertise to explain the latest scientific understanding against conflicting evidence.
  • Fostering Collaboration
    • This highlights the importance of fostering open collaboration with citizens, scientists, policymakers, and educators to discuss research findings and concerns.
  • Diversifying Participation
    • This means focusing on involving underrepresented groups, such as women and Global South researchers, and actively recruiting participants from these groups for research activities.
  • Advocating for Meaningful Participation
    • Encouraging active and meaningful citizen participation that is not tokenistic or merely consultative, but genuinely impactful.

Learn more about the project here.

Insights | Health

Building trust in health data sharing: perspectives from patient advocates

Researchers and technology developers constantly seek innovative approaches to make their work more effective. In the healthcare sector, much of this innovation centres on personal health data, often used in AI tools to improve medical care. But does the pursuit of innovation risk driving a wedge between researchers and the patients they seek to support?

Trilateral’s Health Cluster recently facilitated a workshop with patient advocates at the MPNEconsensus 2024 conference. The group discussed the proposed legal framework for the European Health Data Space, which aims to promote the exchange of electronic healthcare data to support research, healthcare services, and policymaking. Following the discussion, the patient advocates were asked to reflect on what health data means to them. While some focused on the promise of this approach, an undercurrent of cynicism ran through many of the group’s responses, which associated health data with words such as “weapon” and “power.”

In a blog about the event, Trilateral’s Zita McCrea reflected on this gap between research priorities and patient concerns. In an era when social media companies openly grant third-party advertisers access to users’ data, discussions about sharing highly sensitive personal data raise alarm bells among citizens. As such, researchers, healthcare professionals and policymakers need to communicate the details of regulations addressing sensitive data and emerging technologies more clearly, foregrounding concerns about privacy, consent and patient rights. Without this approach, distrust may prevent new technologies from being adopted.

The workshop was organised as part of the EU-funded iToBoS project, which seeks to develop an AI- and data-driven tool for early diagnosis of melanoma.

You can read Zita’s blog about the event here.

If you’d like to find out more about the groundbreaking research and development we’re involved in, visit our website. If you’d like to find out more about how we could support your organisation with research and development, get in touch.

 
