Get the latest insights from Trilateral in our new monthly article, featuring the latest developments from across our innovation and research teams.
Wildfire risk management – collaborations for a more resilient future
As climate change heats up, so does wildfire risk across Europe. There are many projects and organisations tackling wildfire risk, but they often work in isolation.
FIRELOGUE, one of Trilateral’s leading projects in the climate space, is bringing together experts to share knowledge and create best practices for wildfire prevention.
Last November, Trilateral researchers attended the project’s 2nd Wildfire Risk Management Clustering Event, organised in collaboration with the EU Directorate-General for European Civil Protection and Humanitarian Aid Operations. With seven wildfire-prevention initiatives in attendance, Trilateral researchers met with a variety of stakeholders working on wildfire prevention and management — from researchers to front-line responders and European Union policymakers. Common themes of discussion included engaging with local communities and strengthening justice in wildfire responses, effective communication strategies, and creating a resilient future that incentivises nature-based solutions for wildfires.
Our researchers will use the insights and lessons learned from the event in our future workshops and recommendations for wildfire risk management.
To learn more about how Trilateral is facilitating dialogues for better disaster prevention from the frontline to policy level, visit https://trilateralresearch.com/work/firelogue
An inclusive approach to ‘citizen’ engagement in crisis and security research – and why it matters
It is vital to hear from citizens when developing crisis and security technologies. This ensures that the concerns of the people most affected by the technologies inform their development at an early stage. Yet citizens are notoriously difficult to engage in security research — a challenge that the TRANSCEND project, coordinated by Trilateral’s Crisis & Security cluster, aims to address.
In a recent blog, our team members Dr Richa Kumar and Dr Beki Hooper set out how the first step in engaging citizens well is to understand the concept of ‘citizens’ inclusively. This means detaching the term from its legal meaning and simply considering the impact of the technologies on everyday people, ensuring that individuals on the periphery of established civil society are more likely to be included.
Learn more about our approach to the concept of citizen and our work in engaging citizens in crisis and security technology research here.
Digital assistive technologies – can they help persons with disabilities become more independent?
Last year, we participated in the EU-Arab Cooperation Forum on the Rights of Persons with Disabilities, which aims to facilitate the inclusion of people with disabilities in societies across the EU and Arab region by encouraging cooperation between governments, NGOs, and researchers. Trilateral’s Health cluster was represented by Senior Research Analyst Tally Hatzakis, who attended the virtual proceedings and contributed a study to the conference’s final report. The study departed from previous research by seeking direct input from people with disabilities about which technologies would most effectively enable them to live an independent life in their own communities. Our survey found that people with disabilities were willing to use a broad range of assistive technologies, so long as the tools responded to their actual needs and they felt comfortable operating them. The most popular options included wearables, AI alerts, assistive robots, and autonomous wheelchairs and vehicles.
Our contribution to the conference report was informed by the EU-funded TRIPS project and produced in collaboration with TRIPS partners. TRIPS designed accessible public transport systems for European cities in collaboration with persons with disabilities, experts, and NGOs.
View the conference report here.
Operationalising Ethics in AI – Sign up for Trilateral’s course
Recently, Trilateral developed a course on “Operationalising Ethics in AI” for the Innovate UK BridgeAI programme, hosted by the Alan Turing Institute. The course was the most popular on the platform, and following its success, we were asked to lead eight live training sessions on the same topic.
Emily Maitland, of Trilateral’s DARSI team, kicked off the first batch of sessions in December with a lesson on AI bias, fairness, and explainability directed at practitioners in the creative industries sector. Together, the eight sessions, all led by Trilateral researchers, aim to educate participants on the concept of ethics in relation to transparency and fairness in AI, enable them to describe different conceptualisations of fairness and transparency, and teach them about the algorithmic tools and technologies needed to accommodate fairness and transparency.
If you’d like to enrol in this course, you can do so here.
Exploring the Boundaries of Research Ethics: Insights and Recommendations from the BEYOND project
On 7 September 2023, Research Manager Ian Slesinger from Trilateral Research conducted a workshop with Susanne van den Hooff from the University of Humanistic Studies at the 2023 ENRIO Congress on Research Integrity Practice in Paris. The workshop focused on factors contributing to research ethics and integrity (REI) and the causes of research misconduct — topics being explored in the EU-funded BEYOND project. The findings from the workshop shed light on the challenges faced by researchers and practitioners in balancing universal principles of research ethics with the diverse contexts of REI policy and education.
This workshop, and an in-depth literature review, shaped Trilateral’s report on the current landscape of REI, research misconduct and questionable research practices. Some of our key findings include that environmental factors significantly shape how and if research is conducted responsibly; that early-career researchers face a higher risk of making mistakes; and that there’s no widely agreed-upon view regarding gender differences as risk factors.
Our recommendations for universities and experts in the field include the development of user-friendly reporting mechanisms, international collaboration for due diligence, the establishment of formal rehabilitation processes for researchers involved in research misconduct, and further exploration of its connection with mental health.
Responsible Research and Innovation – recommendations for ethical high-tech policing tools
The EU-funded DARLENE project concluded at the end of 2023, after three years exploring how AI-based Augmented Reality technologies can revolutionise law enforcement operations. The project combined wearable smart glass technology with powerful computing techniques to provide real-time enhancements to law enforcement personnel as they assess dangerous situations and tackle offenders. Our work in the project ensured that ethical and legal considerations were fundamental, with public buy-in vital to the sustainable development of high-tech policing tools.
DARLENE marked its end with a series of research briefs on key technical and ethical aspects of the project. In our brief on responsible innovation, we emphasise how researchers should promote the ethical development of new AI technologies and translate responsible development from theory into practice. We provide useful background on EU AI regulation, recommending that risk assessments should be incorporated into the earliest stages of technology design. This allows developers to understand and mitigate any hazards, improve design, and ultimately demonstrate compliance with new standards.
For more insights, read our full brief here.
The hidden costs of cybercrime
When we hear about the effects of cybercrime, we often think in terms of the financial costs. However, the costs of cybercrime go way beyond the financial.
Last year, Dr Richa Kumar teamed up with Trilateral’s Chief Research Officer Dr David Wright to co-author a publication discussing the non-financial costs that should be considered when assessing the cost of cybercrime. The article builds on research we conducted in the EU-funded CC-DRIVER project, which was coordinated by Trilateral and aimed to understand the drivers of cybercriminality and develop research methods to support law enforcement agencies and policymakers in preventing, investigating, and mitigating cybercriminal behaviour.
The methodology we developed to assess the socio-economic impacts of cybercrime shows that many of the hidden costs of cybercrime tend to be social and, while they might be difficult to quantify, they are nevertheless real. For example, individual victims of cybercrime attacks often experience anxiety, stress, and fear of being victimised again, while communities suffer from a lack of trust and social polarisation.
With a more accurate assessment of the costs of cybercrime, policymakers and law enforcement authorities will be better able to prioritise combatting cybercrime and decrease its deleterious effects on vulnerable populations.
For more insights, read the full article here.
If you’d like to find out more about the ground-breaking research and development we’re involved in, visit our website. If you’d like to find out how we could support your organisation with research and development, get in touch.