CESIUM Application
Safeguarding children with ethically designed technology.
In a 2022 validation exercise, CESIUM identified 16 vulnerable children up to six months before they were referred. Findings from a 2022 validation workshop forecast at least a 400% capacity gain with a multi-agency deployment of CESIUM.
CESIUM transforms safeguarding decision-making with new levels of protection for children who are vulnerable to exploitation.
Child exploitation is among the most hideous crimes in our society. By leveraging the data held by safeguarding partnerships and securely sharing insights among partners, CESIUM augments multi-agency decision-making by identifying and prioritising vulnerable children.
The Problems
Serious incident notifications and high-profile inquiries, including the inquiries into the deaths of Victoria Climbié and Peter Connelly, have cited problems with the exchange of critical information [source]
Challenges with identifying and risk-assessing vulnerable children
In more than 60% of the incidents, the child protection system had previously identified the children as vulnerable. Despite that identification, the system was unable to prevent their death or serious harm. Practice learning from national and local reviews in 2020 shows continuing weaknesses in risk assessment and decision-making, often the result of agencies not sharing information. This is a perennial theme of many historical inquiries into children who have died. [source]
Current safeguarding practices are operationally inefficient
Data sharing, analysis and record keeping create an administrative burden which reduces the focus on safeguarding. Current practices rely upon general-purpose tools like spreadsheets and word processors to collate and analyse data. In addition to the administrative burden, general-purpose tools also mask the insights that modern AI has to offer [source]
Vulnerable children are hidden within data silos
Many reports and policy documents identify the criticality of data sharing for safeguarding children. Nevertheless, too often, data is held in siloed databases and spreadsheets. Vulnerable children then become more difficult to identify. While data sharing raises legitimate concerns, reviews repeatedly find data silos increase the risk of exploitation [source]
People are often an afterthought
While technology presents new opportunities for risk assessment, humans remain integral to system design. A system’s outputs need to be communicable to analysts and decision makers while being testable for biases. However, too often, humans are not considered. Safeguarding systems, therefore, must be designed around effective human-machine interaction [source]
Improving digital literacy among safeguarding professionals
For defensible risk assessment using AI, practitioners require basic digital literacy to understand and critique a machine’s output. Just as practitioners may critique a colleague’s assessment, they must be able to question why a machine has generated a particular output. AI, however, is too often mystified and difficult to understand [source]
Algorithms are not transparent enough for effective decision-making
ICO guidance recommends high algorithmic transparency in decision-making to overcome the problems with opaque systems [source]
Features
CESIUM augments professional judgement with evidence-based insights for safeguarding decision-making.
Subject Profile: Assured access to multi-agency records
CESIUM merges different records about the same child from multiple databases. Role-based access ensures safeguarding partners can search for and only view the records they are authorised to access. Access rights are changeable in response to a child’s evolving needs. This single view reduces the administrative burden of data sharing.
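To make the idea concrete, the sketch below shows one way role-based filtering over merged multi-agency records could look in code. The record fields, agency names and practitioner model are hypothetical, used only for illustration; they do not describe CESIUM’s data model or API.

```python
# Minimal sketch of role-based access over merged records.
# Fields, agencies and the practitioner model are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Record:
    child_id: str
    source_agency: str   # e.g. "police", "social_care", "education"
    summary: str

@dataclass
class Practitioner:
    name: str
    authorised_agencies: set = field(default_factory=set)

def merged_profile(records, child_id, practitioner):
    """Return only the records about one child that this practitioner may view."""
    return [
        r for r in records
        if r.child_id == child_id and r.source_agency in practitioner.authorised_agencies
    ]

records = [
    Record("c-001", "police", "Missing episode reported"),
    Record("c-001", "social_care", "Early help assessment opened"),
]
analyst = Practitioner("analyst-1", authorised_agencies={"police"})
print(merged_profile(records, "c-001", analyst))  # only the police record is visible
```

In this model, access rights change simply by updating a practitioner’s authorised agencies as a child’s needs evolve.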
Usage Analytics: Auditable insights into CESIUM’s usage
CESIUM’s usage analytics provide insight into how analysts use the system and maintain an audit trail of data access.
Insights algorithm: Assess the requirement for a safeguarding referral
CESIUM’s insights algorithm assesses whether a child requires a safeguarding referral. The algorithm addresses ethical concerns by outputting an additional rationale to explain its reasoning and enable critical analysis for assured decision-making.
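As a purely illustrative sketch of how a referral score can be paired with a human-readable rationale, the code below trains a simple classifier on synthetic data and reports per-feature contributions. The features, training data and model choice are assumptions for illustration only and do not describe CESIUM’s actual algorithm.

```python
# Illustration only: a referral score accompanied by a per-feature rationale.
# Features, synthetic data and the logistic-regression model are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

FEATURES = ["missing_episodes", "school_absences", "known_risky_associates"]

# Tiny synthetic training set: each row is a child; 1 = a referral was needed.
X = np.array([[0, 2, 0], [3, 10, 1], [1, 1, 0], [4, 15, 2], [0, 0, 0], [2, 8, 1]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

def assess(child):
    """Return a referral score plus per-feature contributions for human review."""
    score = model.predict_proba([child])[0, 1]
    contributions = model.coef_[0] * np.array(child, dtype=float)
    rationale = sorted(zip(FEATURES, contributions), key=lambda kv: -abs(kv[1]))
    return score, rationale

score, rationale = assess([3, 12, 1])
print(f"referral score: {score:.2f}")
for name, contribution in rationale:
    print(f"  {name}: {contribution:+.2f}")
```

The point of surfacing a rationale is that an analyst can challenge it, just as they would challenge a colleague’s assessment.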
Pre-screening insights: Identify and prioritise at-risk children
CESIUM’s insights algorithm enables the pre-screening of children for safeguarding referrals. Pre-screening insights identify children who may have otherwise been missed.
Explainability insights: Critically analyse machine learning outputs
CESIUM enables critical analysis of its outputs with explainability insights. We aim to lower the barrier to entry into complex analytics.
Case history timeline: Understand a child’s lived experience
CESIUM visualises a child’s case history in a timeline to provide insight into their lived experience. The timeline provides quick and contextual insight into the events leading to a child’s referral.
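A minimal sketch of the underlying idea, assuming each recorded event carries a date, a source agency and a short description (all values below are fictional):

```python
# Order multi-agency events chronologically to read a child's case history as a timeline.
# The event fields and examples are illustrative assumptions.
from datetime import date

events = [
    {"date": date(2021, 3, 4), "agency": "education", "event": "Persistent absence flagged"},
    {"date": date(2020, 11, 19), "agency": "police", "event": "Missing episode reported"},
    {"date": date(2021, 6, 1), "agency": "social_care", "event": "Referral received"},
]

for e in sorted(events, key=lambda e: e["date"]):
    print(f'{e["date"]}  [{e["agency"]:>11}]  {e["event"]}')
```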
Associates network: Understand relationships and identify threats
CESIUM visualises a child’s social network in an associate graph. The graph presents recorded information about a child’s associates and the nature of their relationships.
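For illustration, the sketch below builds a small associate graph with a general-purpose graph library and inspects a child’s direct and second-degree connections; the people, relationship labels and traversal are fictional and are not CESIUM’s analytics.

```python
# Illustrative associate graph: nodes are people, edges carry a relationship label.
# networkx is used here as a stand-in; names and relationships are fictional.
import networkx as nx

G = nx.Graph()
G.add_edge("Child A", "Adult X", relationship="frequent contact")
G.add_edge("Child A", "Child B", relationship="sibling")
G.add_edge("Adult X", "Adult Y", relationship="co-offender")

# Direct associates and how they are linked to the child.
for neighbour in G.neighbors("Child A"):
    print(f'Child A -- {neighbour}: {G.edges["Child A", neighbour]["relationship"]}')

# People reachable within two hops, a simple way to surface indirect threats.
within_two_hops = set(nx.single_source_shortest_path_length(G, "Child A", cutoff=2)) - {"Child A"}
print("within two hops:", sorted(within_two_hops))
```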
Text analytics: Identify harm in return home statements
CESIUM uses text analytics to identify potential harm in return home statements. Highlights help analysts more quickly identify a child’s experience of harm.
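As a deliberately simplified sketch of the highlighting idea, the snippet below marks a fixed list of indicator phrases in a statement. CESIUM’s text analytics are more sophisticated; the indicator terms and the example statement are purely illustrative.

```python
# Toy keyword highlighter: wrap potential harm indicators so an analyst spots them quickly.
# The indicator list and the statement are illustrative assumptions.
import re

HARM_INDICATORS = ["older male", "given alcohol", "hotel", "threatened", "unknown address"]

def highlight(statement, indicators=HARM_INDICATORS):
    """Return the statement with any indicator phrase wrapped in ** markers."""
    pattern = re.compile("|".join(re.escape(term) for term in indicators), re.IGNORECASE)
    return pattern.sub(lambda m: f"**{m.group(0)}**", statement)

statement = "She was collected by an older male, taken to a hotel and given alcohol."
print(highlight(statement))
```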
Making CESIUM Ethical

Preserving human dignity
CESIUM’s algorithmic outputs, insights and digital literacy training seek to mitigate the risk of de-humanising children through quantification and datafication.

Mitigating unintended discrimination
CESIUM’s algorithmic insights uncover potential biases and blind spots in machine learning algorithms that lead to discriminatory decision-making.

Promoting dialogue
CESIUM is co-designed to promote dialogue between safeguarding professionals.
Our Data Protection team leads CESIUM’s privacy-by-design journey, ensuring compliance with data protection requirements and adherence to good practice. Partnerships are governed by data sharing agreements (DSAs) that set out the purpose of data sharing, regulate what happens to the data during processing, and define the roles and responsibilities of the parties involved. The Data Protection team co-authored CESIUM’s DSA with safeguarding partners to enable the responsible sharing of case data. A data protection impact assessment (DPIA) is a process that helps identify and minimise data protection risks; CESIUM’s DPIA mandates the controls needed to mitigate the high risk of sharing sensitive data about children. Together, the DSA and DPIA govern CESIUM’s implementation and operation. The result is a secure system that enables the assured multi-agency sharing and analysis of safeguarding data.
Trilateral Research’s cybersecurity team, in conjunction with Amazon Web Services, has implemented a secure environment for protecting safeguarding data. We align with the National Cyber Security Centre’s self-assessment guidance, and CESIUM is subject to annual penetration testing. Identity and Access Management ensures dynamic access to data, whereby practitioners can only access the data they are entitled to view. Each measure ensures tight controls while preserving a fluid user experience.
Trilateral Research’s Ethics Innovation team delivers CESIUM’s ethics-by-design approach through an ethical impact assessment (EIA). Trilateral’s ethics-by-design approach puts humans at the centre of technological development. As such, an EIA prioritises the proper use and understanding of Trilateral Research’s technology, with the aim of promoting trust and delivering societal benefit. The ethics work incorporated into CESIUM’s EIA sought to identify ethical values and fundamental rights and to integrate their protection and promotion into the system’s development. The team employed an agile, iterative method and worked closely with end users to understand their needs, communicate ethical priorities and translate ethical values into functional requirements. The result is a system that supports and maintains fundamental human rights.
Trilateral’s Innovation & Research team ensures that the latest research in ethical AI and law enforcement feeds into the design and development of the application. Subject matter experts and ethicists continue to support CESIUM’s wider innovation efforts, considering how to advance network analysis insights into child exploitation and organised crime.
CESIUM Ecosystem
Today’s societal problems are complex and multi-faceted. Technology has the power to help in incredibly valuable ways, but we believe it’s people who are (and should remain) at the heart of solving our biggest problems and making the right decisions. That’s why our ethical AI products are embedded in an ecosystem of sociotechnical services, from data protection and privacy to in-depth subject matter expertise.
Latest News
CESIUM shines a light on the hidden exploitation of children

What Are The Risks Of Not Sharing Data For Safeguarding Children?

CESIUM: Innovation in child safeguarding – An interdisciplinary journey

Leading the way in protecting the most vulnerable – interview with Chief Superintendent Jon McAdam from Lincolnshire Police

The Interplay of Explicit and Tacit Knowledge with Automated Systems for Safeguarding Children

Purchase Options
UK Public Sector

UK public sector organisations can acquire CESIUM through the government’s G-Cloud marketplace.
Schedule a demo
Interested in understanding how CESIUM can help your organisation? Schedule a demo here.
CESIUM Success Stories
Chief Superintendent Jon McAdam
from Lincolnshire Police
We have been on a fascinating co-design journey with Trilateral Research over the last three years to develop CESIUM. There have been significant benefits of using a research and development company committed to working with a public sector service such as Lincolnshire Police. With a shared vision of safeguarding vulnerable children from harm, this has been an exciting journey, with the findings now demonstrating how a platform such as CESIUM can maximise our operational capability as a single agency and in a partnership setting. We now look forward to the next steps of further integrating CESIUM into our operational processes to fully realise the potential of the system and to enhance our safeguarding arrangements.
Chris Todd
from PSNI
London today with @Trilateral_UK team, @lincspolice & #NWG, a 3rd sector org focusing on child protection. Discussing ways of protecting vulnerable children. So refreshing to work with a tech provider that so fundamentally embraces #DataEthics at its heart. #DataAnalytics