Child exploitation is among the most hideous crimes in our society. By leveraging the data held by safeguarding partnerships through the secure sharing of insights among partners, CESIUM augments multi-agency decision-making, helping to identify and prioritise vulnerable children.
A number of recommendations in the 2022 Telford inquiry into child sexual exploitation focused on the importance of data sharing (source)
Many reports have identified long-standing problems with current approaches to risk assessment. Practitioners make sensitive judgements on crucial issues under conditions of uncertainty, often under considerable pressure, and based on information that is incomplete or changing rapidly. Moreover, an over-reliance on intuition risks embedding biases [source]
Data sharing, analysis and record keeping create an administrative burden that reduces the focus on safeguarding. Current practice relies on general-purpose tools such as spreadsheets and word processors to collate and analyse data. Beyond the administrative burden, these general-purpose tools also mask the insights that modern AI has to offer [source]
Many reports and policy documents identify data sharing as critical for safeguarding children. Nevertheless, data too often sits in siloed databases and spreadsheets, making vulnerable children more difficult to identify. While data sharing raises legitimate concerns, reviews repeatedly find that data silos increase the risk of exploitation [source]
While technology presents new opportunities for risk assessment, humans remain integral to system design. A system’s outputs must be communicable to analysts and decision-makers while remaining testable for biases. Too often, however, the human element is overlooked. Safeguarding systems must therefore be designed around effective human-machine interaction [source]
For defensible risk assessment using AI, practitioners require basic digital literacy to understand and critique a machine’s output. Just as they may critique a colleague’s assessment, they must be able to question why a machine has produced a particular output. Too often, however, AI is mystified and difficult to understand [source]
ICO guidance recommends a high degree of algorithmic transparency in decision-making to overcome the problems of opaque systems [source]
CESIUM augments professional judgement with evidence-based insights for safeguarding decision-making.
CESIUM’s algorithmic outputs, insights and digital literacy training seek to mitigate the risk of de-humanising children through quantification and datafication.
CESIUM’s algorithmic insights uncover potential biases and blind spots in machine learning algorithms that could lead to discriminatory decision-making.
CESIUM is co-designed to promote dialogue between safeguarding professionals.
Today’s societal problems are complex and multi-faceted. Technology has the power to help in incredibly valuable ways, but we believe it’s people who are, and should remain, at the heart of solving our biggest problems and making the right decisions. That’s why our ethical AI products are embedded in an ecosystem of sociotechnical services, from data protection and privacy to in-depth subject matter expertise.
Our Data Protection team leads CESIUM’s privacy-by-design journey, ensuring full compliance and adherence to good practice. Partnerships are regulated by data sharing agreements (DSAs) that set out the purpose of data sharing, regulate what happens to the data during processing, and outline the roles and responsibilities of the parties involved. The Data Protection team co-authored CESIUM’s DSA with safeguarding partners for the responsible sharing of case data. A data protection impact assessment (DPIA) is a process for identifying and minimising data protection risks; CESIUM’s DPIA mandates the controls necessary to mitigate the high risk of sharing sensitive data about children. Together, the DSA and DPIA govern CESIUM’s implementation and operation. The result is a secure system that enables the assured multi-agency sharing and analysis of safeguarding data.
Trilateral Research’s cybersecurity team, in conjunction with Amazon Web Services, has implemented a secure environment for protecting safeguarding data. We align with the National Cyber Security Centre’s self-assessment, and CESIUM is subject to annual penetration testing. Identity and Access Management (IAM) provides dynamic access control, so practitioners can access only the data they are entitled to view. Each measure ensures the tightest of controls while preserving a fluid user experience.
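As a purely illustrative sketch of the entitlement idea (CESIUM’s real access controls sit on AWS IAM and are not public; the roles and record types below are invented for the example), access can be reduced to a permission lookup that denies by default:

```python
# Illustrative only: a minimal role-based access check, not CESIUM's
# actual IAM implementation. Roles and record types are hypothetical.
ROLE_PERMISSIONS = {
    "social_worker": {"case_notes", "risk_scores"},
    "police_analyst": {"risk_scores", "network_links"},
}

def can_view(role: str, record_type: str) -> bool:
    """Return True only if the role is entitled to view this record type.

    Unknown roles get an empty permission set, so access is denied by default.
    """
    return record_type in ROLE_PERMISSIONS.get(role, set())

print(can_view("social_worker", "case_notes"))   # True
print(can_view("police_analyst", "case_notes"))  # False
```

The deny-by-default lookup is the essential property: a practitioner sees a record only when an explicit entitlement exists, which is what allows tight control to coexist with a fluid user experience.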
Trilateral Research’s Ethics Innovation team delivers CESIUM’s ethics-by-design approach through an ethical impact assessment (EIA). Trilateral’s ethics-by-design approach puts humans at the centre of technological development: an EIA prioritises the proper use and understanding of Trilateral Research’s technology, with the aim of promoting trust and benefiting society. The ethics work incorporated into CESIUM’s EIA identified ethical values and fundamental rights and integrated their protection and promotion into the system’s development. The team employed an agile, interactive method, working closely with end users to understand their needs, communicate ethical priorities and translate ethical values into functional requirements. The result is a system that supports and upholds fundamental human rights.
Trilateral Research’s Sociotech Insights Group (SIG) provides sociological expertise for developing CESIUM’s text analytics. The text analytics algorithm employs a reference dataset within a text classification architecture: SIG experts compiled a range of relevant texts and created a labelling schema for the reference dataset. The result is an augmented analyst experience of return home statements that surfaces the right insights more quickly.
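As a hypothetical illustration of the reference-dataset idea (CESIUM’s actual text analytics model, labels and reference texts are not public; everything below is invented), a labelled reference set can drive even a very simple bag-of-words classifier:

```python
import re
from collections import Counter

# Hypothetical labelled reference set standing in for the expert-compiled
# reference dataset and labelling schema described above.
reference = [
    ("went missing overnight, found at a known address", "missing_episode"),
    ("older male offering gifts and lifts", "grooming_indicator"),
    ("returned home safe, no concerns raised", "no_concern"),
]

def tokenise(text: str) -> list[str]:
    return re.findall(r"[a-z]+", text.lower())

# Build a bag-of-words profile per label from the reference dataset.
profiles: dict[str, Counter] = {}
for text, label in reference:
    profiles.setdefault(label, Counter()).update(tokenise(text))

def classify(statement: str) -> str:
    """Score each label by token overlap with its reference profile."""
    tokens = tokenise(statement)
    scores = {label: sum(counts[t] for t in tokens)
              for label, counts in profiles.items()}
    return max(scores, key=scores.get)

print(classify("he was offering gifts to the child"))  # grooming_indicator
```

A production system would use a trained statistical model and a far larger expert-labelled corpus; the point of the sketch is only that labelled reference texts are what anchor the classification of free-text statements.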
Trilateral’s Innovation & Research team ensures that the latest research in ethical AI and law enforcement feeds into the design and development of the application. As part of wider innovation efforts, subject matter experts and ethicists continue to explore how to advance network analysis insights into child exploitation and organised crime for CESIUM.
UK public sector organisations can acquire CESIUM through the government’s G-Cloud marketplace.
Interested in understanding how CESIUM can help your organisation? Schedule a demo here.
In choosing to partner with Trilateral Research and the National Working Group, it was apparent that we are all committed to collective working to produce a tool that can be developed to a localised level, but at the same time understanding the wider potential for law enforcement and our partner agencies
London today with @Trilateral_UK team, @lincspolice & #NWG, a 3rd sector org focusing on child protection. Discussing ways of protecting vulnerable children. So refreshing to work with a tech provider that so fundamentally embraces #DataEthics at its heart. #DataAnalytics