CESIUM shines a light on the hidden exploitation of children

Reading Time: 4 minutes

Authors:  

Dr Hayley Watson | Director, Sociotech Innovation
Dr Stephen Anning | Product Manager

Date: 17 October 2022

In a recent validation test, CESIUM identified 16 children several months before they were referred by safeguarding partners.

A child being exploited has no voice, lives in fear of those exploiting them, and has little hope of finding a way out; they can depend only upon responsible adults in their community to recognise the signs and intervene on their behalf. Often, though, the adults around them are oblivious to the exploitation, fearful of trying to intervene, or are the perpetrators themselves. The symptoms of this exploitation are rarely overt or easy to see, requiring the intervention of well-trained but over-extended safeguarding professionals.

Over the course of our work to co-design CESIUM with Lincolnshire Police, we have gained new insights into identifying vulnerable children.

The complex environment for safeguarding children from exploitation 

Child exploitation is a complex challenge to tackle, in part because of its inter-relationships with other crime types and the lack of clear information about those crimes. The Commission on Young Lives reports that social services identified 12,720 children in England during 2020/2021 who were vulnerable to criminal exploitation by gangs. Further, it noted that school referrals to social services “fell from 117,010 in pre-Covid 2019/20 to 81,180 in 2020/21 when there were two school lockdowns – a drop of 31%”. The pandemic further obscured the true scale of vulnerability among children, as valuable reporting information from safeguarding partners was lost.

The Children and Social Work Act 2017 reformed the delivery of multi-agency services for child safeguarding. This legislation abolished Local Safeguarding Children Boards and, along with the 2018 Working Together to Safeguard Children strategy, established statutory obligations for safeguarding, giving responsibilities to each “safeguarding partner”, which, in relation to a local authority area in England, means:

  1. the local authority, generally the children’s services
  2. a clinical commissioning group lead
  3. the Chief Constable of a Police Force

For the UK’s 134 Safeguarding Children Partnerships, data sharing is critical to the ability to identify vulnerable children and intervene accordingly. The 2020 National Child Safeguarding Practice Review Panel report identifies the consequences of weak information sharing:

“Issues such as weak information sharing, communication and risk assessment have, over decades, impeded our ability to protect children and to help families.”

A 2021 review by Sir Alan Wood for What Works for Children’s Social Care posed the following challenge regarding data sharing:

“How can we help hard-pressed stressed professionals to feel empowered to ask for and share information and to inculcate an attitude of sharing and openness in those who control the data and information?”

More recently, the July 2022 Independent Inquiry into child sexual exploitation in Telford responded to “insufficient and ineffective information sharing” by including improvements to data sharing in 15 of its 47 recommendations for reforming safeguarding services.

The problem is clear: Safeguarding Partnerships need the confidence and tools for multi-agency information sharing and decision-making to improve safeguarding arrangements for vulnerable children.

How does CESIUM tackle these safeguarding challenges?

CESIUM is transforming how partnerships safeguard children from exploitation with ethically designed technology. CESIUM responds to the Wood review’s challenge by empowering safeguarding professionals with secure access to multi-agency case data. CESIUM’s ethical AI augments multi-agency decision-making by using that shared data to identify and prioritise vulnerable children. In response to the National Child Safeguarding Practice Review Panel’s findings, this improved information sharing facilitates inter-agency communication and risk assessment. The word ‘augment’ is critical to Trilateral’s ethical approach to technology development. We do not seek to replace an analyst’s professional judgement, which an algorithm can never hope to capture; instead, CESIUM augments that judgement with decision-support tools.

CESIUM’s AI statistically models the typical child referred to a partnership and then scores each new case according to its similarity to that typical referral. These similarity scores are then used to rank all records, producing a prioritisation score for each child. Trilateral Research addresses ethical concerns about bias through algorithmic transparency, whereby the algorithm explains how its outputs are produced; these explanations provide a rationale to support decision-making. For each child, CESIUM therefore records both a similarity score against the typical referral and a prioritisation score against all other cases.
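CESIUM’s actual features and model are not described here, but the general idea of scoring cases against a “typical referral” and ranking them into percentiles can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the data is synthetic, the risk-indicator columns are hypothetical, and the mean-profile with Euclidean-distance similarity is just one simple way such a score could be computed.

```python
import numpy as np

# Illustrative sketch only: CESIUM's real model and features are not public.
# Each row is a case, each column a hypothetical standardised risk indicator
# (e.g. missing-episode count, petty-crime reports, school absence).
rng = np.random.default_rng(0)
cases = rng.normal(size=(500, 3))             # all open records
referred = rng.normal(loc=0.8, size=(60, 3))  # cases previously referred to the partnership

# 1. Model the "typical referral" as the mean profile of referred cases.
typical = referred.mean(axis=0)

# 2. Similarity score: inverse of the Euclidean distance to the typical profile.
distances = np.linalg.norm(cases - typical, axis=1)
similarity = 1.0 / (1.0 + distances)

# 3. Prioritisation score: each case's percentile rank among all records,
#    so analysts can focus on, say, the 95th percentile and above.
ranks = similarity.argsort().argsort()
prioritisation = 100.0 * ranks / (len(cases) - 1)

high_priority = np.flatnonzero(prioritisation >= 95)
print(f"{len(high_priority)} cases at or above the 95th percentile")
```

In practice, and in line with the algorithmic transparency described above, an analyst would see an explanation of how each score was produced rather than just the number.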

The validation test took a snapshot of records from February 2022 and recorded CESIUM’s prioritisation score for each. We then compared these scores against referrals made during the subsequent seven months and identified 16 children who had received high scores months before their actual referral (a simplified sketch of this comparison follows the list below):

  • 47 children referred to the partnership between February and September were in the 95th percentile and above
  • 31 of these children had been previously referred (before February 2022)
  • 8 children had not been previously referred but were referred within the next three months (Feb – May 2022)
  • 5 children had not been previously referred and were then referred 3-6 months later (May – Aug 2022)
  • 3 children had not been previously referred and were then referred more than six months later (after Aug 2022)
  • 16 children (8 + 5 + 3), therefore, were flagged as high priority months before being referred.
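As a rough illustration of how such a retrospective comparison can be tallied, the sketch below uses entirely synthetic records with hypothetical fields (a prioritisation percentile, a previously-referred flag, and a referral date); it is not CESIUM’s data model.

```python
from datetime import date

# Synthetic example records: (child_id, prioritisation_percentile,
# previously_referred, referral_date or None if never referred).
snapshot = date(2022, 2, 1)
records = [
    ("A", 97.2, True,  date(2022, 3, 10)),
    ("B", 96.5, False, date(2022, 4, 2)),
    ("C", 95.8, False, date(2022, 7, 15)),
    ("D", 99.1, False, date(2022, 9, 1)),
]

high = [r for r in records if r[1] >= 95]                 # 95th percentile and above
previously = [r for r in high if r[2]]                    # already known to the partnership
new = [r for r in high if not r[2] and r[3] is not None]  # referred only after the snapshot

def months_after(d):
    """Whole months elapsed between the snapshot and a referral date."""
    return (d.year - snapshot.year) * 12 + (d.month - snapshot.month)

within_3 = [r for r in new if months_after(r[3]) <= 3]
within_6 = [r for r in new if 3 < months_after(r[3]) <= 6]
later    = [r for r in new if months_after(r[3]) > 6]

print(len(previously), len(within_3), len(within_6), len(later))
```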

Validation work with analysts revealed many occurrences of petty crime across these 16 cases. In isolation, each incident paints a picture of a problematic child; considered in aggregate through CESIUM’s AI, however, they paint a picture of a child who is vulnerable to exploitation.

What our co-design partner says

Jonathan McAdam, Chief Superintendent – Lincolnshire Police

We have been on a fascinating co-design journey with Trilateral Research over the last three years to develop CESIUM.  There have been significant benefits of using a research and development company committed to working with a public sector service such as Lincolnshire Police.  With a shared vision of safeguarding vulnerable children from harm, this has been an exciting journey, with the findings now demonstrating how a platform such as CESIUM can maximise our operational capability as a single agency and in a partnership setting.  We now look forward to the next steps of further integrating CESIUM into our operational processes to fully realise the potential of the system and to enhance our safeguarding arrangements. 

To find out more, please contact Stephen Anning, CESIUM Product Manager, for further information and a demonstration.
