Falling through the cracks: The human cost of information sharing failures in multi-agency safeguarding partnerships


Author: Amelia Williams | Research Communications Officer

Date: 1 July 2024

An estimated 5 million children between the ages of 11 and 17 are at risk of exploitation in England and Wales, according to a recent report by Barnardo’s. The new estimate comes after a series of sexual abuse scandals exposed the existence of paedophile rings. Each scandal, such as those in Rotherham, Newcastle, and Huddersfield, revealed the same worrying pattern: despite the presence of police, social workers, and other child safeguarding agencies, the crimes continued uninterrupted for decades, amassing thousands of child victims.

The UK’s current safeguarding system

The UK has a number of multi-agency safeguarding partnerships, designed to protect vulnerable children from sexual abuse and exploitation. They involve a range of organisations such as police, welfare services, care workers, schools, NGOs, healthcare professionals, and other community groups. However, they are often hindered by failures in information sharing. Specifically, these partnerships lack uniform policies explaining how information should be shared. Instead, they are driven by informal sharing that relies on the discretion of individuals to assess evidence and respond proactively.

Time and again, informal data sharing has failed. Poor inter-agency communication has led to incidents in which multiple agencies had access to concerning information and each assumed another would follow up, resulting in inaction. As a result, police investigations are weakened and vulnerable to bias and victim blaming. In contrast, effective multi-team collaboration consistently yields positive results. It ensures that the interdisciplinary expertise of social workers and other officials helps the police recognise troublesome adolescent behaviour as a possible sign of exploitation and abuse. The clear difference between these approaches highlights the urgent need for improved information sharing practices in child safeguarding.

New information sharing guidance

In April 2024, the UK government released updated advice titled “Information Sharing Advice for practitioners providing safeguarding services for children, young people, parents and carers.” The guidance states that data protection legislation does not prevent information sharing for safeguarding purposes when necessary, proportionate, and justified. It outlines seven rules, including prioritising child protection over privacy concerns, engaging with children and carers when safe, and seeking prompt advice when uncertain. The guidance also clarifies that consent is not required when a child is at risk, and safeguarding takes precedence over confidentiality. However, the effectiveness of these guidelines remains to be seen. As the following case studies from the 2010s demonstrate, bureaucratic failures have long undermined the efficacy of safeguarding partnerships.

The human cost of bureaucratic failures

The bureaucratic failures undermining safeguarding partnerships may sound mundane, but they carry a significant human cost: hundreds, and in some cases thousands, of child victims. While the following case studies come from the 2010s, recent reports and warnings from the Independent Inquiry into Child Sexual Abuse suggest that the same mistakes are occurring today.

The Rochdale Sex Abuse Scandal

The Rochdale scandal refers to a child sex abuse ring that resulted in 28 convictions and at least 47 victims, although an independent inquiry suggests the true number is likely much higher. Despite the existence of a multi-agency partnership, the crimes went undetected for years, even though victims were known to social services, regularly visited A&E with injuries consistent with sexual assault, went missing, and exhibited worrying behaviour in school.

A Serious Case Review found that the lack of inter-agency information sharing was a major factor preventing intervention, with agencies failing to share information in a timely manner or work together effectively. Examples of information sharing failures included GPs not informing Children’s Social Care of mental health referrals, healthcare practitioners not being informed of or invited to reviews of children’s care, and social workers not returning calls, even when urgent. Instead of proactive information sharing, investigators relied on victims coming forward as witnesses, demonstrating an ignorance of the dynamics of grooming, intimidation, and post-traumatic stress.

Child Sexual Exploitation in Oxfordshire

The Oxfordshire child sex abuse ring, operational between 1998 and 2012, resulted in the convictions of 22 men for sexual crimes against over 300 identified victims aged 11 to 15. Similar to the Rochdale incident, a Serious Case Review found that the police relied on victims coming forward rather than facilitating a proactive and collaborative response between agencies.

The Review identified information sharing failures both between and within different agencies, where information from frontline service providers didn’t reach higher levels of management. The report stated that from 2005 to 2010 there was sufficient information to warrant a more rigorous response, but it did not happen. This failure to share information prevented investigators from grasping the scale of the abuse, contributing to the high number of victims.

Telford Child Sexual Exploitation

Since 2012, 10 people have been convicted for sexual crimes against minors in the Telford area. A recent independent inquiry estimated that more than 1,000 girls were abused over a 40-year period, significantly higher than the initial estimate. The Inquiry found evidence of poor information sharing practices, particularly between Social Care and other agencies, describing the multi-agency partnership as hampered by a “silo culture.”

While information sharing was effective between certain agencies, others were slow to share due to profession-specific concerns. Police, social workers, and GPs all experienced different difficulties and confusion about what they were legally permitted to share. The Inquiry suggests that rather than a complete data sharing failure, the Telford case highlights the need for extra support to help certain professions share data confidently.

Our solutions: Honeycomb and CESIUM

These cases demonstrate that child safeguarding has been persistently undermined by inadequate information sharing practices. The team at Trilateral Research are addressing information sharing challenges through the development of two ethical AI tools, CESIUM and Honeycomb, both of which are underpinned by our in-house anonymisation tool MASC.

CESIUM was co-created with Lincolnshire Police to enhance the efficiency of child safeguarding procedures. Presently, to identify children at risk, stakeholders must manually sift through and consolidate data from diverse databases, inform other agencies of their findings via email, and then coordinate multi-agency meetings to determine the next steps. CESIUM streamlines these processes. The tool applies AI and data analytics to extract data from across different agencies’ databases, employing flagging mechanisms and natural language processing (NLP) to analyse the information and aid risk identification.
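CESIUM’s internals are not public, so as a rough illustration of the general approach, the sketch below shows one common pattern for flagging risk indicators across consolidated case notes from multiple agencies. All class names, indicator terms, weights, and thresholds are illustrative assumptions, not details of CESIUM itself.

```python
from dataclasses import dataclass

# Risk indicators a safeguarding tool might weight (illustrative only).
RISK_INDICATORS = {
    "missing from home": 3,
    "a&e attendance": 2,
    "older male associate": 3,
    "school absence": 1,
}

@dataclass
class CaseRecord:
    child_id: str
    agency: str  # e.g. "police", "health", "education"
    note: str    # free-text case note

def flag_records(records, threshold=4):
    """Consolidate notes per child and score them against the indicators."""
    scores = {}
    for rec in records:
        text = rec.note.lower()
        for indicator, weight in RISK_INDICATORS.items():
            if indicator in text:
                scores[rec.child_id] = scores.get(rec.child_id, 0) + weight
    # Children whose combined cross-agency score crosses the threshold are
    # surfaced for human review -- the tool aids, not replaces, judgement.
    return {cid: s for cid, s in scores.items() if s >= threshold}

records = [
    CaseRecord("C1", "police", "Reported missing from home twice this month"),
    CaseRecord("C1", "health", "A&E attendance with unexplained injuries"),
    CaseRecord("C2", "education", "Occasional school absence"),
]
print(flag_records(records))  # only C1 crosses the threshold
```

The key point the sketch captures is that no single agency’s record is alarming on its own; the risk signal only emerges once records from police, health, and education are combined.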

Without CESIUM, the risk assessment process takes five days on average. With CESIUM, the same assessments take twenty minutes, reducing pre-screening time by 80% and leading to earlier safeguarding referrals in 34% of cases.

Honeycomb was designed in collaboration with the Greater Manchester Police and Greater Manchester Combined Authority to understand and address modern slavery and human trafficking. Among other things, the ethical AI application:

  • Uses our product MASC to anonymise data to facilitate secure sharing between agencies,
  • Creates a problem profile of modern slavery and human trafficking,
  • Uses survivors’ stories to generate visualisations of trafficking routes across countries and continents, and
  • Uses survivors’ stories to identify emerging trends in survivors’ needs.
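MASC’s internals are likewise not public, but the first bullet above, anonymising data so agencies can share it securely, follows a well-established pattern: replacing direct identifiers with a keyed hash so that two agencies can link records about the same person without exchanging names. The sketch below is a minimal, hypothetical illustration of that pattern; the field names and shared key are assumptions, not MASC’s actual design.

```python
import hashlib
import hmac

SHARED_KEY = b"agreed-between-agencies"  # in practice, managed securely

def pseudonymise(record, identifier_fields=("name", "address")):
    """Replace direct identifiers with a stable keyed hash (pseudonym)."""
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SHARED_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # truncated for readability
    return out

police_view = pseudonymise({"name": "Jane Doe", "address": "1 High St", "risk": "high"})
health_view = pseudonymise({"name": "Jane Doe", "address": "1 High St", "referral": "A&E"})

# The same person hashes to the same token in both views, so records can be
# linked for analysis without either agency seeing the other's raw identifiers.
assert police_view["name"] == health_view["name"]
```

Because the hash is keyed and stable, linkage works across agencies that hold the shared key, while the raw identifiers never leave their home systems.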

Both Honeycomb and CESIUM can be applied to the prevention of and intervention in sexual abuse cases like the ones mentioned above, as well as coercion by criminal gangs, forced labour, and other forms of exploitation.

At Trilateral Research, we believe innovative technology can and should be deployed for the public good, alongside stringent ethical, legal, and privacy safeguards. That’s why we’re working to harness the power of AI to protect some of society’s most vulnerable groups. For more information on Trilateral’s Programme Against Exploitation and Violence, please contact Dr Julia Muraszkiewicz.

 
