
The Interplay of Explicit and Tacit Knowledge with Automated Systems for Safeguarding Children

The UK’s Digital Policing Strategy prioritises “the power of digital technologies and behaviours to identify the risk of harm and protect the vulnerable in the physical and digital world”. In support of this priority, Trilateral Research is collaborating with Lincolnshire Police and Lincolnshire Safeguarding Children Partnership (LSCP) to co-design and develop CESIUM, a data analytics and AI platform for safeguarding children who may be vulnerable to exploitation. In co-developing the automated aspects of CESIUM, we have taken a sociotechnical systems design approach to consider the interaction of human factors with technical systems. This approach has revealed that while AI and automation hold much promise, they also raise human-centred considerations. As this blog post will explain, recognising the interaction of explicit and tacit knowledge in sociotechnical system design is essential for the successful adoption of digital technologies in policing.

Trilateral’s background is in understanding the ethical considerations of emerging technologies; accordingly, accuracy, fairness and transparency have been at the forefront of CESIUM’s co-design and development. As the Digital Policing Strategy notes, digital technologies promise to “quickly support decision making with insights drawn from vast quantities of information”. Correspondingly, CESIUM will provide a platform for the multi-agency sharing and automated processing of information to support LSCP’s mission to tackle child exploitation. In automating the processing of data at scale, we also recognise the accompanying ethical considerations. Accuracy concerns algorithmic fitting and forecasting errors; fairness involves treating similarly situated groups similarly; and transparency refers to the accessibility of AI technology to stakeholders (see the EU High-Level Expert Group on AI guidelines or the Alan Turing Institute’s guidance on using AI in the public sector for usable definitions of these principles).

Explicit and tacit knowledge

Highlighting the difference between explicit and tacit knowledge provides a way to understand the human-centred considerations of sociotechnical systems like CESIUM and to facilitate their successful adoption by police. “Explicit knowledge refers to any form of written documentation that makes knowledge ‘explicit’ and hence available”. Professional development courses and intelligence databases contain explicit knowledge of child safeguarding. “Tacit knowledge, on the other hand, is ‘implicit’ knowledge that is gained through individual experience and action”. This type of knowledge is the gut feeling or instinct that is difficult to make explicit in a training course or database. In safeguarding, tacit knowledge is a “sixth sense” that something may be untoward. Effective analysis of a particular problem combines explicit and tacit knowledge, which in sociotechnical terms means combining the outputs of technical systems with professional judgement.

CESIUM’s explicit knowledge lies in the fusion of police and local authority data to support Multi-Agency Child Exploitation (MACE) meetings. Comprising Lincolnshire’s police force, health agencies and county council, “the weekly MACE meeting draws together information from different partner agencies to explore the options available for tackling child exploitation in the community”. Within a rigorous data governance regime, CESIUM presently draws together information by fusing police data, with plans in place to add local authority data. The platform then displays a timeline of a child’s case history and a network graph of their associates to identify potential threats. The algorithms additionally use past MACE decisions to risk-assess and prioritise children who are potentially vulnerable to exploitation. This automated processing of data at scale saves MACE workers many hours of manual processing and helps identify children they may otherwise have missed.
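To make the idea of associate-network risk prioritisation concrete, the toy sketch below ranks children by how many of their direct associates appear in past confirmed decisions. This is purely illustrative: the data, names and scoring rule are invented for this post and do not represent CESIUM’s actual algorithms or data.

```python
# Illustrative sketch only: a toy version of associate-network risk
# flagging. All names, data and scoring rules here are invented and
# are NOT CESIUM's actual method.

from collections import defaultdict

def build_graph(edges):
    """Build an undirected adjacency map from (person, associate) pairs."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def risk_scores(graph, confirmed):
    """Score each person by how many direct associates appear in
    past confirmed decisions (a deliberately crude proxy)."""
    return {
        person: sum(1 for assoc in neighbours if assoc in confirmed)
        for person, neighbours in graph.items()
    }

def prioritise(graph, confirmed):
    """Return people ranked for human review, highest score first."""
    scores = risk_scores(graph, confirmed)
    return sorted(scores, key=lambda p: scores[p], reverse=True)

# Toy example: "C" has two associates flagged in past decisions.
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("C", "E")]
confirmed = {"D", "E"}
print(prioritise(build_graph(edges), confirmed))
```

Even in this toy form, the output is only a prioritised list for review; as the post emphasises, any such ranking is an input to professional judgement, not a decision.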

The experience of police officers and local authority caseworkers represents the tacit knowledge CESIUM seeks to augment. To combine this tacit knowledge with CESIUM’s explicit knowledge, Trilateral Research provides training that gives users the digital literacy to question the accuracy and fairness of CESIUM’s outputs, so that the explicit knowledge in CESIUM may enhance the tacit knowledge of professional judgement. We promote transparency through explainability: every CESIUM prediction carries an accompanying rationale that explains the algorithm’s inner workings. These rationales may then be accepted, rejected or modified as part of MACE decision-making. By capturing and processing the explicit knowledge contained within police and local authority databases, CESIUM augments the tacit knowledge of caseworkers who have vast experience in safeguarding children. This human-centred approach is essential for the successful adoption by police of CESIUM, or of any automated AI system.
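The accept/reject/modify review step described above can be sketched as a simple data structure. This is a hypothetical illustration: the class, field names and workflow are invented for this post, not taken from CESIUM.

```python
# Hypothetical sketch of the accept/reject/modify review step.
# Structure and field names are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Prediction:
    child_id: str
    risk_level: str
    rationale: str              # explanation accompanying the output
    decision: str = "pending"   # pending | accepted | rejected | modified
    notes: str = ""

def review(pred, decision, notes=""):
    """Record a MACE worker's professional judgement on a prediction."""
    assert decision in {"accepted", "rejected", "modified"}
    pred.decision = decision
    pred.notes = notes
    return pred

p = Prediction("child-001", "high",
               "Two direct associates appear in past confirmed decisions.")
review(p, "modified", "Known protective factors; downgrade to medium.")
print(p.decision)  # prints "modified"
```

The design point is that the algorithm’s rationale and the human decision travel together: the record preserves both the explicit output and the tacit judgement applied to it.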

CESIUM comprises cutting-edge technologies, but it remains a sociotechnical system: the professional instinct of LSCP’s members stays central to CESIUM and to protecting children.

For further information about CESIUM, please visit our website or get in touch (sociotech@trilateralresearch.com). You can also join our upcoming workshop at the Web Science conference on 26–29 June 2022.

Stephen Anning

Product Manager
