An inadvertent consequence of the lockdown restrictions was the surge of people moving into the online realm, from diverse backgrounds and with varying levels of technical know-how. This created a perfect storm of uncertainty and confusion surrounding online interactions. While some parents have shifted to working from home, children are experiencing more unsupervised access to the internet, which – as argued by EUROPOL – potentially results in greater exposure to offenders and a higher risk of becoming lonely and isolated.
This sentiment is echoed by the Internet Watch Foundation (IWF), which contends that there has been a considerable increase in child sexual imagery since the beginning of the pandemic lockdown. The IWF explains that nearly half of this imagery is “self-generated,” often created using webcams after children fall victim to extortive and deceptive tactics, or to grooming by someone they believe they are in an online relationship with. Such material is then frequently used to blackmail victims and demand that more extreme activities be performed. The IWF holds that these practices usually happen while parents or guardians are close by, most likely in another room.
Dangers of disinhibition
The use of webcams and social platforms is not a new trend, though the platforms themselves change quickly and often. When children and teens engage in these activities, a perceived distancing from reality can form, known as the online-disinhibition effect: a sense of safety felt when communicating online. It is mostly expressed as a lack of restraint in cyberspace compared with in-person communication, and is often a product of both the anonymity and the perceived invulnerability of sitting behind a screen. This separation from the offline self can even convince people that their online behaviours are not really theirs at all; the online self becomes a compartmentalised self, more prone to deviant actions and dissociated from consequences.
In an attempt to tackle the growing threat of child sexual exploitation online, we have seen a variety of responses from state authorities. Ylva Johansson, the European Commissioner for Home Affairs, is currently proposing that emergency legislation be introduced to curb the threat and to bypass current privacy laws, allowing internet companies to submit higher volumes of voluntary reports. This concern has developed because police in Europe depend largely on voluntary reports submitted via the National Center for Missing & Exploited Children (NCMEC) in the United States of America.
The Irish Garda Commissioner Drew Harris has reported an increase in reports from the NCMEC last year. This increase can perhaps be attributed to the pandemic lockdown and heightened internet activity and engagement, but it could also reflect technological advancement within the field.
The European Electronic Communications Code (EECC), which requires a review of the telecommunications sector's legislative framework, is being cited to explain the decline in voluntary reporting, as it created legal uncertainty for companies wishing to report.
Ylva Johansson suggests giving EUROPOL increased powers and creating conditions under which companies can more freely report potential instances of grooming and sexual abuse. As grand a gesture as this may appear, implementing these powers under the guise of concern for children's safety may have far-reaching unforeseen effects. Such emergency legislative bypasses carry an unrealised and concerning potential for mission-creep, spreading to less noble causes and infringing on data protection rights.
Dark Web Marketplaces
Where images and videos are distributed for profit, dark web marketplaces host the majority of illicit and sexualised content. The world's largest illegal market, DarkMarket, was recently taken down in a joint international effort – a substantial blow that slows the selling and sharing of child sexual content, among other illicit goods.
Other omnibus or ‘high street’ markets, much like DarkMarket, are well known to Law Enforcement Agencies, yet the exchange of clandestine commodities is not always so easily found or removed, despite rigorous efforts. Vendors can circumvent these controls rather easily by obfuscating descriptions and images and by using codes.
The path the EU, and LEAs more generally, appear to be travelling down is one of management and reactive tactics. This is undeniably beneficial for removing child sexual material online, yet little is being done to introduce preventative measures, such as promoting insight into the dangers of online social engagement and the risks inherent in communicating online.
The European Union Agency for Law Enforcement Training (CEPOL) recently hosted a research and science conference on the pandemic's effects on law enforcement training and practice, where Trilateral Research highlighted how prioritising preventative measures – rather than relying on reporting after instances of potential exploitation have occurred – would yield greater results in reducing child sexual exploitation online.
Also raised was the concept of developing a comprehensive source of knowledge and raising public awareness, which would create an inherent early-detection capability among the general public through education, helping to avert instances of exploitation before they occur. The potential benefits of such an approach are considerable, and it does not require loosening the regulations governing our information or relying on the generosity of companies to voluntarily report instances of sexual exploitation.
As part of Trilateral's Sociotech for Good technology development, we have built the CESIUM application with safeguarding professionals, focusing on informing their work in combating child exploitation and enhancing preventative measures.
For more information, please contact our team.