According to some reports, the pandemic and its associated lockdowns triggered a ‘tsunami’ of child exploitation image sharing: UK authorities saw a 1,500% increase in reports of online child sexual abuse material, while Australia recorded a 129% spike.
While the rise in reporting helps identify offenders, police are struggling to keep pace with the resulting investigations. Officers must analyze every device belonging to a suspect, which may mean cataloguing and assessing thousands of photos to determine whether they are perfectly innocent – or evidence of child sexual exploitation. This process can take days or weeks – and there simply aren’t enough expert resources available.
Applying artificial intelligence to the fight against crime
Realizing they needed smarter tools to deal with the huge volume of files and images, the Australian federal police have turned to artificial intelligence (AI). AI is already widely used to identify patterns and details in pictures – automated facial recognition, for example. Similar technology could help simplify the search for suspicious materials, weeding out perfectly harmless photos and flagging those that need to be reviewed by an officer.
Before an AI algorithm can perform any kind of pattern matching, it must first be trained. In the case of an image matching algorithm, the system needs to be supplied with thousands of photographs that meet a specific criterion.
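As an illustration of that training pattern – and emphatically not the AiLecs system itself – a supervised image classifier is built by feeding the algorithm many labelled examples and then asking it to score unseen inputs. The sketch below uses scikit-learn, with synthetic feature vectors standing in for real photographs; the dimensions, labelling rule, and model choice are all assumptions for demonstration:

```python
# Illustrative sketch only: synthetic vectors stand in for image features.
# This is NOT the AiLecs system, just the general supervised-training pattern.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each "photo" has been reduced to a 64-dimensional feature vector.
# Class 1 = matches the target criterion, class 0 = does not.
n = 2000
features = rng.normal(size=(n, 64))
labels = (features[:, :8].sum(axis=1) > 0).astype(int)  # arbitrary rule

# Hold out a quarter of the examples to check how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the held-out test set is the same in any such system: training data teaches the pattern, but only performance on images the model has never seen indicates whether it has genuinely learned the criterion.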
The system being developed by the Australian police also needs to be trained – and the force has enlisted the help of the public. As part of the ‘My Picture Matters’ campaign, adult Australian citizens are being asked to contribute 100,000 photographs of themselves as ‘happy children’. The subject of the picture is unimportant – birthday, graduation, sports day and so on – it just has to show a normal child doing normal activities.
The new system, called AiLecs, will then be trained on these pictures so the algorithm learns what a ‘normal’ child image looks like. Once training is complete, AiLecs should be able to analyze the contents of a suspect’s hard drive automatically; images of ‘happy’ children will (at least initially) be ignored, while anything else will be flagged as suspicious.
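The triage behaviour described above – set aside images the model confidently recognises as ordinary, keep everything else for human review – amounts to a simple threshold filter. The function, threshold, and file names below are hypothetical, purely to show the shape of the step:

```python
# Hypothetical triage helper. 'scores' maps a file path to the model's
# confidence that the image shows an ordinary, harmless scene.
def triage(scores: dict[str, float], ignore_above: float = 0.95) -> list[str]:
    """Return the file paths that still need review by a human officer."""
    return [path for path, score in scores.items() if score <= ignore_above]

example = {
    "img_001.jpg": 0.99,  # confidently ordinary -> set aside
    "img_002.jpg": 0.42,  # uncertain -> flagged for review
    "img_003.jpg": 0.10,  # very unlike the training set -> flagged
}
print(triage(example))  # → ['img_002.jpg', 'img_003.jpg']
```

The design point is the asymmetry of the errors: a false flag only costs officer time, whereas wrongly ignoring an image could let evidence slip through, so the ignore threshold would be set conservatively.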
By speeding up the forensic search, AiLecs will help bring child sex abusers to justice more quickly – or clear innocent suspects faster. Once operational, it is hoped that AiLecs can be shared with law enforcement agencies around the world that face similar time and resource shortages in their fight against child exploitation.
AI will not solve the problem of forensic analysis on its own because a human police officer still needs to verify the results generated. But if the algorithm works as expected, AiLecs should be a powerful tool to help close more investigations more quickly.