My Pictures Matter

7 June 2022

Monash University experts are calling for people to contribute to a world-first, ethically sourced and managed image bank for research to combat child exploitation.

The project is an initiative of the AiLECS Lab – a collaboration between Monash University’s Faculty of Information Technology and the Australian Federal Police – which develops artificial intelligence (AI) technologies that aid law enforcement and enhance community safety.

Launched today, the My Pictures Matter crowdsourcing campaign asks people aged 18 and above to contribute photographs of themselves as children. These pictures will be used to train AI models to recognise the presence of children in ‘safe’ situations, to help identify ‘unsafe’ situations and potentially flag child exploitation material.

AiLECS Lab Co-Director Associate Professor Campbell Wilson said machine learning models that are trained on images of people are often fed images scraped from the internet, frequently without documented consent for their use.

“To develop AI that can identify exploitative images, we need a very large number of children’s photographs in everyday ‘safe’ contexts that can train and evaluate the AI models intended to combat child exploitation,” Associate Professor Wilson said.

“But sourcing these images from the internet is problematic when there is no way of knowing if the children in those pictures have actually consented for their photographs to be uploaded or used for research.

“By obtaining photographs from adults, through informed consent, we are trying to build technologies that are ethically accountable and transparent.”

The researchers have developed comprehensive strategies for the safe storage and use of this data to preserve the privacy of those depicted, and processes for ensuring ongoing management of consent.

Project lead and data ethics expert Dr Nina Lewis said the principles of data minimisation have been applied to maintain privacy around the images submitted.

“We are not collecting any personal information from contributors other than the email addresses associated with consent for research use, and these email IDs will be stored separately from the images,” Dr Lewis said.

“The images and related data will not include any identifying information, ensuring that images used by researchers cannot reveal any personal information about the people who are depicted.

“People who have contributed photos will also be able to get details and updates about each stage of the research, if they choose to be contacted about the project. And they can even opt to change use permissions or revoke their research consent and withdraw images from the database at a later date.

“While we are creating AI for social good, it is also very important to us that the processes and methods we are using are not sitting behind an impermeable wall – we want people to be able to make informed choices about how their data is used, and to have visibility over how we are using their data and what we are using it for.”
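To make the data-minimisation idea concrete, the sketch below is a purely illustrative example and not the AiLECS Lab’s actual system: it assumes (hypothetically) that consent records containing the email address and the de-identified image records are kept in separate stores, linked only by an opaque contribution ID, so the image data alone reveals nothing about the contributor and consent can later be revoked.

```python
# Illustrative sketch only: one possible data-minimisation scheme of the kind
# described above. This is NOT the AiLECS Lab's implementation; all names and
# storage choices here are hypothetical.

import sqlite3
import uuid

# Two separate databases: one holds consent (with the email address),
# the other holds image records with no personal information at all.
consent_db = sqlite3.connect("consent.db")
images_db = sqlite3.connect("images.db")

consent_db.execute(
    "CREATE TABLE IF NOT EXISTS consent ("
    " contribution_id TEXT PRIMARY KEY,"
    " email TEXT NOT NULL,"
    " research_use_permitted INTEGER NOT NULL)"
)
images_db.execute(
    "CREATE TABLE IF NOT EXISTS images ("
    " contribution_id TEXT NOT NULL,"
    " image_path TEXT NOT NULL)"
)


def contribute(email: str, image_paths: list[str]) -> str:
    """Record consent in one store and the de-identified images in another."""
    contribution_id = uuid.uuid4().hex  # opaque link between the two stores
    consent_db.execute(
        "INSERT INTO consent VALUES (?, ?, 1)", (contribution_id, email)
    )
    images_db.executemany(
        "INSERT INTO images VALUES (?, ?)",
        [(contribution_id, path) for path in image_paths],
    )
    consent_db.commit()
    images_db.commit()
    return contribution_id


def revoke(email: str) -> None:
    """Withdraw consent: delete the contributor's images and consent record."""
    rows = consent_db.execute(
        "SELECT contribution_id FROM consent WHERE email = ?", (email,)
    ).fetchall()
    for (contribution_id,) in rows:
        images_db.execute(
            "DELETE FROM images WHERE contribution_id = ?", (contribution_id,)
        )
        consent_db.execute(
            "DELETE FROM consent WHERE contribution_id = ?", (contribution_id,)
        )
    consent_db.commit()
    images_db.commit()


if __name__ == "__main__":
    cid = contribute("contributor@example.com", ["photo_001.jpg"])
    print("stored contribution", cid)
    revoke("contributor@example.com")
    print("consent revoked and images withdrawn")
```

In a scheme like this, researchers working with the image store never see an email address, while the consent store can still be used to honour a later request to change permissions or withdraw images.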

Machine learning tools developed with the help of the My Pictures Matter campaign will support a major ongoing AiLECS initiative to counter online child exploitation through technologies designed to: better protect survivors of abuse from ongoing harm; assist in identifying and prosecuting perpetrators; and minimise Australian Federal Police and other law enforcement officers’ repeated exposure to highly distressing content.

AiLECS Lab co-director and AFP Leading Senior Constable Dr Janis Dalins said the ultimate goal for developing AI techniques to detect and triage child sexual abuse material was to more rapidly identify victims and material not previously seen by law enforcement.

“This will enable police to intervene faster to remove children from harm, stop perpetrators and better protect the community,” Dr Dalins said.

“In 2021, the AFP-led Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child exploitation and each report can contain large volumes of images and videos of children being sexually assaulted or exploited for the gratification of offenders.

“Reviewing this horrific material can be a slow process and the constant exposure can cause significant psychological distress to investigators. AiLECS Lab’s initiatives will support police officers and the children we are trying to protect, and the researchers have found an innovative way to ethically develop the technology behind such initiatives.”

Through the My Pictures Matter campaign, the researchers aim to build a database of at least 100,000 ethically sourced images by the end of 2022 – large enough to train the AI models.

AiLECS Lab’s research is funded by Monash University, the Australian Federal Police and the Westpac Safer Children Safer Communities grant program.

To learn more about the My Pictures Matter campaign and submit images, please visit: https://mypicturesmatter.org/