Perceptual Hashing

Status: Ongoing

Effective automatic detection of Child Sexual Abuse Material (CSAM) is a continuing challenge, including the rapid detection of previously seen material. Developing efficient algorithms to identify, retrieve, and classify this material is a key activity in the technological countering of online child exploitation. Algorithms deployed for this purpose require robust similarity metrics that can match perceptually similar, altered images and videos to their original versions; such metrics facilitate both the detection and the classification of CSAM.
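As a concrete, if toy, illustration of such a similarity metric, the sketch below implements the simple difference hash (dHash) in pure Python and compares hashes by Hamming distance. The image data and sizes are invented for the example; production systems such as PhotoDNA, PDQ, or NeuralHash are far more elaborate, but they rest on the same principle of a compact fingerprint that survives small perceptual alterations.

```python
def dhash(pixels):
    """Row-wise difference hash of a small grayscale image.

    `pixels` is a 2D list of intensities; a real pipeline would first
    decode and resample the image to a fixed size such as 9x8, giving
    a 64-bit hash.
    """
    return [1 if left < right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]


def hamming(a, b):
    """Count of differing bits; a small distance suggests similarity."""
    return sum(x != y for x, y in zip(a, b))


# Toy 9x8 "images": the second is a uniformly brightened copy of the
# first, the third has a different gradient structure.
original   = [[(7 * r + 13 * c) % 256 for c in range(9)] for r in range(8)]
brightened = [[min(255, p + 5) for p in row] for row in original]
unrelated  = [[(255 - 11 * r - 17 * c) % 256 for c in range(9)] for r in range(8)]

h0, h1, h2 = dhash(original), dhash(brightened), dhash(unrelated)
print(hamming(h0, h1))  # 0: the brightened copy hashes identically
print(hamming(h0, h2))  # 64: every bit differs for the unrelated image
```

The key property on display is that an exact cryptographic hash would treat the brightened copy as a completely different file, while the perceptual hash keeps its distance near zero.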

Various approaches, such as Apple’s NeuralHash, Microsoft’s PhotoDNA, and Facebook’s PDQ, are used to detect and prevent the distribution of child exploitation material. These approaches differ considerably in how they extract effective features (perceptual hashes) for matching images and videos. This project aims to analyse these algorithms and evaluate their performance on ethically curated datasets, in addition to exploring new concepts and approaches for achieving effective feature-mapping techniques.
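To make the matching step concrete, the hypothetical sketch below scans a small database of known 64-bit perceptual hashes using a Hamming-distance threshold. All names, hash values, and the threshold here are illustrative; real deployments match against very large hash sets using indexed nearest-neighbour search rather than a linear scan.

```python
def hamming(a, b):
    """Bitwise distance between two equal-length integer hashes."""
    return bin(a ^ b).count("1")


def find_matches(query_hash, database, threshold=8):
    """Return (identifier, distance) pairs within the distance threshold."""
    return [(name, hamming(query_hash, h))
            for name, h in database
            if hamming(query_hash, h) <= threshold]


# Toy 64-bit hashes; the second entry differs from the query in 3 bits,
# as a slightly altered copy of the same image might.
database = [
    ("image_a", 0x0F0F0F0F0F0F0F0F),
    ("image_b", 0xFFFF00000000FFFF),
]
query = 0xFFFF00000000FFF8  # last nibble 0xF -> 0x8 flips 3 bits
print(find_matches(query, database))  # [('image_b', 3)]
```

The threshold trades recall against false positives: too tight and trivially altered copies slip through, too loose and unrelated images collide, which is exactly the behaviour an evaluation on curated datasets would quantify.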