In a double whammy that should make both child abusers and connoisseurs of pornography tremble in their sweaty seats,
Microsoft and researchers from a South Korean university have today announced two new technologies that can be used to auto-detect pornographic images and movies. Microsoft is first up with PhotoDNA, which, in partnership with the National Center for Missing and Exploited Children, will be used to scan
Facebook, Bing, and SkyDrive for child porn. If inappropriate images are found, they are immediately deleted, and the uploader will be reported to the authorities. Meanwhile, at the Korea Advanced Institute of Science and Technology (KAIST), two researchers have used the Radon transform signal processing technique to detect a “sexual scream or moan” in digital video files.
The automatic detection and removal of child porn, in itself, is of huge importance — but the inclusion of Facebook, the largest photo sharing service in the world, is gargantuan. Image matching technologies are not new, but PhotoDNA can accurately detect images that have been significantly resized or cropped, and through extensive testing it has yet to provide a single false positive result. Facebook will scan every image that’s uploaded and compare it against a database of child porn digital fingerprints, and simply refuse to accept any exploitative images. The only problem with PhotoDNA, though, is that it can’t decide that an image is pornographic; it can only match an image against known samples of child porn.
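Microsoft hasn't published PhotoDNA's internals, but the general idea of a resize-tolerant image fingerprint can be illustrated with a much simpler "difference hash": shrink the image to a fixed, tiny greyscale thumbnail, record whether each pixel is brighter than its right-hand neighbour, and compare the resulting bit strings by Hamming distance. The sketch below (Python with Pillow; the filenames and the distance threshold are illustrative assumptions, not PhotoDNA's actual algorithm) shows how an upload could be checked against a database of known fingerprints.

```python
# A minimal sketch of fingerprint-style image matching. PhotoDNA itself is
# proprietary; this uses a simple "difference hash" (dHash) to illustrate how
# a fingerprint can survive resizing, because the hash is computed on a
# fixed, tiny thumbnail. Requires Pillow (pip install Pillow).
from PIL import Image

def dhash(path, hash_size=8):
    """Return a 64-bit perceptual fingerprint of the image at `path`."""
    # Shrink to (hash_size+1) x hash_size greyscale so the fingerprint is
    # independent of the original resolution.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def matches(fingerprint, database, max_distance=10):
    """True if `fingerprint` is within `max_distance` bits of any known hash."""
    return any(bin(fingerprint ^ known).count("1") <= max_distance
               for known in database)

# Hypothetical usage: refuse an upload if it matches the fingerprint database.
# known_hashes = {dhash(p) for p in ["known_image_1.jpg", "known_image_2.jpg"]}
# if matches(dhash("upload.jpg"), known_hashes):
#     reject_upload_and_report()
```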
The researchers, who obviously watched an awful lot of porn, found that while speech is low-pitched and music is variable, pornographic sounds are generally high-pitched, change tone rapidly, and periodically repeat. Surprisingly, this “moan detector” technology is actually very accurate, successfully detecting 93% of the porn test clips. The technique missed some clips with confusing background tracks, and it also falsely detected some comedy shows as pornography: the sounds of laughter, cheering, and crying have sexual characteristics, apparently.
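The KAIST paper's exact Radon-transform pipeline isn't reproduced in this article, but the cues it describes (high pitch, rapid tone changes, periodic repetition) are straightforward to sketch. The Python snippet below (NumPy/SciPy; the thresholds are made-up assumptions, not the researchers' published values) extracts a rough version of those three features from a mono audio clip.

```python
# A rough sketch of the acoustic cues the KAIST work relies on: how high the
# pitch is, how fast it changes, and whether it repeats periodically. The
# Radon-transform step and the real decision thresholds from the paper are
# not reproduced here; the numbers below are illustrative assumptions only.
import numpy as np
from scipy.signal import spectrogram

def moan_features(samples, sample_rate):
    """Return (mean_pitch_hz, pitch_change_hz, periodicity) for a mono clip."""
    freqs, times, power = spectrogram(samples, fs=sample_rate, nperseg=1024)
    # Track the dominant frequency in each short-time frame.
    pitch_track = freqs[np.argmax(power, axis=0)]
    mean_pitch = float(np.mean(pitch_track))                     # "high-pitched"
    pitch_change = float(np.mean(np.abs(np.diff(pitch_track))))  # "changes tone rapidly"
    # Periodicity: normalized autocorrelation of the pitch track, ignoring lag 0.
    centered = pitch_track - pitch_track.mean()
    energy = float(np.sum(centered ** 2))
    if len(centered) > 1 and energy > 0:
        acorr = np.correlate(centered, centered, mode="full")[len(centered):]
        periodicity = float(acorr.max() / energy)                # "periodically repeat"
    else:
        periodicity = 0.0
    return mean_pitch, pitch_change, periodicity

def looks_pornographic(samples, sample_rate):
    """Crude illustrative classifier; the thresholds are assumptions."""
    mean_pitch, pitch_change, periodicity = moan_features(samples, sample_rate)
    return mean_pitch > 400 and pitch_change > 50 and periodicity > 0.3
```

A classifier this naive would of course trip over exactly the cases the researchers reported: laughter, cheering, and crying also produce high, rapidly changing, repetitive pitch tracks.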
While these technologies sound great in theory, some serious implications arise when you look at the bigger picture. PhotoDNA can be used for any kind of image, including consensual adult pornography — and when combined with audio detection of porn, you would have a tool that could successfully block the digital distribution of every kind of porn. Imagine a hyper-conservative state that decides to issue a blanket ban on any and all pornography. That government could install PhotoDNA and the scream detector on core ISP routers — and if you think you can evade it with encryption or proxying, the government could also demand that Facebook, Flickr, RapidShare, and other digital file lockers install the porn scanner on their servers.