- The Washington Times - Monday, February 8, 2010

Pedophiles can be chased down, prosecuted and put behind bars, but in too many cases, their hideous handiwork still roams the world: The explicit photos they took of babies, toddlers and children being sexually abused remain online, where they can be endlessly traded and collected by other pedophiles.

The victims’ pain at having these images left in cyberspace is inexpressible, said Ernie Allen, president and chief executive of the National Center for Missing and Exploited Children (NCMEC). As one child victim said, “Every time somebody downloads and looks at my picture, it’s like I am being raped all over again.”

This revictimization of children is about to end, thanks to a new technology called PhotoDNA, which was donated in December to NCMEC by Microsoft Corp.

With PhotoDNA, illegal child-pornography images can be broken down to their tiniest pieces, measured and assigned a unique digital “fingerprint.”

Armed with a tell-tale “fingerprint,” law enforcement officials can sift through millions of online images and find copies — even slightly altered ones — of an illegal photo.
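PhotoDNA’s actual algorithm is proprietary, but the general idea behind such robust image “fingerprints” is perceptual hashing: reduce an image to a compact signature that stays nearly identical even when the image is resized, recompressed or slightly edited. The following is a minimal sketch of one simple perceptual hash (a toy “average hash” over a grayscale pixel grid), offered only as an illustration of the concept, not as Microsoft’s method:

```python
# Toy illustration of perceptual image hashing, the general idea behind
# robust image "fingerprints." This is NOT the PhotoDNA algorithm (which
# is proprietary); it is a simplified "average hash" over a grayscale grid.

def average_hash(pixels):
    """Return a bit-string fingerprint: '1' where a pixel is brighter
    than the image's mean brightness, '0' otherwise."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance flags a near-duplicate image."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 2x2 "image" and a slightly altered copy (e.g. recompressed):
original = [[10, 200], [220, 15]]
altered  = [[12, 198], [215, 18]]

h1 = average_hash(original)
h2 = average_hash(altered)
print(h1, h2, hamming_distance(h1, h2))  # identical fingerprints, distance 0
```

Because the fingerprint encodes coarse brightness structure rather than exact bytes, small alterations leave it unchanged, which is what lets investigators match “even slightly altered” copies at scale.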

When the officials find illegal copies, they can report them to the hosting Internet service provider (ISP).

The ISP can quickly “scrub” the images from its sites, said Mr. Allen. At least 68 ISPs have already expressed interest in PhotoDNA, he added.

The process will be “very fast” and “very reliable,” said Hany Farid, a computer science professor at Dartmouth College who helped Microsoft develop PhotoDNA. Its image-detection rate is about 98 percent, and the false-positive rate is “less than one in 1 billion,” Mr. Farid said in a December briefing.

NCMEC officials estimate that they will review 9 million child-pornography images and videos in 2010.

The volume is a result of home-produced child pornography, rather than commercial production, said Mr. Allen.

Law enforcement data on some 2,700 illegal images show that most child sex abusers are family members, family friends or neighbors.

Pedophiles are typically driven to make, share and acquire new images. So when pedophiles are arrested, it’s not uncommon to find that they have hundreds, even thousands, of images of child pornography in their possession — and “many of the same illegal images show up in everyone’s collections,” said John Shehan, director of the NCMEC’s Exploited Children’s Division.

Owing to privacy and free-speech concerns, PhotoDNA will only be used “narrowly and surgically” on the “worst of the worst” images, said Mr. Allen. By definition, this means an image that shows the sexual penetration and/or sexual contact of an identified prepubescent child by an adult using his or her own body, a foreign object or an animal.

“Nobody can suggest [such images are] protected speech, and nobody can suggest that this is [a] violation of legitimate privacy rights,” said Mr. Allen.

PhotoDNA sounds like a “very promising development,” said Pamela Rucker Springs, corporate spokeswoman for AOL Inc. “We will definitely take a hard look at it” and see how it might complement AOL’s existing technology that stops transmission of “bad images,” she said.

“We are excited about the possibilities PhotoDNA offers,” said Brooke Bratton, vice president and corporate counsel of United Online, which is best known for its classmates.com, floral-related services, and NetZero and Juno Internet access and e-mail services.

“Our goal is to help disrupt the spread of known sex-abuse images online,” said Brad Smith, Microsoft senior vice president and general counsel. “We need to bring this issue out of the shadows and get it out into the open.”

Copyright © 2016 The Washington Times, LLC.
