Google technology could save exploited children

This article by Larry Magid originally appeared in the San Jose Mercury News

Listen to Larry’s CBS News Podcast with Google Sr. Scientist Shumeet Baluja and National Center for Missing and Exploited Children President Ernie Allen

Working with law enforcement, analysts at the National Center for Missing and Exploited Children (NCMEC) in Alexandria, Va., spend their days doing something no one should have to do. They look at what is called “child pornography,” but the photos and videos are actually evidence of children – in some cases infants – being sexually abused.

They do this work to help rescue children who may still be in the clutches of a predator, to help catch and prosecute perpetrators and to send a message to would-be child abusers that exploiting children will not be tolerated. The online locations of suspected child pornography are reported to NCMEC via the CyberTipline by Internet service providers and members of the public. Images are submitted by law enforcement agencies across the country.

The work is emotionally draining and challenging. But thanks to some new technology from Google, these analysts now have a tool that could greatly enhance their effectiveness.

A disclosure: I serve as an unpaid member of NCMEC’s board of directors. When it comes to NCMEC’s work, I’m not objective. I passionately share the non-profit organization’s commitment to protecting children. I’m also a strong free-speech advocate, but the images they deal with are not protected by the First Amendment. Production, distribution and possession of “child porn” are illegal in the United States and many other countries.

The concept behind Google’s software is simple, but the implementation took four engineers thousands of hours over the better part of a year, according to Google’s senior research scientist Shumeet Baluja, who is the technical leader of the project.

The software allows an analyst to highlight a pattern somewhere in an image. It could be a calendar on the wall, a logo on a T-shirt, a prominent tattoo or perhaps the pattern of the carpet. The software then looks for that pattern in other images, and when it finds a match or a likely match, it presents those images to the analyst. In some cases it will analyze the entire image to look for matches or near matches. NCMEC President Ernie Allen said the organization reviewed 5 million images and videos in the past year and more than 13 million since 2002.
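To make the idea concrete, here is a minimal sketch of region-based matching using OpenCV’s template matching. It is an illustration only, not Google’s system (which, as described below, also tolerates changes in color, angle and position that simple template matching does not), and the file names, coordinates and threshold are assumed values.

```python
# Simplified illustration of region-based image matching -- NOT Google's actual system.
# Assumes OpenCV (cv2) is installed and that "query.jpg" and "candidate.jpg"
# are hypothetical image files, with the candidate at least as large as the region.
import cv2

# Load the image the analyst is working from and cut out the highlighted region
# (x, y, width, height are assumed values for this sketch).
query = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)
x, y, w, h = 120, 80, 60, 40
template = query[y:y + h, x:x + w]

# Load another image from the collection to check against the highlighted region.
candidate = cv2.imread("candidate.jpg", cv2.IMREAD_GRAYSCALE)

# Slide the template across the candidate image and score every position.
result = cv2.matchTemplate(candidate, template, cv2.TM_CCOEFF_NORMED)
_, max_score, _, max_loc = cv2.minMaxLoc(result)

# Present the candidate to the analyst only if the best score clears a threshold.
if max_score > 0.8:
    print(f"Likely match at {max_loc} (score {max_score:.2f})")
```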

Without this software, the only ways to make a match are to rely on the memory of analysts or to find an exact copy of the image using the file’s “hash mark.” But the hash mark – the digital fingerprint of a file – doesn’t carry over from one version of an image to another. If an image is edited or compressed, for example, the hash mark changes.
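To see why, consider a minimal sketch in Python (an illustration only; “photo.jpg” is a hypothetical file). A cryptographic hash such as SHA-256 identifies an exact sequence of bytes, so altering even a single byte of the file produces a completely different digest, and a database of known hashes no longer recognizes the image.

```python
# Demonstrates why an exact-file hash breaks as soon as an image file is altered.
# "photo.jpg" is a hypothetical file used only for this sketch.
import hashlib

def file_hash(path):
    # Compute the SHA-256 digest of the file's raw bytes.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

original_digest = file_hash("photo.jpg")

# Simulate a trivial edit: copy the file and append a single byte.
with open("photo.jpg", "rb") as f:
    data = f.read()
with open("photo_edited.jpg", "wb") as f:
    f.write(data + b"\x00")

# The two digests share nothing in common, even though the pictures look identical.
print(original_digest)
print(file_hash("photo_edited.jpg"))
```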

We humans may be more perceptive than computers and better able to distinguish similar or unique characteristics. But computers have much better memories. As a NCMEC board member, I have heard amazing stories about analysts and police officers who have matched photos based on characteristics they remember from pictures they may have seen months ago. But I think that they all would appreciate a little help from machines.

Baluja says the technology will work even if the images have been modified: if a photo has been changed from color to black and white, for example, or if the pattern appears at a different angle or position in the photo or video. It can also pick out a single pattern in a video, even if the video is a compilation of many shorter clips.

Google engineers and scientists were able to work on the project using what the company calls “20 percent time.” Google allows all of its employees to dedicate 20 percent of their work time to projects they initiate. Some of those projects benefit Google stockholders, some benefit end-users and some might wind up not benefiting anyone. This project has the potential to benefit thousands of children.

The engineers didn’t have to start from scratch. The technology is an outgrowth of the anti-piracy software Google developed to help its YouTube division ferret out videos suspected of being posted without the permission of copyright holders.

Google representatives are quick to point out that they don’t always take down copyrighted video flagged by software because in some cases there is a legitimate “fair use” case for it being posted. But what I find interesting about this is that a technology developed to protect intellectual property rights could be applied to protect children.

I’m sure that most people share Google’s motto of “don’t be evil.” But there are some people on this planet who are very evil toward children. Let’s hope that the efforts of these Google staffers and the hard-working people at NCMEC result in more of these evil people being sent to a place where they can no longer harm children.

If you come across videos or images of child pornography, don’t save them – that’s against the law. But do report their location to NCMEC’s CyberTipline at www.cybertipline.com.