Thursday 12 March 2009

New Side to Face-Recognition Technology: Identifying Victims

Since Sept. 11, discussion of the disputed technology of face recognition has focused on its potential for identifying criminals and terrorists -- and for invading citizens' privacy. But in England, the police are pursuing a different path: they want to use facial recognition software to identify crime victims.

Using software developed by a Canadian company, Britain's National Crime Squad is creating a database of nearly three million pictures seized in raids of child pornography rings. By matching the images against pictures of missing children, investigators hope to find them, or at least generate clues -- an unusual car or distinctive scenery -- that can help identify the people making the photos and films.

Facial recognition has been in development for decades, but recent advances in computer power and software have made the systems less expensive and more accurate -- though just how accurate remains a subject of debate.

Most systems work by taking pictures of faces, comparing them to a template and making dozens of measurements of each one, including factors like the distance between the eyes. In the case of Imagis Technologies -- the company in Vancouver, British Columbia, that created the software out of earlier work on recognizing patterns in satellite photographs -- the program detects hundreds of ''light source positions.'' It also measures factors like the angle of the head and facial shape, said Andy Amanovich, the company's chief technology officer.
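In rough outline, that measurement step turns a face into a short list of numbers. The sketch below is purely illustrative -- the landmarks, measurements and values are assumptions for the example, not the Imagis method -- but it shows how geometric features such as the distance between the eyes become a numeric description that can be compared.

```python
import math

# Hypothetical landmark coordinates (in pixels) detected on one face image.
# The specific landmarks and measurements are illustrative, not any vendor's method.
landmarks = {
    "left_eye": (112.0, 140.0),
    "right_eye": (168.0, 141.0),
    "nose_tip": (140.0, 180.0),
    "mouth_center": (141.0, 215.0),
}

def distance(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def feature_vector(pts):
    """Turn landmark positions into a small vector of geometric measurements,
    normalized by the inter-eye distance so image scale does not matter."""
    eye_dist = distance(pts["left_eye"], pts["right_eye"])
    return [
        distance(pts["nose_tip"], pts["mouth_center"]) / eye_dist,
        distance(pts["left_eye"], pts["nose_tip"]) / eye_dist,
        distance(pts["right_eye"], pts["nose_tip"]) / eye_dist,
    ]

print(feature_vector(landmarks))
```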

The mathematical description of those features is stored in a database, to be compared with other strings of numbers that have been derived from faces -- and also jewelry, clothes, scars and background objects like furniture or vehicles.
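Comparing one of those stored descriptions against a seized image then amounts to measuring how far apart two strings of numbers lie and returning the closest entries as candidates for a human analyst to review. A minimal sketch of that lookup, assuming a plain Euclidean distance and made-up identifiers and values (real systems use far richer scoring):

```python
import math

def euclidean(a, b):
    """Distance between two feature vectors of equal length."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical database: identifier -> feature vector derived from a known face.
database = {
    "missing_child_042": [1.31, 0.88, 0.90],
    "missing_child_107": [1.10, 0.95, 0.97],
    "missing_child_311": [1.29, 0.86, 0.92],
}

def rank_matches(query, db, top_n=3):
    """Return the closest database entries to the query vector, best first."""
    scored = [(euclidean(query, vec), name) for name, vec in db.items()]
    return sorted(scored)[:top_n]

query_vector = [1.30, 0.87, 0.91]   # vector derived from a seized image
for score, name in rank_matches(query_vector, database):
    print(f"{name}: distance {score:.3f}")
```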

No facial recognition system is perfect, or even close: all make mismatches and overly broad matches. Many can be confounded by simple subterfuges like wigs or glasses. Civil liberties groups and others say the systems cast too wide a net, invading privacy and extending the reach of surveillance too far.

And the technology's credibility has not been helped, many experts agree, by exaggerated claims for its effectiveness. ''These software companies have popped off numbers that they can't really substantiate,'' said Ron Cadle, a vice president of Pellco Inc., which is adapting facial recognition systems for use at Fresno Yosemite International Airport. ''It's kind of given them a black eye.''

Mr. Amanovich agreed. ''There's a lot of false claims out there and a lot of specious claims to what all technologies can do,'' he said.

Nevertheless, Mr. Cadle, who uses recognition programs from Visionics Inc. and Viisage, said his company had boosted the reliability of his partners' software so that it could make a match 80 percent of the time and falsely claim a match with just 1 of every 500 passengers. Mr. Amanovich, however, said such figures were so malleable at this early stage that claims were not useful.
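Taken at face value, those two figures have concrete consequences at a busy checkpoint. The arithmetic below is only an illustration -- the passenger volume is assumed, and it treats each screening as independent -- but it shows why both the match rate and the false-match rate matter:

```python
# Illustrative arithmetic only; the 80 percent and 1-in-500 figures are the
# claims quoted above, and the passenger volume is an assumed example.
detection_rate = 0.80        # chance a person on a watch list is matched
false_match_rate = 1 / 500   # chance an ordinary passenger triggers a match
passengers_per_day = 20_000  # assumed daily traffic for the example

expected_false_alarms = passengers_per_day * false_match_rate
print(f"Expected false alarms per day: {expected_false_alarms:.0f}")       # ~40

# Chance that a watch-listed person walks through twice without being matched:
miss_twice = (1 - detection_rate) ** 2
print(f"Chance of being missed on two separate passes: {miss_twice:.0%}")  # 4%
```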

The British project had its origins in a 1997 sweep in which 101 members of a child pornography trading ring called Wonderland were arrested in raids around the world.

Aficionados of child pornography tend to be obsessive collectors of pictures and films, and that and other raids led to a police database of some three million images -- too many for humans to sort through effectively. (Efforts to create books or CD's by hand had yielded 1,200 identifiable faces, leading to the identification of just 18 children, one of whom had been murdered.) So in December 2000, the squad signed an agreement with a contractor, Serco Group, to automate the rest of the process. Serco turned to Imagis.

Peter Spindler, a detective superintendent with the National Crime Squad, said he had been impressed with early results. The software was able to identify images from a test database -- not just images of children, but also of their siblings. The feature could help identify families participating in the porn trade.

But one expert in child pornography said the British effort was ''not going to do much.''

Dr. John Philip Jenkins, a professor of history at Penn State and author of ''Beyond Tolerance: Child Pornography on the Internet,'' said child pornography photos were unlikely to lead investigators to the children involved. A child victim's identity, he said, ''is only likely to come to light if the child comes up in an abuse case.''

Many of the images, he added, now flow from the former Soviet Union, where lax enforcement has allowed the trade to flourish. There, he said, ''police corruption is going to limit the effectiveness of any attempt to use this technology'' successfully.

He called for international efforts to crush online image trading.

But Detective Spindler said the police had to try to do more than restrict the traffic in illicit images. ''It's not simply about identifying people who are abusing the Internet, people who are trading child pornography,'' he said. ''This is about people abusing children.''

Photo: A detective in London, Peter Spindler, left, says image identification from a test database was impressive. Dave Lutes, chief engineer of Imagis Technologies, demonstrates the program in Victoria with a mock photo. (Jeff Vinick for The New York Times); (Johnathan Player for The New York Times)

Chart: ''How Face-Recognition Technology Works''

Face-recognition technology is increasingly being used in security systems and law-enforcement investigations. Here is one approach, the basis of systems made by several companies in the field.

FIRST LOOK -- The system must decide whether the image before it is a human face. It looks for a pair of eyes and the borders of the face.

RESIZING -- The computer adjusts the contrast and size of the image to make it similar in format to the other faces in its database.

MATH -- The image is now a grid of pixels, each with a "gray scale" value between 0 for black and 255 for white. These can be expressed as numbers and used to process the image mathematically.

COMPARISON -- The face is compared with 128 archetypal faces, or eigenfaces, made from thousands of faces in a database. The new face is described as being similar, by percentages, to the eigenfaces.

RESULT -- The system compares the new face's eigenface against the eigenfaces of all the real people in its database, then displays all the people the new face resembles, in order of similarity.

(Source: ''Face Recognition for Smart Environments,'' Alex Pentland and Tanzeem Choudhury, in the IEEE publication Computer; Jim Wayman, San Jose State University)
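The eigenface approach described in the chart can be sketched with a few lines of linear algebra: training faces are stacked as pixel vectors, their principal components (the eigenfaces) are extracted, and every face is then described by its weights on those components. The sketch below uses random arrays as stand-in images and far fewer than 128 components; it follows the general eigenface recipe rather than any particular vendor's implementation.

```python
import numpy as np

# Stand-in data: 200 "training faces", each a 64x64 grayscale image flattened
# to a vector of pixel values in [0, 255]. A real system would load aligned,
# resized photographs here.
rng = np.random.default_rng(0)
faces = rng.integers(0, 256, size=(200, 64 * 64)).astype(float)

# 1. Subtract the mean face so the components capture variation, not brightness.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# 2. Eigenfaces are the top principal components of the centered images.
#    (The chart's systems use 128; a handful suffices for the sketch.)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:16]                      # each row is one eigenface

# 3. Any face is described by its weights on the eigenfaces.
def describe(face_pixels):
    return eigenfaces @ (face_pixels - mean_face)

# 4. Recognition: compare the new face's weights against stored weights.
gallery = {i: describe(faces[i]) for i in range(200)}
new_face = faces[3] + rng.normal(0, 10, size=faces[3].shape)  # noisy copy
weights = describe(new_face)
best = min(gallery, key=lambda i: np.linalg.norm(gallery[i] - weights))
print("Closest gallery entry:", best)     # expected: 3
```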

By JOHN SCHWARTZ
