In an earlier post, I linked to a test to see how good we are at recognizing faces. There is also software that takes measurements of parts of the face that are supposedly highly distinctive (such as the shape of the ear) and uses those measurements to find points of similarity and estimate the likelihood of a match.
But Ava Kofman writes that this technology is far from foolproof and can produce errors that ruin the lives of innocent people, using the example of one man whose life has been upended because of faulty matches.
No threshold currently exists for the number of points of similarity necessary to constitute a match. Even when agencies like the FBI do institute classification guidelines, subjective comparisons have been shown to differ greatly from examiner to examiner. And the appearance of differences, or similarities, between faces can often depend on photographic conditions outside of the examiner’s control, such as perspective, lighting, image quality, and camera angle. Given these contingencies, most analysts do not ultimately provide a judgment as to the identity of the face in question, only as to whether the features that appear to be present are actually there.
…“As an analytical scientist, whenever someone gives me absolute certainty, my red flag goes up,” said Jason Latham, who worked as a biochemist prior to becoming a forensic scientist and certified video examiner. “When I came from analytical sciences to forensic sciences, I was like some of these guys are not scientists. They are voodoo witchcraft.”
…In 2009, following the National Academy of Sciences’ call for stricter scientific standards to underpin forensic techniques, the FBI formed the Facial Identification Scientific Working Group to recommend uniform standards and best practices for the subjective practice of facial comparison. But the working group’s mission soon ran up against an objective difficulty: Like some other forensic sciences, facial comparison lacks a statistical basis from which its conclusions may be drawn.
This is, in part, because no one knows the probability of a given feature’s distinctiveness. As a FAVIAU slide on the “Individualization of People from Images” explained, “Lack of statistics means: conclusions are ultimately opinion-based.”
We tend to be too easily impressed by analyses that assign numbers to estimates, giving them a heft that may not be warranted.
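The threshold problem is easy to illustrate with a toy sketch. The Python below is purely hypothetical (the landmark names, coordinates, and scoring formula are all invented for illustration and are not any real forensic tool or vendor algorithm): the same precise-looking similarity score yields opposite verdicts depending on where an arbitrary cutoff happens to be set.

```python
import math

# Toy example only: made-up facial landmarks as (x, y) points.
# Not any real forensic or vendor algorithm.
FACE_A = {"ear_top": (10.0, 52.0), "ear_lobe": (11.0, 40.0), "nose_tip": (35.0, 30.0)}
FACE_B = {"ear_top": (10.6, 51.2), "ear_lobe": (11.9, 40.7), "nose_tip": (34.1, 30.9)}

def similarity(face1, face2):
    """Return a score near 1 for close landmark sets, near 0 for distant ones."""
    dists = [math.dist(face1[k], face2[k]) for k in face1]
    return 1.0 / (1.0 + sum(dists) / len(dists))

score = similarity(FACE_A, FACE_B)
print(f"similarity score: {score:.3f}")

# The number looks exact, but the verdict depends entirely on an
# arbitrary cutoff that no standard currently fixes.
for threshold in (0.4, 0.5, 0.6):
    verdict = "match" if score >= threshold else "no match"
    print(f"threshold {threshold}: {verdict}")
```

With these invented numbers the score comes out around 0.47, so the faces "match" at a cutoff of 0.4 and "don't match" at 0.5 or 0.6, even though nothing about the faces has changed.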
Tabby Lavalamp says
Yay, another technology that television and movies have taught people is infallible. Every time they do a facial recognition search, it finds the bad guy without fail -- unless the plot calls for it not working, then they’re just not in the system.
Marcus Ranum says
It doesn’t work but the government is still spending money on it hand over fist.
There are scary systems being fielded that use face recognition to attempt to identify attendees at protests, people passing through airports, etc. When you point out that it doesn’t work reliably enough, the response is usually “it’s better than nothing.” In other words, the movie Brazil accurately predicted the future.
Marcus Ranum says
Also, as reported in the Guardian, 70+% of everyone is already in databases:
https://www.theguardian.com/world/2016/oct/18/police-facial-recognition-database-surveillance-profiling
John Morales says
Mano, point taken about the inadvisability of prematurely relying on it, given the state of the art.
—
Marcus,
It may never be perfect, but will it always remain inadequate for the tasks at hand?
I’m pretty sure that, sooner or later, it will become better than humans at facial recognition — probably sooner if money keeps being spent hand over fist.
sonofrojblake says
Whatever proportion of people are already in databases in the USA, I’d bet folding money the proportion is higher in the UK. It’s often said we’re the most surveilled nation on earth -- we love a bit of CCTV.
Of course, there is a section of our society who are immune to this form of state intrusion, because they routinely walk our streets masked. It’s so hard to tell whether they’re more oppressed than us or not.
intransitive says
Facial recognition is another technology, like mass phone spying, that has little or no use against criminals but will instead be mainly used to monitor the innocent majority.
The average person like you or me walks around with a bare face unless wearing a balaclava in -30°C temperatures. A criminal who plans an armed robbery might wear a Guy Fawkes or other mask, which will defeat the software -- the facial points picked up will be the mask’s, not the real face’s.