Facial recognition is “Jokesville”, says the UK’s “Great Bane”

Kaye Beach

Oct 10, 2011

Facial Recognition – “The technology doesn’t work”

David Moss, in response to my info dump on the FBI and their plan to unleash facial recognition on a national scale, provides us with some absolutely critical information and some invaluable advice as well.

In case you are wondering, David Moss is an IT specialist, researcher and longtime campaigner against the UK’s biometric ID scheme. He was a guest on AxXiom For Liberty last April, when he detailed for us some BIG problems with biometric identification that we often overlook in our outrage over the loss of our privacy and freedoms. (Listen to that show here.)

He is the “Great Bane” of government waste and corruption, in my estimation, and I have to confess that when I grow up (figuratively speaking, of course) I want to be just like David Moss.

David’s work proved that India’s plan to biometrically identify and number its 1.2 billion people is bound to fail: see India’s ID card scheme – drowning in a sea of false positives.

David says:

“Once your students have finished Facial Recognition 101, the better ones will have understood that the technology doesn’t work.”

And

“This roll-out is an opportunity. Not a threat. Grab it with both hands, embrace it and enjoy it.”

This information was too important to leave parked in the comments section, so I am re-posting his comments in their entirety. Take note, activists: this is a priceless lesson from a real pro!


Facial recognition – Jokesville 1

Kaye, thank you for the refresher course.

Please find herewith further material for the refresher course.

Once your students have finished Facial Recognition 101, the better ones will have understood that the technology doesn’t work.

Let’s be clear. Don’t let’s mix up our biometrics. They’re not all the same.

DNA, irisprints, traditional fingerprinting – they’re all biometrics worth worrying about from the point of view of privacy. But two-dimensional facial recognition? That’s what Aliya Sternstein’s article is about. That’s the new service being launched nationwide. Forget it. It doesn’t work. It’s jokesville.

At last the suppliers of 2-D facial recognition technology have been lured into a very public demonstration of the reliability or otherwise of their wares. They’ve never had to submit to this discipline before. They’ve never provided any warranties. Now they may find themselves twisting in the wind, hung out to dry, publicly humiliated and exposed as charlatans, mountebanks, snake oil salesmen, astrologers who convince only the simple-minded.

This roll-out is an opportunity. Not a threat. Grab it with both hands, embrace it and enjoy it.

continued …

Facial recognition – Jokesville 2

May I suggest a five-point plan to take advantage of the FBI’s proposed NGI trial?

The idea is to get the trial results widely published so that everyone can see whether they should share the vendors’ confidence in their own products. If not, the FBI can safely drop the technology, without impugning crime-fighting, and public money can be better invested elsewhere (or left with the public, who probably know better how to invest it).

1. Aliya Sternstein’s article says: “FBI officials would not disclose the name of the search product or the vendor”. Time for a freedom of information (FOI) request. This is public money being spent here. No doubt the FBI and the vendor have a mass of confidentiality agreements protecting intellectual property and future commercial interests. Fine. But the public have rights, too. Step #1 – get the names of the vendor and the product being used. There are a lot of people involved in this trial: the FBI, NIST, law enforcement in Michigan, Washington, Florida and North Carolina, Lockheed Martin and, no doubt, others. If FOI doesn’t succeed in getting the names, they’ll leak out from one of those sources.

2. Aliya Sternstein’s article says: “NGI’s incremental construction seems to align with the White House’s push to deploy new information technology in phases so features can be scrapped if they don’t meet expectations or run over budget”. Good. So this roll-out is in the nature of a technology trial. Technology trials can fail. That’s the whole point. That’s scientific method. And if the trial fails, the “features can be scrapped” – that’s what the White House wants. In line with that, step #2 – pressure must be brought to bear on the FBI/NGI to run this like a proper trial. The protocol must be published: the trial will be run like this … these are the acceptance criteria … results will be collated like so … and if they don’t meet the criteria, the “features” have failed and will be dropped and no more public money will be wasted on them. This is the upright, responsible, businesslike way to assess the technology. You won’t be putting the FBI on the spot; they won’t have their backs to the wall. Having met James A. Loudermilk II of the FBI, I have no doubt that this is exactly the way the FBI would expect to run this trial.

3. Aliya Sternstein’s article says the FBI “gained insights on the technique’s accuracy by studying research from the National Institute of Standards and Technology”. Good. Step #3 – get on to NIST. A long time ago, NIST produced a report on the Face Recognition Vendor Test 2006, NISTIR 7408. They must have more up-to-date reports by now, but you could start with this one.

3.1 NISTIR 7408 gets you a list of people to contact – P. Jonathon Phillips, W. Todd Scruggs, Alice J. O’Toole, Patrick J. Flynn, Kevin W. Bowyer, Cathy L. Schott, Matthew Sharpe. These people are proper academics. They trade on their reputation. They protect their reputation. They speak the truth.

3.2 It gets you a list of vendors – Cognitec, Identix*, Neven Vision, Rafael, Sagem*, SAIT, Toshiba, Tsinghua U(niversity), Viisage* (p.9).

3.3 It gets you a list of the test databases used (p.35). There are five for 2-D facial recognition. Four of them have fewer than 350 people on them. Those samples are too small to tell you anything.

3.4 So we’re only interested in the results for the database with 36,000 people on it. The low-resolution images there were gathered under controlled conditions. You can do that with prisoners (which is who NIST and others tend to get their large-volume data from). You can’t with mugshots of non-cooperative suspects. The results are in Figure 20 (p.46). Figure 20 measures reliability at three levels – 1 false accept in 100, 1 in 1,000 and 1 in 10,000. Given that the word is “this is not something where we want to collect a bunch of surveillance film and enter it in the system … that would be useless to us. It would be useless to our users”, presumably the FBI will use 1 in 1,000 or even 1 in 10,000. At those levels, false rejects vary between about 5 in 100 and 18 in 100. (A short sketch after this plan’s footnote works those numbers through.) Jokers love quoting that bit on p.2 where it says “The FRVT 2006 results from controlled still images and 3D images document an order-of-magnitude improvement in recognition performance over the FRVT 2002”. Some technologies saw an improvement. But not 2-D low-resolution facial recognition – that’s shown no improvement at all.

4. Back in 2009, NIST advised the Unique Identification Authority of India (UIDAI) on the biometrics to use for their Aadhaar scheme. As a result, India has adopted flat print fingerprints and irisprints to identify their 1.2 billion people. Not facial recognition. That has been dropped. It isn’t good enough. And if it isn’t good enough for India, how can it be good enough for the US? Step #4 – get on to NIST again, and maybe the UIDAI.

5. The business schools of the world also tested facial recognition to try to stop “plants” taking exams on behalf of less gifted students. They dropped it. They tested flat print fingerprinting and dropped that. It doesn’t work well enough. Now they’re testing palm veinprints. If facial recognition isn’t good enough for the business schools, how can it be good enough for the FBI? Step #5 – get on to GMAC.

———-

* all now owned by Safran Group under the umbrella of Morpho.
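
To make those Figure 20 numbers concrete, here is a minimal Python sketch of the arithmetic. It assumes a gallery of 10 million mugshots (the figure used for the FBI database in the next post), and it pairs the quoted false reject rates with the quoted false accept levels in the only way the usual trade-off allows (a stricter threshold means fewer false accepts but more false rejects). The numbers are illustrative, not vendor specifications.

```python
# What the NISTIR 7408 Figure 20 operating points imply in practice.
# The 10-million gallery size is the figure used for the FBI mugshot
# database in the next post; the FAR/FRR pairings are inferred from
# the quoted ranges. All numbers are illustrative, not vendor specs.

GALLERY_SIZE = 10_000_000  # assumed number of mugshots on file

# (false accept rate, false reject rate) pairs
operating_points = [
    (1 / 100,    None),  # FAR 1 in 100: no FRR quoted at this level
    (1 / 1_000,  0.05),  # FAR 1 in 1,000: false rejects ~5 in 100
    (1 / 10_000, 0.18),  # FAR 1 in 10,000: false rejects ~18 in 100
]

for far, frr in operating_points:
    false_candidates = far * GALLERY_SIZE  # wrong records returned per search
    missed = "n/a" if frr is None else f"~{frr:.0%}"
    print(f"FAR 1 in {round(1 / far):>6,}: "
          f"~{false_candidates:>9,.0f} false candidates per search, "
          f"{missed} of genuine matches missed")
```

Whichever threshold is chosen, the operator either wades through a long list of wrong candidates or misses a substantial share of the genuine matches; there is no setting at which both numbers are small.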

Facial recognition – Jokesville 3

Some miscellaneous points:

1. The false accept rates mentioned above range from 1 in 100 to 1 in 10,000. What does that mean? It means that the mugshot submitted by Florida law enforcement, or whoever, will falsely match between 1 in 100 and 1 in 10,000 of the 10 million mugshots on the FBI’s database. That means the enquiry will return between 1,000 and 100,000 possible matches. The number can be reduced by excluding the dead people still on the FBI database. But Florida’s still going to have an awful lot of mugshots to look through. It may not be worth it.

2. The false reject rates mentioned above range between 5 in 100 and 18 in 100. So between 5% and 18% of the Florida mugshots submitted will be falsely rejected – i.e. there is a match on file but the software doesn’t find it. Again, it may just not be worth it. Especially as those figures (5%, 18%) were obtained in a lab test, doing just a computer run. In the live, operational environment, the false reject rate is likely to be much higher.

3. (A note for students on Facial Recognition 102. When you’re doing a lab test, you should speak about “false match rates” and “false non-match rates”. When you’re doing a field trial – voluntary and co-operative subjects in a simulation of the real environment – you should talk about “false accept rates” and “false reject rates”. And when you’re in the live, operational environment, it’s “false positive identification rates” and “false negative identification rates”.)

4. Professor John Daugman, an American working at Cambridge University, England, and the man who invented irisprinting, reckons that there’s no hope for facial recognition, not with big populations, and precious little for flat print fingerprinting. “Irises have about 249 degrees-of-freedom, … whereas faces have only about 20 degrees-of-freedom (independent dimensions of variation), and fingerprints have about 35”. There’s just not enough randomness in faces to make facial recognition useful. (A rough illustration follows at the end of these points.) Don’t let the FBI spend too much money on this trial before calling it a day.

5. There will be objections to the points made in these three posts. Mr Moss doesn’t know what he’s talking about. Mr Moss confuses 1-to-1 matching with 1-to-many. Mr Moss knows perfectly well that the job the FBI hope to do with this trial is quite different to the UIDAI’s Aadhaar scheme. These objections sound good. But pursue them before accepting them. The objector may not know what he or she is talking about. Mr Moss, unsurprisingly, thinks he does know what he is talking about.

6. Some traps the FBI may like to avoid.

6.1 Back in 1998, the police in the London Borough of Newham had been testing Visionics face recognition technology – claimed by the vendors to have driven crime off the streets of Newham, yeah right! – and were quoted as follows in New Scientist magazine: “… in June this year, the police admitted to The Guardian newspaper that the Newham system had never even matched the face of a person on the street to a photo in its database of known offenders, let alone led to an arrest”. Why that word “admitted”? Because the police had been lured into promoting the success of the technology, they got themselves on the hook, and then – quite properly – they had to get themselves off. The FBI will not want to make the same, embarrassing mistake.

6.2 Here in the UK we use ePassport technology at 10 of our airports to try to get people through security quickly. If your face matches the biometric template in your passport, you’re through; otherwise not. Does this facial recognition technology work? Sometimes the UK Border Agency say it does, no qualifications. Other times they say it’s still under trial. In the end, we settled on the latter. Lin Homer, the Chief Executive of the UK Border Agency, wrote to me in February 2010 saying: “We plan to evaluate all 10 sites. Evaluation of Manchester gave us enough confidence to proceed to expand the trial. We are aware that different environments may impact the use of facial recognition technology, we therefore wished to determine and compare results from more diverse airport environments to ensure the technology is robust and consistent”. The Independent Chief Inspector of UKBA inspected Manchester Airport in May 2010, three months after that letter; he described a number of problems with the ePassport technology and then said, at para.5.29, “We could find no overall plan to evaluate the success or otherwise of the facial recognition gates at Manchester Airport and would urge the Agency to do so as soon as possible”. Oops.
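
As a rough illustration of Daugman’s degrees-of-freedom point (point 4 above), here is a short Python sketch. It assumes his binomial model – an impostor comparison behaves like d independent fair coin tosses – and an arbitrary decision threshold of 75% feature agreement, chosen purely for illustration. It is a back-of-the-envelope sketch, not a reconstruction of his published analysis.

```python
# A back-of-the-envelope sketch of Daugman's degrees-of-freedom point,
# assuming his binomial model: an impostor comparison behaves like d
# independent fair coin tosses, so the chance that two unrelated people
# agree on at least 75% of d independent features is the upper tail of
# Binomial(d, 0.5). The 75% threshold is an assumption for illustration
# only, not a figure from Daugman.
from math import ceil, comb

def impostor_tail(dof: int, agree_fraction: float = 0.75) -> float:
    """P(at least agree_fraction of dof independent features agree by chance)."""
    k_min = ceil(dof * agree_fraction)
    return sum(comb(dof, k) for k in range(k_min, dof + 1)) / 2 ** dof

for label, dof in [("faces", 20), ("fingerprints", 35), ("irises", 249)]:
    print(f"{label:12} ({dof:3} degrees of freedom): "
          f"chance an impostor looks like a match ~ {impostor_tail(dof):.1e}")
```

With only about 20 degrees of freedom, roughly one impostor in fifty clears even that generous threshold by chance, so searching a big population drowns you in false matches; with 249, a chance match is essentially impossible. That is the randomness argument in a nutshell.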
