Biometrics: guilty until proven innocent
from a paper by Jim Wayman, Antonio Possolo and Tony Mansfield delivered at the International Biometrics Conference hosted by the US National Institute of Standards and Technology, 2-4 March 2010
The system, which replaces traditional passport control measures, is undergoing a "live trial" at Manchester Airport, where a UKBA worker said it was suffering almost daily malfunctions.
He said immigration officers had been able to accompany travellers through the scanners without an alarm being triggered, even though the booths are supposed to detect if more than one person enters at a time.
"Immigration officers have been able to tailgate passengers through the machine, without the machine picking it up," he said ...
"There are five 'pods' and when one breaks down, they all break down," he said ... The UKBA source said there were widespread concerns about the facial recognition equipment.
"There is no reliable data on the machine's ability to pick up forgeries and imposters," he said ...
"The technology is further undermined by staff sitting in front of the monitors for three hours at a time, leading to mental fatigue and a drop-off in concentration. There are major concerns about the reliability and accuracy of facial recognition technology ...
"Up until the point of the official launch, it was rejecting 30 per cent of those who tried to get through it," the UKBA worker said.
"We believe they had to recalibrate it – essentially make it easier to get through the system."
Telegraph, 4 October 2008
The ID card scheme will guard against one person having multiple identities by checking the two fingerprints and facial scan held on a chip on the ID card against biometrics in a central database, the National Identity Register.
But academic John Daugman, a former member of the Biometrics Assurance Group (BAG) which reviewed the scheme, says its reliance on fingerprints and facial photos to verify a person's identity will cause the system to collapse under the weight of mismatched identifications ...
Daugman said that even if the error rate was as low as one in a million, the 10 to the power of 15 comparisons needed to verify the IDs of 45 million people would result in one billion false matches.
He told silicon.com: "The use of fingerprints will cause deduplication to drown in false matches.
"The government was badly advised by its internal scientists in the Home Office when it took the decision to base the biometric system on fingerprints instead of iris patterns.
silicon.com, 26 September 2008
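Daugman's arithmetic is easy to check. A minimal sketch (the 45 million population and one-in-a-million error rate are taken directly from the quote above; nothing else is assumed):

```python
import math

# Deduplicating 45 million enrolees means comparing every pair of records:
population = 45_000_000
comparisons = math.comb(population, 2)   # roughly 10 to the power of 15 pairs

# Even at a false match rate of one in a million per comparison ...
false_match_rate = 1e-6
false_matches = comparisons * false_match_rate   # roughly one billion

print(f"{comparisons:.2e} comparisons -> {false_matches:.2e} false matches")
```

The pairwise comparisons, not the population, are what drown the system: doubling the enrolee count roughly quadruples the number of comparisons and hence the expected false matches.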
The system at Manchester Airport can be used by adult biometric passport holders from the UK and Europe. It works by scanning passengers' faces and comparing them to the photographs digitally stored on their passports.
The Public and Commercial Services Union (PCS) voiced concerns that the technology was "untried and untested" ...
"People are being allowed through on the basis of this technology. It means that 95% of people won't be checked in any way, other than by the machine."
European Biometrics Portal, 17 July 2007
New Scientist, 14 April 2007
National Audit Office, 5 February 2007
The Sussex Argus, 10 January 2007
Ministers will reel off positive stats but that doesn't really answer the question.
What if the money and energy were redirected to old-fashioned, bobby-on-the-beat policing that deters crime rather than being spent on schemes that make it easier to find criminals after the event? And what about the science behind DNA testing and the like? Is it that reliable? There are a lot of people with a stake in proving it is effective (well, there are Government contracts at stake) and few scientists with the resources to debunk it.
And if all these databases, cameras and new technologies are so great compared to what went before, why are serious crime levels so stubbornly high?
The Times, 23 October 2006
91. ... We are surprised by the Home Office's unscientific approach and suggest that rather than collating figures merely to provide information regarding performance, the Home Office admits that it cannot release details until it has completed trials. We note the lack of independent evidence relating to the performance of iris scanning and welcome the Home Office's commitment to undertake a large-scale matching test using pre-recorded biometrics. Given the relative lack of information available publicly regarding the performance of biometrics in a national scheme, we recommend that once the scheme is established the Home Office publishes details of the performance levels of the technology ...
93. We are surprised and concerned that the Home Office has already chosen the biometrics that it intends to use before finishing the process of gathering evidence. Given that the Identity Cards Act does not specify the biometrics to be used, we encourage the Home Office to be flexible about biometrics and to act on evidence rather than preference. We seek assurance that if there is no evidence that any particular biometric technology will enhance the overall performance of the system it will not be used ...
95. We note the lack of explicit commitment from the Home Office to trialling the ICT solution and strongly recommend that it take advice from the ICT Assurance Committee on trialling. We seek an assurance that time pressure and political demands will not make the Home Office forgo a trial period or change the purpose of the scheme.
96. In written evidence the Home Office said it was not necessary to embark on publicly funded scientific research to improve the capabilities of biometrics. This claim was subsequently denied in oral evidence and the identity card team asserted that research was being undertaken into fingerprint biometric performance ... We regret the confusion at the Home Office regarding the research that it is funding and what research it requires ... The Home Office has not provided us with evidence either that they have identified areas where the evidence base is weak nor that they have commissioned research in order to strengthen it. On the basis of the evidence that we have seen, we conclude that the Home Office does not seem to have an effective mechanism for ensuring that the required research and development in the relevant scientific and technological areas is carried out. We recommend that the Home Office identifies the gaps in the evidence base underpinning the identity cards programme, that it commissions research to fill these gaps and that it feeds any new developments into the scheme where appropriate. This process should be overseen by the departmental Chief Scientific Adviser.
House of Commons Science and Technology Committee, 20 July 2006
Mayfield, a Portland, Oregon, attorney, was arrested by the FBI in May 2004 as a material witness after FBI Laboratory examiners identified Mayfield’s fingerprint as matching a fingerprint found on a bag of detonators connected to the March 2004 terrorist attack on commuter trains in Madrid, Spain, that killed almost 200 people and injured more than 1,400 others. Mayfield was released 2 weeks later when the Spanish National Police identified an Algerian national as the source of the fingerprint on the bag.
The FBI Laboratory subsequently withdrew its fingerprint identification of Mayfield.
We found several factors that caused the FBI’s fingerprint misidentification. The unusual similarity between Mayfield’s fingerprint and the fingerprint found on the bag confused three experienced FBI examiners and a court-appointed expert. However, we also found that FBI examiners committed errors in the examination procedure, and the misidentification could have been prevented through a more rigorous application of several principles of latent fingerprint identification. For example, the examiners placed excessive reliance on extremely tiny details in the latent fingerprint under circumstances that should have indicated that these features were not a reliable support for the identification.
The examiners also overlooked or rationalized several important differences in appearance between the latent print and Mayfield’s known fingerprint that should have precluded them from declaring an identification.
In addition, we determined that the FBI missed an opportunity to catch its error when the Spanish National Police informed the FBI on April 13, 2004, that it had reached a “negative” conclusion with respect to matching the fingerprint on the bag with Mayfield’s fingerprints.
DNA Reviews: Within the past 2 years, the OIG completed two reviews examining various aspects of DNA issues. In the first review, completed in May 2004, the OIG examined vulnerabilities in the protocols and practices in the FBI’s DNA Laboratory.
This review was initiated after it was discovered that an examiner in a DNA Analysis Unit failed to perform negative contamination tests, and the Laboratory’s protocols had not detected these omissions. The OIG’s review found that certain of the FBI Laboratory’s DNA protocols were vulnerable to undetected, inadvertent, or willful non-compliance by DNA staff, and the OIG report made 35 recommendations to address these vulnerabilities.
The FBI agreed to amend its protocols to address these recommendations and to improve its DNA training program. In addition, the OIG continues to audit laboratories that participate in the FBI’s Combined DNA Index System (CODIS), a national database maintained by the FBI that allows law enforcement agencies to search and exchange DNA information.
The OIG’s CODIS audits identified concerns with some participants’ compliance with quality assurance standards and with their uploading of unallowable and inaccurate DNA profiles to the national level of CODIS.
The OIG currently is analyzing findings from DNA laboratory audits – both OIG-conducted audits and external quality assurance audits – to determine if they reveal global trends and vulnerabilities.
We also are assessing the adequacy of the FBI’s administration of CODIS, including its oversight of the national DNA database, and evaluating its implementation of corrective actions in response to the original report.
US Department of Justice, May 2, 2006
The Guardian, 21 October 2005
Mr McNulty said on Sunday: "There are difficulties with the technology, not least in terms of people who have difficulties with their eyes anyway, not least with people with brown eyes rather than other coloured eyes, and all those are being factored into the equation.
"None of these problems are new, but increasingly as biometrics are more and more used... we think the technology can only get better and better and better."
BBC, 17 October 2005
The Scotsman, 20 September 2005
According to the study, the results of the first biometric passport trials conducted in 2004-2005 showed that the quality of fingerprint information used in the tests was sometimes poor and that the biometric documents were less robust than the traditional passports.
The quality of digital photographs was also a concern, as unclear backgrounds, insufficient contrast, and other problems such as reflection from spectacle lenses resulted in about 1.6% of photographs being unsuitable for automated biometric matching.
In addition, including fingerprints of young children and the elderly in the future Dutch e-passport may prove more difficult than expected as people have to hold their fingers in place for quite a while for the procedure to be successful, the report says ...
The Dutch government is not the only European government experiencing technical difficulties with the development of its biometric passport programme.
In the UK, the findings of a biometrics enrolment trial, published earlier this year, revealed that biometric technologies were still not foolproof and that large-scale issuance of biometric identity and travel documents would inevitably run into some glitches.
In Germany, serious concerns over the government’s biometric passport programme were voiced by security and privacy experts, parliamentary committees and by the Federal Data Protection Commissioner, who even called for a moratorium on the introduction of biometric passports in light of the still immature state of the technology and of a number of unresolved data protection issues.
According to press reports, technical difficulties have also led the Irish Government to shelve plans to introduce biometric chips into passports for the time being.
IDABC eGovernment Observatory (= Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens), 20 September 2005
The idea behind the Home Office restrictions - first announced last year - is to ensure the smooth running of new scanning technology, which apparently has problems recognising gurning and grinning holiday makers.
The rules also specify the mouth should be closed, your piccie should be less than a month old, and only taken against an "off-white, cream or light grey, plain background."
And what if you insist on sending in your most winsome, toothy grin? Smiley faces will lead to applications being refused until officials receive suitable photos, says the Home Office. Which is no laughing matter.
The Guardian, 5 September 2005
"ID cards can only be the answer if the recognition of them is almost perfect," he said.
"Identity cards are only going to work if we have a biometric answer - that may be iris recognition but it is unlikely to be facial recognition because that changes because of diet and beards and everything else."
BBC, 15 June 2005
Facial recognition was the least successful identification technology ...
Among other things, further trials are needed, specifically targeted towards those disabled groups that have experienced enrolment difficulties due to environment design, biometric device design, or to specific group problems – for example, black participants and participants aged over 59 had lower iris enrolment success rates ...
A report released by the European Commission on 30 March 2005 warned that – on the technological side – there is currently a lack of independent empirical data. This means that there is an urgent need to conduct large-scale field trials to ensure the successful deployment of biometric systems.
IDABC (= Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens), 31 May 2005
"To be honest, I think it is a possibility that eventually we will conclude it isn't good enough or that the current systems we're using aren't good enough for a large scale public domain application such as an ID card," she said.
BBC, 25 May 2005
Atos Origin, 24 May 2005
BBC News, 3 December 2004
Fingerprint recognition can be fooled by calluses, residual prints on the reader, and even hand cream! Face recognition struggles in certain lighting conditions and can be fooled by disguises; and iris recognition can be confused by contact lenses and watery eyes.
Biometric identification also faces stringent opposition from civil liberties groups who believe that it represents a breach of privacy. There is great concern about the storage of the biometric data and who has access to it. The possibility of storage of personal data on a centralized government database causes greatest concern. There is concern that this data may be misused and even that it may be possible for a person’s stored data, or ‘biometric reference template’, to fall into the wrong hands. Even schemes in which data is stored on the card itself have not been immune from criticism.
PA Consulting, 24 November 2004
BBC News, 21 October 2004
The Guardian, 9 September 2004
BBC News, 12 August 2004
The shortcomings in the system were exposed as MPs took part in a pilot project at the UK Passport Service HQ in London. The trial is testing technology for the proposed "biometric" identity card.
Eye malfunctions have also been found to cause a problem for the technology, while some experts believe the scanners will not work on people wearing hard contact lenses.
Mr Sables said that there may also be problems when people have faint fingerprints, such as manual labourers who work with concrete, but that the next generation of technology would overcome the problem by reading blood flow beneath the skin.
The Daily Telegraph, 6 May 2004
A PILOT scheme for the proposed national identity card system is beset with technical difficulties even before it has started, MPs were told yesterday. Problems with the project forced officials to delay its launch by three months because of difficulties with the hardware and software. As a result, the planned length of time the pilot will operate has been cut from six to three months.
The Times, 5 May 2004
Such "exceptional cases" could be someone who simply has very long eyelashes, or it could be something more serious like a disability that affects the collection of biometric data.
BBC News, 26 April 2004
BBC News, 22 January 2004
"The current encoding of photographs digitally into passport chips is almost entirely for the purpose of ultimate visual comparison by a human," says Professor John Daugman. And although humans are not very good at that, he says, machines have an even harder time ... If a machine were to take over in order to match passport images against a database of pictures, Professor Daugman says the rate of error would still be five to 40%, even with the best algorithms.
"Today's computer algorithms for automatic face recognition have a truly appalling performance, in terms of accuracy," he says. "Even small variations in pose angle, illumination geometry, viewing angle, and facial expression have catastrophic effects on algorithm accuracy," says the Professor ...
The key to the power of biometrics to identify people is the amount of randomness and complexity that the biometric contains, according to Professor Daugman. "Face recognition is inherently unreliable because there isn't nearly enough randomness in the appearance of different faces. Fingerprints are vastly better biometrics than faces," he says, "but better still are iris scans" ... there will be some stumbling blocks and no biometric method offers 100% certainty.
BBC News, 13 January 2004
As for David Blunkett’s conviction that a biometric record will guarantee an individual’s identity without any “false positive” readings, the evidence is lacking. Last year, after rigorously testing leading iris-scanning and face-matching products, the US Defence Department reported that they were far less effective than their manufacturers claimed.
Eye-scanning software from Iridian, for instance, claims a 99.5 per cent accuracy rate; the Pentagon found that it worked only 94 per cent of the time. As for Visionics’ “face-recognition” technology, which maps patterns on individuals’ faces, it recognised people in tests barely 51 per cent of the time, rather than the 99.3 per cent claimed.
The Times, 18 November 2003
The UKPS/DVLA proposals assume that applications are processed, and biometric images collected at local offices, in a manner similar to the current process of checking passport and driving licence applications by the high street partners of UKPS and DVLA. We assume a similar number of local offices (i.e. approximately 2000). (14)
Even under relatively good conditions, face recognition fails to approach the required performance. (52c)
With the known performance of fingerprint, iris and face biometric systems, this requirement mandates the use of multiple fingers, or irises, and confirms that facial recognition is not a feasible option. (55)
Face recognition is not strong enough to uniquely identify one person in a population of 50m. (57)
Tony Mansfield and Marek Rejman-Greene, 12 November 2003
A woman from Australian customs told me straight: "In Australia we will not give fingerprints to the US for the purpose of visa entry, we absolutely do not give fingerprints." Her response was there would be no travel from Australia to the US.
Some people say it's like barcodes, which didn't work in the early days. Biometrics will get better, it's true. But it's a bad analogy because barcodes can be controlled in manufacturing. If a checker has to type in the code too many times they make the manufacturer redesign the can. Human beings can't go to God. No one technology is going to provide the magic bullet.
People are different in ways that you could never imagine. They never have what you think they are going to have where you think they are going to have it. It never, ever, occurred to me that people can have polydactylism: one fellow had two right thumbs. I have a friend who has a hard time with facial recognition systems: he is very light-skinned, with very light hair but mostly bald. Against a light background, the computer couldn't find the outline of his face, and it said: "There's nobody here." Another guy I knew didn't have a round pupil because he had damaged his eye. You couldn't use iris recognition on that one eye. And then there are people with one glass eye. Or take privacy advocate Simon Davies, whose irises move constantly. He can't be successfully iris-scanned.
Everybody learns from reading Mark Twain's Pudd'nhead Wilson that fingerprints are unchanged from cradle to grave and that everybody has unique fingerprints. But despite this, there remains a tremendous controversy over the admissibility of fingerprints as evidence. I've been an expert witness on this. Fingerprinting is very defendable, but the government has used some of the most stupid, crazy, spurious and non-scientific scientific arguments to try to defend it. We do lack the scientific basis, and that's what we're trying to make up for now.
DNA is not biometrics, it's not automatic unless you touch a machine and it takes a sample, like in the movie Gattaca. But there are a couple of problems. First, you are invading my privacy by asking me to touch a machine and by removing something from my body. I find that disgusting. Secondly, there may be information in that DNA analysis that tells you something about me as a person. Other biometrics don't give any information about a person at all ...
Face recognition still seems to be the holy grail. Perhaps it's more acceptable to people than being fingerprinted or iris-scanned. And often if we have any information at all on terrorists, the face may be the only thing we have. But there are many problems. Take the London mayor, Ken Livingstone, and his idea that you can point a camera at a car and do facial recognition of the occupants. We did that at a Mexico border crossing in Otay Mesa. The immigration service tried to automate the crossing by installing facial recognition cameras in a system called SENTRI, but the driver had to stop and look into the camera. That was highly problematic because the height of the cars varied, and window frames obscured the faces. The state of this technology is we are still trying to teach the cameras that the two people in each scene are the same person.
You have no clue who I am, and I could give you my fingerprint and you still wouldn't know who I am. That's a fundamental flaw in all the legislation. Biometrics says nothing about whether I'm a terrorist or not ...
We'll never use biometrics to track somebody. I've got a really good idea for tracking people: you ask them to carry radio transmitters... Like my mobile phone? ... So right now the government can track you within metres. That's a much better way to track people ...
Biometric tests are not like tests of computer security because in biometrics you are testing people - and people are extremely expensive to test. We have seen that recently with the results from a facial recognition test sponsored by the US Department of Defense and conducted by the National Institute of Standards and Technology (NIST). Two of the companies involved came forward and said "We've improved our product, those results don't apply to us." How would they know? No one has tested the new product. And tests are so expensive that they can't afford them. We see this in biometrics all the time ...
In hand geometry, you get nine measurements. In facial recognition you get 128. Why don't we just concatenate them? It turns out the mathematics is really, really hard. If you throw cotton balls into a shoebox with no gravity, what is the probability that there will be a collision? The probability of a collision increases as you get more balls, the smaller the box gets or the bigger the cotton balls are. Then suppose we change the dimension: so that they are not cotton balls but the shadows of cotton balls on the floor of the box. The shadows may be colliding while the cotton balls are not colliding. Can we put together a mathematical formula that tells us how increasing the dimensions of the system decreases the probability of collisions? In biometrics, a collision is a false match.
New Scientist, 21 June 2003
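Wayman's cotton-ball picture can be illustrated with a toy Monte Carlo experiment (entirely our own sketch, not from the interview): treat each biometric template as a random point in a unit cube and call two points a "collision" (a false match) when they fall within some distance of each other. Raising the dimension, as concatenating hand-geometry and face measurements would, makes collisions rapidly rarer:

```python
import math
import random

def collision_rate(dim, radius=0.5, pairs=20_000, seed=1):
    """Fraction of random point pairs in [0,1]^dim closer than `radius`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(pairs):
        a = [rng.random() for _ in range(dim)]
        b = [rng.random() for _ in range(dim)]
        if math.dist(a, b) < radius:   # a "collision" = a false match
            hits += 1
    return hits / pairs

# 9 measurements (hand geometry) vs 128 (face recognition):
for dim in (2, 9, 128):
    print(dim, collision_rate(dim))
```

The toy model only captures the easy half of Wayman's point: more dimensions give fewer collisions. The hard half, which he alludes to with the shadows, is that correlated or projected measurements collide far more often than this independence assumption suggests, and the mathematics of combining them is genuinely difficult.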
Maybe nine or ten months ago they would have risen to the bait. In those days the face-recognition industry was on a high. In the wake of 11 September, Visionics, a leading manufacturer, issued a fact sheet explaining how its technology could enhance airport security. They called it "Protecting civilization from the faces of terror". The company's share price skyrocketed, as did the stocks of other face-recognition companies, and airports across the globe began installing the software and running trials. As the results start to come in, however, the gloss is wearing off. No matter what you might have heard about face-recognition software, Big Brother it ain't ...
Image Metrics, a British company that develops image-recognition software, ... warned of the danger of exaggerated claims, saying that "an ineffective or poorly applied security technology is as dangerous as a poorly tested or inappropriately prescribed drug" ... to catch 90 per cent of suspects at an airport, face-recognition software would have to raise a huge number of false alarms. One in three people would end up being dragged out of the line - and that's assuming everyone looks straight at the camera and makes no effort to disguise themselves ...
Palm Beach International Airport in Florida released the initial results of a trial using a Visionics face-recognition system. The airport authorities loaded the system with photographs of 250 people, 15 of whom were airport employees. The idea was that the system would recognise these employees every time they passed in front of a camera. But, the airport authorities admitted, the system only recognised the volunteers 47 per cent of the time while raising two or three false alarms per hour ...
To give themselves the best chance of picking up suspects, operators can set the software so that it doesn't have to make an exact match before it raises the alarm. But there's a price to pay: the more potential suspects you pick up, the more false alarms you get. You have to get the balance just right. Visionics - now called Identix after merging with a fingerprint-scanning company in June - is quick to blame its system's lacklustre performance on operators getting these settings wrong ...
Numerous studies have shown that people are surprisingly bad at matching photos to real faces. A 1997 experiment to investigate the value of photo IDs on credit cards concluded that cashiers were unable to tell whether or not photographs matched the faces of the people holding them. The test, published in Applied Cognitive Psychology (vol 11, p 211), found that around 66 per cent of cashiers wrongly rejected a transaction and more than 50 per cent accepted a transaction they should have turned down. The report concluded that people's ability to match faces to photographs was so poor that introducing photo IDs on credit cards could actually increase fraud.
The way people change as they age could also be a problem. A study by the US National Institute of Standards and Technology investigated what happens when a face-recognition system tries to match up two sets of mugshots taken 18 months apart. It failed dismally, with a success rate of only 57 per cent.
There's another fundamental problem with using face-recognition software to spot terrorists: good pictures of suspects are hard to come by ...
Very few security personnel at American airports have CIA clearance, so they aren't allowed to see the images. "Until they've got cleared personnel in each of those airports they can't stop terrorists getting on planes," says Iain Drummond, chief executive of Imagis technologies, a biometrics company based in Vancouver, Canada ...
Airport security isn't the only use for face-recognition software: it has been put through its paces in other settings, too. One example is "face in the crowd" on-street surveillance, made notorious by a trial in the London Borough of Newham. Since 1998, some of the borough's CCTV cameras have been feeding images to a face-recognition system supplied by Visionics, and Newham has been cited by the company as a success and a vision of the future of policing. But in June this year, the police admitted to The Guardian newspaper that the Newham system had never even matched the face of a person on the street to a photo in its database of known offenders, let alone led to an arrest.
New Scientist, 7 September 2002
BBC, 17 May 2002
Indeed, according to a report published in December, the only major research explicitly commissioned to validate the technique is based on flawed assumptions and an incorrect use of statistics. The research has never been openly peer reviewed. This month, the US government also published a set of funding guidelines that rules out further studies to validate both fingerprint evidence and other existing forensic techniques presented as evidence in court.
In 2003, a proposal by the US National Academies to validate such techniques collapsed after the Department of Defense and Department of Justice demanded control over who should see the results of any investigation.
New Scientist, 28 January 2002