Experts from the University of California, Davis warned this week that the reliability of fingerprint biometrics has declined considerably due to technological concerns and a growing world population.
Law Professor Edward Imwinkelried and biometrics expert Mike Cherry released their findings this week in an article for the law journal Judicature about improving fingerprint identification.
"We can no longer naively assume the reliability of our current fingerprint standards," the pair wrote. "Given the stakes - not only justice in a particular case but national security itself - we must do better."
Imwinkelried and Cherry, who is vice chairman of the digital technology committee of the National Association of Criminal Defense Lawyers, said that most of the computerized systems used to match and categorize ridge patterns within fingerprints are far too imprecise for the important applications for which they are increasingly used.
"If we're going to rely on the computer technology for the watch list on terrorism (or) when we do background checks, we've got to have some assurance the computer system is reliably accurate," Imwinkelried said.
The world’s growing population also exacerbates the problem. When the first system for classifying and identifying fingerprints was created in the late 1800s, its creator said that the chances of two identical fingerprints were one in 64 billion.
Imwinkelried and Cherry said they worry that with the world population exceeding six billion, and with most people having 10 fingers, the pool of fingerprints, roughly 60 billion, approaches that one-in-64-billion figure, making coincidental matches far more likely than the odds suggest.
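The scale of the problem can be illustrated with a back-of-envelope birthday-problem calculation. This sketch is not from the article; it simply assumes, for illustration, that the one-in-64-billion figure means 64 billion equally likely fingerprint patterns and asks how probable at least one coincidental match becomes in a pool of about 60 billion prints:

```python
import math

# Illustrative assumption (not the article's method): treat the
# one-in-64-billion figure as 64 billion equally likely patterns.
d = 64e9          # distinct patterns implied by the 1-in-64-billion odds
n = 6e9 * 10      # ~6 billion people, 10 fingers each = ~60 billion prints

# Birthday-problem approximation for at least one collision:
#   P(collision) ~= 1 - exp(-n*(n-1) / (2*d))
p_collision = 1 - math.exp(-n * (n - 1) / (2 * d))
print(p_collision)  # effectively 1.0: some coincidental match is near certain
```

Even though 60 billion is smaller than 64 billion, the chance that no two prints in the pool coincide is vanishingly small, which is the pair's point: the old odds no longer guarantee uniqueness at today's population scale.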
The pair wrote that improvements in biometrics need to be made by mining data from existing fingerprint databases to detect new patterns and classifications for fingerprint matching. They also encouraged law and industry to go back to using 10 fingers to classify an individual, a practice that has gone out of favor.
Email West Coast Bureau Chief Ericka Chickowski.