
Fingerprinting’s finger-pointing past

How fingerprinting made its mark.

While face recognition software, iris scanning, and other identification technologies have been capturing headlines as promising new authentication and security tools, fingerprint records, arguably the first biometric databases, go back more than a hundred years. The idea of using fingerprints for identification was startlingly novel in its time, and it sparked a bitter dispute between two men who each claimed to have invented the technique.

During the 1870s, Henry Faulds, a Scottish missionary working as a doctor in Japan, happened across an ancient pot marked with its creator’s fingerprints. The discovery inspired him to begin studying fingerprints. In 1880, Faulds published a letter in Nature in which he observed that “when bloody finger-marks or impressions on clay, glass, etc., exist, they may lead to the scientific identification of criminals.” The next month, Nature published a reply from William Herschel, an India-based British magistrate. Herschel had been collecting fingerprints since the 1860s and suspected that each person’s fingerprints were unique, but he had never studied their potential for forensic use.

Neither letter received much attention until 1892, when Francis Galton, Charles Darwin’s cousin and a noted scientist himself, published Finger Prints. Galton established that fingerprints are unique and don’t change over a person’s lifetime, and he proposed a system for classifying them. In 1901, Scotland Yard founded its Fingerprint Bureau, based largely on Galton’s system. Although Faulds had suggested a similar system to Scotland Yard years earlier, Galton and Herschel took credit for the innovation. Infuriated, Faulds instigated a public battle of letters with Herschel that would last until his rival’s death in 1917.

Regardless of who originally envisioned fingerprints as a forensic tool, the practice took off. In 1902, fingerprints were first used as evidence in a British court to identify a burglar who had stolen some billiard balls. And 1902 was also the year that fingerprints were first systematically employed in the United States, when the New York Civil Service Commission began fingerprinting applicants to prevent them from cheating on tests.

Although fingerprinting may recall the Sherlock Holmesian era during which it was created, new tools have brought the system into the digital age. Today, the FBI’s fingerprint system contains more than 40 million people’s fingerprints. A suspect’s prints can be identified within two hours; just a few years ago, the process could take weeks.

Illustration by Rose Wong

Get the latest updates from
MIT Technology Review

Discover special offers, top stories, upcoming events, and more.

Thank you for submitting your email!

Explore more newsletters

It looks like something went wrong.

We’re having trouble saving your preferences. Try refreshing this page and updating them one more time. If you continue to get this message, reach out to us at customer-service@technologyreview.com with a list of newsletters you’d like to receive.