While face recognition software, iris scanning, and other identification technologies have been capturing headlines as promising new authentication and security tools, fingerprint records, arguably the first biometric databases, go back more than a hundred years. The idea of using fingerprints for identification was startlingly novel in its time, and it caused a bitter dispute between two men who claimed to have invented the technology.
During the 1870s, Henry Faulds, a Scottish missionary working as a doctor in Japan, happened across an ancient pot marked with its creator’s fingerprints. The discovery inspired him to investigate fingerprints. In 1880, Faulds published a letter in Nature in which he observed that “when bloody finger-marks or impressions on clay, glass, etc., exist, they may lead to the scientific identification of criminals.” The next month, Nature published a reply from William Herschel, an India-based British magistrate. Herschel had collected fingerprints since the 1860s and suspected that each person’s fingerprint was unique, but he had never studied their potential for forensic use.
Neither letter received much attention until 1892, when Francis Galton, Charles Darwin’s cousin and a noted scientist himself, published Finger Prints. Galton established that fingerprints are unique and don’t change over a person’s lifetime, and he proposed a classification system. In 1901, Scotland Yard founded its Fingerprint Bureau, based largely on Galton’s system. Although Faulds had suggested a similar system to Scotland Yard years earlier, Galton and Herschel took credit for the innovation. Infuriated, Faulds instigated a public battle of letters with Herschel that would last until his rival’s death in 1917.
Regardless of who originally envisioned fingerprints as a forensic tool, the practice took off. In 1902, fingerprints were first used as evidence in a British court to identify a burglar who had stolen some billiard balls. And 1902 was also the year that fingerprints were first systematically employed in the United States, when the New York Civil Service Commission began fingerprinting applicants to prevent them from cheating on tests.
Although fingerprinting may recall the Sherlock Holmesian era during which it was created, new tools have brought the system into the digital age. Today, the FBI’s fingerprint system contains more than 40 million people’s fingerprints. A suspect’s prints can be identified within two hours; just a few years ago, the process could take weeks.