Artificial intelligence researchers used a neural network to create fake fingerprints that could be a hacker’s dream tool.
Five researchers, led by Philip Bontrager of the New York University engineering school, developed what they have called “DeepMasterPrints.” The Guardian reported that the research was presented at a biometrics conference in Los Angeles in October. As the Guardian points out, their report, published last month, explains how the fake prints they generated could replicate more than one in five real fingerprints in a biometric identification system.
The paper suggests this technique could enable something akin to a “dictionary attack”: instead of software that runs millions of popular passwords through a system, a DeepMasterPrints-inspired tool could run a handful of fake fingerprints through a system to see whether any of them match any accounts.
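The logic of such an attack can be sketched in a few lines. This is a toy illustration, not the authors’ code: the bit-vector “prints” and the `similarity` matcher are invented for the example (real systems compare minutiae features), but the loop structure mirrors the dictionary-attack idea described above.

```python
# Toy sketch of a fingerprint "dictionary attack": try a small set of
# synthetic "master prints" against every enrolled account, the way a
# password dictionary attack tries popular passwords against every account.

def similarity(print_a, print_b):
    # Toy matcher: fraction of positions where two feature vectors agree.
    # A real system would compare minutiae points, not bit vectors.
    matches = sum(1 for a, b in zip(print_a, print_b) if a == b)
    return matches / len(print_a)

def dictionary_attack(master_prints, enrolled, threshold=0.8):
    # For each account, try each synthetic print; record accounts whose
    # enrolled print is matched above the system's acceptance threshold.
    cracked = {}
    for account, real_print in enrolled.items():
        for fake in master_prints:
            if similarity(fake, real_print) >= threshold:
                cracked[account] = fake
                break
    return cracked

enrolled = {
    "alice": [1, 0, 1, 1, 0, 1, 0, 0],
    "bob":   [0, 0, 1, 0, 1, 1, 1, 0],
}
masters = [[1, 0, 1, 1, 0, 1, 0, 1]]  # one fake print, close to alice's

print(dictionary_attack(masters, enrolled))  # matches alice, not bob
```

A handful of well-chosen fakes is enough here because the attacker only needs *some* account to match, not a specific one.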
Screenshot: Philip Bontrager, Aditi Roy, Julian Togelius, Nasir Memon, Arun Ross (DeepMasterPrints: Generating MasterPrints for Dictionary Attacks via Latent Variable Evolution)
The key to their research is that many fingerprint scanners read only a portion of a print, and some portions of a fingertip have more features in common across different people than others.
So when the researchers created new prints by feeding a set of real fingerprints into a generative adversarial network, they only needed to generate prints that matched certain portions of other fingerprints—the portions that tend to have commonalities.
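The partial-scan weakness can be illustrated with a toy comparison (again, invented feature vectors and a hypothetical `window_match` function, not the authors’ method): a fake print that fails a full-print comparison can still pass when the sensor reads only a small, commonly shared region.

```python
# Toy illustration of why partial-print sensors are easier to fool:
# only the window the sensor actually reads needs to match.

def window_match(fake, real, window, threshold=0.9):
    # Compare only the slice of features the sensor reads.
    fa, ra = fake[window], real[window]
    agree = sum(1 for a, b in zip(fa, ra) if a == b)
    return agree / len(fa) >= threshold

full_real = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
fake      = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]  # agrees only in the first half

# A full-print comparison rejects the fake, but a sensor reading just
# the first five features accepts it.
print(window_match(fake, full_real, slice(0, 10)))  # False
print(window_match(fake, full_real, slice(0, 5)))   # True
```

A GAN optimized to reproduce only those commonly shared regions therefore gets many “free” matches across a population of enrolled prints.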
It’s unlikely someone could use such a technique to break into your phone (as one report suggests). “A similar setup to ours could be used for nefarious purposes, but it would likely not have the success rate we reported unless they optimized it for a smartphone system,” Bontrager told Gizmodo. “This would take a lot of work to try and reverse engineer a system like that.”
But if a hacker gained access to a system with many fingerprint-protected accounts, they’d have a good shot at cracking a few of them.
Bontrager and his team want their research to inspire companies to step up fingerprint-security efforts. “Without verifying that a biometric comes from a real person, a lot of these adversarial attacks become possible,” Bontrager said. “The real hope of work like this is to push toward liveness detection in biometric sensor.” [The Guardian]
Featured image: Leon Neal (Getty)