
Researchers demonstrate that it is possible to bypass fingerprint readers with master fingerprints



Master fingerprints are real or synthetic fingerprints that coincidentally match a large number of fingerprints from real people. In this paper, a team of researchers from New York University (NYU) and Michigan State University (MSU) created master fingerprint images using a method known as Latent Variable Evolution, built on machine learning. These fingerprints, called "DeepMasterPrints," achieved roughly 20% effectiveness and make it possible to imitate fingerprints enrolled in recognition systems, which can then be exploited through a "dictionary attack."

In a paper presented at a biometrics security conference (BTAS 2018), the researchers explain that two observations went into creating DeepMasterPrints. First, for ergonomic reasons, fingerprint sensors are often very small (as on smartphones), so they work with only a portion of the user's fingerprint. Since identifying a person from small portions of a fingerprint is harder than reading a complete fingerprint, the probability that a partial print from one finger will incorrectly match a partial print from a different finger is relatively high. Researcher Aditi Roy took this into account and introduced the concept of MasterPrints: a set of real or synthetic fingerprints that happen to match a large number of other fingerprints.
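To get a feel for why small partial prints help a dictionary attack, consider a back-of-the-envelope model (an illustrative assumption, not a figure from the paper): if a single synthetic partial print falsely matches a randomly chosen enrolled finger with probability p, then a dictionary of k such prints matches with probability 1 - (1 - p)^k, assuming independence.

```python
# Back-of-the-envelope model (illustrative only, numbers are not from the paper):
# one synthetic partial print falsely matches a random enrolled finger with
# probability p; a dictionary of k independent prints matches with probability
# 1 - (1 - p)**k.

def dictionary_match_probability(p: float, k: int) -> float:
    """Probability that at least one of k master prints matches."""
    return 1.0 - (1.0 - p) ** k

if __name__ == "__main__":
    # Hypothetical per-print false-match rate of 1% against a small partial-print sensor.
    p = 0.01
    for k in (1, 5, 10, 20):
        print(f"dictionary size {k:2d}: match probability ~{dictionary_match_probability(p, k):.2%}")
```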

The second observation is that many fingerprints share characteristics with one another, which means that a fake fingerprint containing many of these common features has a better chance of matching other fingerprints.
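A toy sketch of that second observation (purely illustrative, with made-up data rather than real fingerprint features): if enrolled templates are modeled as sets of features drawn from a skewed distribution, a fake template assembled from the most frequent features matches more templates than a random one.

```python
import random
from collections import Counter

random.seed(0)

NUM_FEATURES = 200      # hypothetical feature vocabulary (e.g. minutiae-like descriptors)
FEATURES_PER_PRINT = 20
REQUIRED_OVERLAP = 6    # features in common needed to declare a "match" (arbitrary threshold)

# Skewed popularity: low-index features are much more common across fingers.
weights = [1.0 / (i + 1) for i in range(NUM_FEATURES)]

def random_print():
    return set(random.choices(range(NUM_FEATURES), weights=weights, k=FEATURES_PER_PRINT))

enrolled = [random_print() for _ in range(1000)]

# "Master" template built from the globally most frequent features.
counts = Counter(f for prt in enrolled for f in prt)
master = {f for f, _ in counts.most_common(FEATURES_PER_PRINT)}

def matches(template):
    return sum(len(template & prt) >= REQUIRED_OVERLAP for prt in enrolled)

print("random fake print matches:", matches(random_print()))
print("common-feature master matches:", matches(master))
```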

From there, the researchers used a type of artificial intelligence algorithm called a generative adversarial network (GAN) to artificially create new fingerprints that match as many partial fingerprints as possible. In this way, they were able to build a library of artificial fingerprints that act as master keys for a given biometric identification system. Moreover, the attack does not require a fingerprint sample from a specific individual: it can be run against anonymous subjects and still have some chance of success.
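The paper's technique, Latent Variable Evolution, amounts to an evolutionary search over the latent space of a GAN trained on fingerprint images, scoring each candidate by how many enrolled partial prints it matches. The sketch below only mimics that loop under toy assumptions: the generator and matcher are stand-in stubs rather than the authors' trained GAN and verifier, and a simple hill-climbing loop replaces the more sophisticated evolution strategy used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 32          # size of the GAN latent vector (placeholder value)
NUM_ENROLLED = 500       # number of enrolled partial prints in the toy "database"

# --- Stand-ins for the real components -------------------------------------
# In the actual work the generator is a GAN trained on fingerprint images and
# the matcher is a fingerprint verifier; here both are toy stubs.

enrolled_templates = rng.normal(size=(NUM_ENROLLED, LATENT_DIM))

def generate_fingerprint(latent):
    """Stub generator: pretend the 'fingerprint' is a feature vector derived from z."""
    return np.tanh(latent)

def count_matches(candidate, threshold=0.5):
    """Stub matcher: count enrolled templates whose cosine similarity exceeds a threshold."""
    sims = enrolled_templates @ candidate / (
        np.linalg.norm(enrolled_templates, axis=1) * np.linalg.norm(candidate) + 1e-9
    )
    return int((sims > threshold).sum())

# --- Simple (1 + lambda) evolutionary search over the latent space ----------
def evolve_master_print(generations=200, population=32, sigma=0.3):
    best_z = rng.normal(size=LATENT_DIM)
    best_score = count_matches(generate_fingerprint(best_z))
    for _ in range(generations):
        offspring = best_z + sigma * rng.normal(size=(population, LATENT_DIM))
        scores = [count_matches(generate_fingerprint(z)) for z in offspring]
        i = int(np.argmax(scores))
        if scores[i] >= best_score:
            best_z, best_score = offspring[i], scores[i]
    return best_z, best_score

if __name__ == "__main__":
    z, score = evolve_master_print()
    print(f"best latent vector matches {score} of {NUM_ENROLLED} toy templates")
```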

Although it would be very difficult for an attacker to use something like DeepMasterPrints in practice, because a great deal of work is needed to tune the artificial intelligence to a specific system and every system is different, it is an example of what may become possible over time and something to be aware of. Something similar was seen this year at the Black Hat security conference, when IBM researchers demonstrated through a proof of concept that it is possible to develop malware that uses artificial intelligence to carry out attacks based on facial recognition.


Source link