Biometrics and Artificial Neural Networks: How Big Data Collection Works in Your Favor



Big Data has been a recent industry buzzword, shifting the business models of companies like IBM, SAS, and Credit Suisse. McKinsey & Company calls it “the next frontier for innovation, competition, and productivity.” New data aggregation methods may come at a price, however. According to Atlantic Council Senior Fellow Banning Garrett, Big Data may “compromise our privacy and security.” As the federal Privacy and Civil Liberties Oversight Board report on data-gathering programs has shown, there is indeed room for discussion on what Big Data means for the everyday citizen.

Murdoch University researchers Mark Abernethy and Shri Rai have turned speculation into experimentation, and their results suggest that data gathered about you could actually help keep you safer. Using biometrics and artificial neural networks, they offer a glimpse of the future of security policy through an experiment in which individuals are identified using large data caches they themselves created.

In their experimental study “Complementary Feature Level Data Fusion for Biometric Authentication Using Neural Networks,” presented at the December 2013 Australian Information Warfare and Security Conference, Abernethy & Rai demonstrate how physical features of individuals, such as facial structure, the iris, fingerprints, and even the way a keyboard is struck, are matched against stored templates by Artificial Neural Networks (ANNs). By putting an individual’s features up against the ANN “gatekeeper,” entrance systems can probabilistically determine whether the people presenting themselves at the security terminal are who they claim to be.
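The “probabilistic” part simply means the network produces a confidence score that is compared against a threshold before access is granted. The sketch below illustrates that decision rule in Python; the threshold value is an assumed policy parameter, not one taken from the study.

```python
def gatekeeper_decision(match_score, threshold=0.95):
    """match_score: the ANN's estimated probability that the claimant
    matches the stored template. The threshold is an assumed policy value."""
    return "access granted" if match_score >= threshold else "access denied"

print(gatekeeper_decision(0.98))  # access granted
print(gatekeeper_decision(0.60))  # access denied
```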

The authors’ aim “was to investigate the possible accuracy improvement of feature level data fusion.” While still maturing from the early stages of its interdisciplinary origins, Multi-Sensor Data Fusion has “many application areas in military and security as well as civilian domains,” according to the study. Feature level fusion, as opposed to other fusion levels in multi-modal biometric systems (see Ross & Govindarajan 2005), requires selection of salient data from independent sources that “best represent the entity and provide recognition accuracy.”
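Feature-level fusion can be pictured as combining the salient feature vectors from each independent source into a single vector before any classifier sees them. The following is a minimal sketch of that idea, assuming hypothetical feature dimensions and a simple normalize-then-concatenate scheme; it is an illustration of the general concept, not the authors’ actual pipeline.

```python
import numpy as np

def fuse_features(fingerprint_vec, keystroke_vec):
    """Feature-level fusion: normalize each modality's feature vector,
    then concatenate them into one combined vector for the classifier."""
    fp = np.asarray(fingerprint_vec, dtype=float)
    ks = np.asarray(keystroke_vec, dtype=float)
    # Min-max normalize each modality separately so neither dominates by scale.
    fp = (fp - fp.min()) / (fp.max() - fp.min() + 1e-9)
    ks = (ks - ks.min()) / (ks.max() - ks.min() + 1e-9)
    return np.concatenate([fp, ks])

# Hypothetical feature vectors: 64 fingerprint features, 20 keystroke features.
fused = fuse_features(np.random.rand(64), np.random.rand(20))
print(fused.shape)  # (84,) -- one combined vector per authentication attempt
```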

The innovation of their design lies in the blend of the input data. In their words, “although keystroke dynamics is a weak behavioural biometric characteristic, and fingerprint recognition is a strong physiological biometric characteristic, the objective was to demonstrate the power of the fusion process.” Thus, by taking two biometrics with significant differences in predictive power, the fusion process generated authentication measurements that are both useful for training ANNs and scalable. In policy terms, an employee can manage access to sensitive information by repeating their own keystroke behavior and authenticating their fingerprints at terminals, using technology common to many mobile phones.
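To make the “weak” behavioural modality concrete: keystroke dynamics are typically reduced to timing measurements, such as how long each key is held (dwell time) and the gap between releasing one key and pressing the next (flight time). The snippet below is a hedged sketch of that kind of feature extraction; the event format and field layout are assumptions for illustration, not details taken from the study.

```python
def keystroke_features(events):
    """Turn a typing sample into a numeric feature vector.
    `events` is a list of (key, press_time, release_time) tuples in seconds."""
    dwell = [release - press for _, press, release in events]                   # hold time per key
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]  # gap between keys
    return dwell + flight

sample = [("p", 0.00, 0.09), ("a", 0.21, 0.29), ("s", 0.40, 0.47), ("s", 0.58, 0.66)]
print(keystroke_features(sample))
```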

The ANNs represent a crucial piece of the security puzzle. After choosing a fraction of the 90 test subjects to provide all fingerprints and extensive keystroke information, Abernethy & Rai were able to “teach” the ANNs. All remaining test subjects then provided a set of biometrics to act as a “password” to be supplied when requesting access to the secure server. The artificial neural network then drew on its stored learning when presented with an access attempt. Here, access was based on a simple test: “yes,” the individual is who they claim to be, or “no,” they are not. When entrance candidates submitted their “password” to the portal, the ANNs evaluated them against their past learning, which may or may not have included biometric data on the entrants. Stated in conventional terms, ANNs act like security guards: the guard first studies a subset of a population sharing a unique trait that varies slightly across individuals, allowing it to subsequently recognize those who possess the trait but do not belong to the training subset.
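A rough way to picture the “teach then test” procedure is a small feed-forward network trained on fused feature vectors labelled as genuine or impostor, which then answers yes or no for each new attempt. The sketch below uses scikit-learn’s MLPClassifier on randomly generated stand-in data purely to show the shape of that workflow; the authors’ actual network architecture, training regime, and dataset are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical fused feature vectors (84 values each): label 1 = genuine claimant,
# label 0 = impostor. Real data would come from fingerprint + keystroke samples.
X_train = rng.normal(size=(200, 84))
y_train = rng.integers(0, 2, size=200)

# A small feed-forward ANN acting as the "gatekeeper".
gatekeeper = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
gatekeeper.fit(X_train, y_train)

# A new access attempt: the network answers "yes" (1) or "no" (0).
attempt = rng.normal(size=(1, 84))
print("access granted" if gatekeeper.predict(attempt)[0] == 1 else "access denied")
```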

How the ANNs fused information is what distinguishes this approach from simple cooperation algorithms. Building on a very small set of past research designs, Abernethy & Rai constructed a teaching regimen for the ANNs that resulted in a FAR (False Acceptance Rate, the rate at which impostors are wrongly admitted) of 0.0 and an FRR (False Rejection Rate, the rate at which legitimate users are wrongly turned away) of 0.0004. Translated for the practical security consumer, the way they trained the ANNs on individuals’ fingerprints and keystroke dynamics meant the gatekeeper never mistakenly admitted an individual who wasn’t cleared to pass the terminal and rejected only four in 10,000 who were cleared. In other words, while keeping inappropriate access at zero, the system erred only slightly, and always on the side of caution.
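For readers who prefer the error rates in computational terms: FAR is the fraction of impostor attempts that are wrongly accepted, and FRR is the fraction of genuine attempts that are wrongly rejected. The short sketch below computes both from a list of labelled decisions; the toy data are illustrative only and are not the study’s results.

```python
def far_frr(true_labels, decisions):
    """true_labels: 1 = genuine claimant, 0 = impostor.
    decisions:   1 = accepted,          0 = rejected."""
    impostors = [d for t, d in zip(true_labels, decisions) if t == 0]
    genuines = [d for t, d in zip(true_labels, decisions) if t == 1]
    far = sum(impostors) / len(impostors)               # impostors wrongly accepted
    frr = sum(1 - d for d in genuines) / len(genuines)  # genuine users wrongly rejected
    return far, frr

# Toy example: 4 impostors all rejected, 5 genuine users with one wrongly rejected.
far, frr = far_frr([0, 0, 0, 0, 1, 1, 1, 1, 1], [0, 0, 0, 0, 1, 1, 1, 1, 0])
print(far, frr)  # 0.0 0.2
```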

“Data fusion,” Abernethy & Rai observe, “had been primarily the province of military research.” Civilian researchers sharing an enthusiasm for data-fusion techniques for security applications have undertaken studies using measurement variables such as handprints, facial thermography, and geometry, but such studies remain limited. Abernethy & Rai’s study has produced the most robust results to date in simple probability terms: no false acceptances and only a negligible rate of false rejections.

In tangible policy language, Abernethy & Rai have shown that easily obtained identification measurements such as inked fingerprints and keystroke histories – features of candidates for security entrance protocols available to local law enforcement agencies and national government institutions alike – can be fused into powerful gatekeeping techniques. In light of this, security policy practitioners will likely soon take greater notice of data fusion applications, integrating ANN-based testing of physical attributes and going beyond traditional encryption methods.

The research reviewed in this article has not yet been peer-reviewed. It was presented in December 2013 at the annual Australian Information Warfare and Security Conference.

Feature Photo: cc/(The U.S. Army)
