• We present a first benchmark of HuMIdb, a publicly available HCI database.
• We evaluate touchscreen and background sensor data for mobile passive authentication.
• We train a recurrent neural network with triplet loss for each individual modality.
• A multimodal system is achieved through the fusion of modalities at score level.
• Our results show the suitability of behavioral biometrics for mobile authentication.

Current mobile user authentication systems based on PIN codes, fingerprint, and face recognition have several shortcomings. Such limitations have been addressed in the literature by exploring the feasibility of passive authentication on mobile devices through behavioral biometrics. In this line of research, this work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits acquired while subjects perform different activities on the phone, such as typing, scrolling, drawing a number, and tapping on the screen. We consider both the touchscreen data and the simultaneous background sensor data (accelerometer, gravity sensor, gyroscope, linear accelerometer, and magnetometer). Our experiments are performed over HuMIdb (https://github.com/BiDAlab/HuMIdb), one of the largest and most comprehensive freely available mobile user interaction databases to date. A separate Recurrent Neural Network (RNN) with triplet loss is implemented for each single modality. The different modalities are then combined through a weighted fusion at score level. In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke in a fixed-text scenario. In all cases, the fusion of modalities is very beneficial, leading to Equal Error Rates (EER) ranging from 4% to 9%, depending on the modality combination, over a 3-second interval.
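To make the per-modality approach concrete, the following is a minimal PyTorch sketch of an RNN trained with triplet loss, of the kind described above. The class and parameter names (ModalityRNN, feat_dim, hidden_dim, emb_dim) and all dimensions are our own assumptions for illustration; the abstract does not specify the RNN cell type or architecture used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityRNN(nn.Module):
    # Hypothetical per-modality network: an LSTM over a sequence of
    # sensor/touch feature vectors, mapped to a fixed-size embedding.
    def __init__(self, feat_dim: int, hidden_dim: int = 64, emb_dim: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, emb_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, feat_dim); the last hidden state summarizes the sequence
        _, (h, _) = self.lstm(x)
        return F.normalize(self.head(h[-1]), dim=1)  # unit-norm embedding

# One training step on (anchor, positive, negative) triplets: anchor and
# positive come from the same user, the negative from a different user.
model = ModalityRNN(feat_dim=3)                 # e.g. a 3-axis sensor (assumed)
criterion = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

a, p, n = (torch.randn(8, 100, 3) for _ in range(3))  # dummy 8-triplet batch
loss = criterion(model(a), model(p), model(n))
loss.backward()
optimizer.step()
```

Normalizing the embeddings to unit length is a common design choice in triplet training, since it bounds the embedding space and makes the margin comparable across modalities; whether the authors do so is not stated in the abstract.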
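The weighted score-level fusion and the EER metric reported above can likewise be sketched in a few lines. The helper names (fuse_scores, equal_error_rate) and the weights in the example are hypothetical; the abstract does not give the actual per-modality weights.

```python
import numpy as np
from sklearn.metrics import roc_curve

def fuse_scores(scores: np.ndarray, weights) -> np.ndarray:
    # scores: (n_comparisons, n_modalities) matrix of per-modality match
    # scores; returns their normalized weighted sum.
    w = np.asarray(weights, dtype=float)
    return scores @ (w / w.sum())

def equal_error_rate(labels: np.ndarray, scores: np.ndarray) -> float:
    # EER: the ROC operating point where the false acceptance rate (FAR)
    # equals the false rejection rate (FRR).
    far, tar, _ = roc_curve(labels, scores)
    frr = 1.0 - tar
    i = np.nanargmin(np.abs(frr - far))
    return float((far[i] + frr[i]) / 2.0)

# Dummy example: 100 genuine (1) and 100 impostor (0) comparisons with
# synthetic scores from two modalities, fused with illustrative weights.
rng = np.random.default_rng(0)
labels = np.r_[np.ones(100), np.zeros(100)]
scores = np.c_[rng.normal(labels, 1.0), rng.normal(labels, 1.3)]
print(f"EER = {equal_error_rate(labels, fuse_scores(scores, [0.6, 0.4])):.3f}")
```

In practice, fusion weights of this kind are usually tuned on a validation set so that more discriminative modalities (here, the magnetometer among background sensors and fixed-text keystroke among touch tasks) receive larger weights.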