BehaveSense: Continuous authentication for security-sensitive mobile apps using behavioral biometrics

Abstract

With smartphones becoming an essential part of our daily lives, continuous authentication has become an urgent need for protecting user security and privacy. However, only a small percentage of apps contain sensitive data. To save energy while protecting user security, we propose BehaveSense, an accurate and efficient continuous authentication method for security-sensitive mobile apps based on touch-based behavioral biometrics. Exploring four different types of touch operations, we train an owner model using one-class SVM (OCSVM) and isolation forest (iForest), and calculate the accuracy of each operation type with the model. Afterwards, we calculate the confidence level of each type using Bayes' theorem. Finally, we obtain the accuracy of a touch operation sequence with an improved expectedprob algorithm. To validate the effectiveness of the proposed method, we conduct a series of experiments on WeChat data collected from 45 volunteers over two weeks. Experimental results show that our method can recognize user identity efficiently: it achieves an average accuracy of 95.85% for touch operation sequences when 9 touch operations are considered jointly. Our method is thus very promising for user authentication.

Introduction

With the development of the mobile Internet, more and more apps are deployed on smartphones [1], some of which (e.g., PayPal, WeChat, and Alipay) contain relatively sensitive data, such as chat logs and payment information. Taking WeChat as an example, in 2017 more than 384 million WeChat users spent over 90 minutes per day on chatting and payment [2]. Therefore, how to authenticate and validate the identity of smartphone users when they use these apps has become an urgent need.

At present, the most common authentication mechanism is for a device to lock itself after a few minutes of inactivity and prompt for a PIN/password/pattern on the touch screen when reactivated [1]. However, this mechanism faces three major challenges. First, the PIN/password/pattern is susceptible to shoulder surfing attacks [3]; for example, an impostor can extract the PIN from the touch screen using thermal imaging [3]. Second, the mechanism can hardly resist smudge attacks: attackers can extract sensitive information from the smudges (oily residues) left by fingers on the touch screen [4]. Third, guaranteeing user authenticity requires more frequent verification, but frequent verification with traditional mechanisms is too obtrusive and inconvenient. Therefore, a promising authentication method should run continuously at runtime without interrupting the user.

Recently, both industry and the academic community have proposed that touch-based gestures can be used to uniquely identify an individual [5]. GEST [1] is a typical method based on user touch operations. It is resistant to both shoulder surfing and smudge attacks, but it cannot provide continuous authentication at runtime [6]. To address this limitation, Chauhan et al. [8] proposed a gesture-based authentication method that provides accurate authentication but requires specialized hardware. Later, several continuous authentication methods [5], [8], [9], [10], [11] were proposed that neither require extra hardware nor demand user attention. However, most existing methods assume that the touch operation patterns of all impostors can be acquired in advance, which is impractical; it is therefore illogical to validate user operations with either a binary or a multiclass classifier. In addition, these methods ignore the fact that both the usage frequency and the sensitivity of apps follow the 80/20 rule, which states that roughly 80% of the sensitive information on smartphones comes from 20% of the apps [7].

In this paper, we propose a continuous authentication method for security-sensitive apps, called BehaveSense. When the user uses a certain app (e.g., WeChat), BehaveSense can authenticate and verify her identity based on touch operations. Specifically, touch-operation-related features (e.g., screen pressure intensity) are collected while the user is using her smartphone; these features are unique and distinguishable among different users [5], and can therefore be used for continuous verification at runtime without disturbing the user. Moreover, the collection of such parameters is completely transparent to the user and does not require any special hardware.

However, fully exploiting the value of touch operation parameters for efficient user verification is quite challenging. First, user operations vary across scenarios, which may lead to unstable verification and a high probability of rejecting the owner. Second, it is impossible to know in advance who will use a mobile device, i.e., it is impractical to obtain an impostor's touch operation characteristics beforehand. Third, even with stable accuracy on single touch operations, obtaining the accuracy of an operation sequence is not easy, because a sequence admits many possible combinations of operation types and the accuracy of each type differs.

To address the above challenges, we propose a continuous authentication method with high accuracy. More specifically, to enable stable verification, we choose four types of touch operations that are common and frequent, and extract features and build a model for each type. To address the second challenge, BehaveSense first builds an anomaly detection model of the owner from her touch operations, and then applies this model to runtime touch operations for verification. To calculate the recognition rate of an operation sequence, BehaveSense derives the confidence level of each operation type using Bayes' theorem and computes the accuracy of a touch operation sequence with an improved expectedprob algorithm [12]. To evaluate BehaveSense, we conduct experiments on a real-world dataset of operation data from 45 volunteers who used WeChat over two weeks. The optimal results, obtained with OCSVM [13], [14], show that BehaveSense can identify owner operations with high accuracy.
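The anomaly-detection step described above can be sketched with a one-class SVM trained only on the owner's samples, so no impostor data is needed at training time. This is a minimal illustration using scikit-learn; the feature names, values, and `nu`/`gamma` settings are assumptions for the sketch, not the paper's exact configuration.

```python
# Sketch: train a one-class classifier on the owner's touch-operation
# feature vectors only, then label runtime operations +1 (owner-like)
# or -1 (anomalous).  Feature columns here are illustrative.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)

# Synthetic owner data: [duration (s), mean pressure, swipe length (px)]
owner_ops = rng.normal(loc=[0.15, 0.55, 120.0],
                       scale=[0.03, 0.05, 15.0],
                       size=(200, 3))

# nu bounds the fraction of owner samples allowed outside the boundary.
model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(owner_ops)

typical = np.array([[0.15, 0.55, 120.0]])   # near the owner's profile
atypical = np.array([[0.80, 0.10, 900.0]])  # far from the owner's profile
print(model.predict(typical))   # usually [1]  -> accept
print(model.predict(atypical))  # usually [-1] -> reject
```

Because only the owner's data is required, this sidesteps the impractical assumption, criticized above, that impostor patterns are available in advance.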

The contributions of this paper are summarized as follows:

We propose a continuous authentication method based on touch operations for security-sensitive apps, which can accurately verify user identity without requiring user attention or specialized hardware.

We build an owner model based on OCSVM, which is used to calculate the accuracy of each type of operation. We then employ Bayes' theorem to calculate the confidence level of each type of touch operation. Finally, we propose a novel strategy to verify user identity from a sequence of operations using an improved expectedprob algorithm.

We conduct a series of experiments whose results show that BehaveSense can accurately verify user identity, and that authentication performance improves as the length of the touch operation sequence increases. Specifically, when 9 consecutive operations are considered jointly, the average authentication accuracy reaches 95.85%, which satisfies practical needs.

The rest of this paper is organized as follows. Section 2 reviews related work and Section 3 presents the framework of BehaveSense. Section 4 describes the BehaveSense system in detail, followed by the experimental results in Section 5. We conclude the paper and discuss future work in Section 6.


Related work

Traditional authentication mechanisms (e.g., PIN/pattern passwords, fingerprints) have been studied extensively for years. These mechanisms make it hard for unauthorized users to access data on mobile devices. However, about 40% of users actually secure their phones with 4-digit PIN codes, which are usually memorable dates or repeated digits [15]. Given such easy-to-find and easy-to-guess PINs, an impostor could access 9.23% of mobile devices in fewer than three attempts [15]. In

Framework

Fig. 1 gives an overview of the framework of the proposed BehaveSense system, which consists of a training phase and a testing phase. In the training phase, we first select and preprocess the owner's touch operations and then extract features for each operation type, based on which we train a model of the owner using OCSVM [13], [14] and isolation forest (iForest) [21]. Specifically, we mainly consider common and frequent touch operations: clicking operation (Co), vertical sliding operation
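The second detector named in the training phase, iForest, can be sketched the same way: it isolates samples with random axis-aligned splits, and points that are isolated in few splits score as anomalous. The feature values and the `contamination` setting below are assumptions for illustration, not the paper's tuned parameters.

```python
# Sketch: an isolation forest trained on the same owner-only features.
# Short average isolation paths -> anomalous (-1); long paths -> owner-like (+1).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
owner_ops = rng.normal(loc=[0.15, 0.55, 120.0],
                       scale=[0.03, 0.05, 15.0],
                       size=(300, 3))

iforest = IsolationForest(n_estimators=100, contamination=0.05,
                          random_state=0).fit(owner_ops)

print(iforest.predict([[0.15, 0.55, 120.0]]))  # typically [ 1] -> owner-like
print(iforest.predict([[0.90, 0.05, 800.0]]))  # typically [-1] -> anomalous
```

Training a separate such model per operation type (Co, vertical sliding, etc.), as the framework describes, lets each type's accuracy be measured independently in the testing phase.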

Touch operation

This paper authenticates and validates user identity based on screen touch operations, which are commonly divided into clicking operations and sliding operations; sliding operations in turn consist of single-sliding and multi-sliding operations [22]. Multi-sliding operations are infrequent and highly variable. Thus, to validate user identity accurately and quickly, we mainly focus on common and frequent touch operations, which include clicking operations and
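The click/slide distinction above can be made concrete with a small preprocessing sketch that labels a raw touch trace and derives simple per-operation features. The distance threshold, sample format, and feature names are assumptions for this sketch, not the paper's exact definitions.

```python
# Illustrative preprocessing: classify one raw touch trace as a clicking
# or sliding operation and summarize it into a feature dictionary.
import math

def summarize_operation(trace, click_dist_px=10.0):
    """trace: list of (t, x, y, pressure) samples for one touch.
    Short displacement -> click; otherwise slide, split by dominant axis."""
    (t0, x0, y0, _), (t1, x1, y1, _) = trace[0], trace[-1]
    dx, dy = x1 - x0, y1 - y0
    dist = math.hypot(dx, dy)
    op_type = "click" if dist < click_dist_px else (
        "vertical_slide" if abs(dy) >= abs(dx) else "horizontal_slide")
    return {
        "type": op_type,
        "duration": t1 - t0,
        "distance": dist,
        "mean_pressure": sum(p for _, _, _, p in trace) / len(trace),
    }

tap = [(0.00, 100, 200, 0.5), (0.08, 101, 201, 0.6)]
swipe = [(0.00, 100, 400, 0.4), (0.10, 102, 300, 0.5), (0.20, 104, 150, 0.4)]
print(summarize_operation(tap)["type"])    # click
print(summarize_operation(swipe)["type"])  # vertical_slide
```

A real pipeline would compute richer per-type features (pressure profile, velocity, curvature), but this shows how each touch trace maps to one feature vector of a known operation type.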

Evaluation

In this section, we evaluate the performance of BehaveSense on a real-world dataset. First, we collect touch operations while users are using WeChat. Second, we extract feature vectors for each operation type and compute their accuracy with the anomaly detection method. Third, we obtain the confidence level of each type based on Bayes' theorem. Finally, we compute the accuracy of different operation sequences with the improved expectedprob algorithm and evaluate the performance of our
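The confidence and sequence steps can be illustrated as follows. The per-type acceptance rates below are invented numbers, and the sequence aggregation is a plain expected-probability average standing in for the paper's improved expectedprob algorithm, whose exact form is not reproduced here; only the Bayes-rule conversion is standard.

```python
# Hypothetical sketch: per-type posterior confidence via Bayes' rule,
# then an expected-probability average over an operation sequence.

def posterior_owner(tpr, fpr, prior_owner=0.5):
    """P(owner | accepted), given the model's acceptance rate for the
    owner (tpr) and for impostors (fpr), with an assumed prior."""
    p_accept = tpr * prior_owner + fpr * (1.0 - prior_owner)
    return tpr * prior_owner / p_accept

# Assumed (tpr, fpr) for the four operation types -- illustrative only.
rates = {"click": (0.93, 0.08), "vslide": (0.96, 0.05),
         "hslide": (0.95, 0.06), "multi": (0.90, 0.12)}

def sequence_confidence(ops):
    """Average posterior confidence over a sequence of operation types."""
    confs = [posterior_owner(*rates[op]) for op in ops]
    return sum(confs) / len(confs)

seq = ["click", "vslide", "click", "hslide", "vslide",
       "click", "vslide", "hslide", "click"]  # a 9-operation sequence
print(round(sequence_confidence(seq), 3))
```

The point of aggregating is visible even in this toy version: a single operation type may give only ~92-95% confidence, while pooling several operations stabilizes the sequence-level decision, matching the paper's observation that performance improves with sequence length.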

Conclusion

In this paper, we propose an implicit and continuous authentication method that explores touch operations on smartphones. In particular, to save energy, the authentication process is triggered only when certain key apps (e.g., WeChat) are used.

Human behavior recognition has been considered a core technology for authenticating user identity [29]. The authentication method therefore proceeds passively, based on the user's normal touch operations while she uses the security-sensitive app. We

Acknowledgment

This work was partially supported by the National Key R&D Program of China (2017YFB1001800) and the National Natural Science Foundation of China (no. 61772428, 61725205).

Yafang Yang is a Ph.D. candidate from Northwestern Polytechnical University, China. Her research interests include ubiquitous computing and mobile sensing.

References (29)

  • et al., An adaptive decision-making method with fuzzy Bayesian reinforcement learning for robot soccer, Inf. Sci. (2018)
  • L.A. Maglaras et al., Combining ensemble methods and social network metrics for improving accuracy of OCSVM on intrusion detection in SCADA systems, J. Inf. Secur. Appl. (2016)
  • M. Markou et al., Novelty detection: a review—Part 1: statistical approaches, Signal Process. (2003)
  • M. Shahzad et al., Secure unlocking of mobile touch screen devices by simple gestures - you can see it but you can not do it
  • C. Custer, WeChat blasts past 700 million monthly active users, tops China's most popular apps, Tech in Asia (2016)
  • Mobitiantian, German scientists break out the locking screen password by thermal technique....
  • M.D. Amruth et al., Android Smudge Attack Prevention Techniques, Intelligent Systems Technologies and Applications (2016)
  • C. Shen et al., Performance analysis of touch interaction behavior for active smartphone authentication, IEEE Trans. Inf. Forensics Secur. (2016)
  • M. Frank et al., Touchalytics: on the applicability of touchscreen input as a behavioral biometric for continuous authentication, IEEE Trans. Inf. Forensics Secur. (2013)
  • N. Bunkley, Joseph Juran, 103, pioneer in quality control, dies, New York Times (2008)
  • J. Chauhan et al., Gesture-based continuous authentication for wearable devices: the Google Glass case
  • H. Xu et al., Towards continuous and passive authentication via touch biometrics: an experimental study on smartphones
  • Z. Ali et al., At your fingertips: considering finger distinctness in continuous touch-based authentication for mobile devices
  • Z. Sitova, J. Sedenka, Q. Yang, et al., HMOG: a new biometric modality for continuous authentication of smartphone...

Bin Guo is a professor from Northwestern Polytechnical University, China. He received his Ph.D. degree in computer science from Keio University, Japan in 2009 and then was a post-doc researcher at Institut TELECOM SudParis in France. His research interests include ubiquitous computing and mobile crowd sensing.

Zhu Wang is an associate professor from Northwestern Polytechnical University, China. He received his B.Sc., M.Sc. and Ph.D. degrees in Computer Science and Technology from the same university, in 2006, 2009 and 2013, respectively. His research interests include pervasive computing, social network analysis, and healthcare.

Mingyang Li is currently a master student at Northwestern Polytechnical University, China. His research interest is mobile crowd sensing.

Zhiwen Yu is a professor from Northwestern Polytechnical University, China. He worked as an Alexander von Humboldt Fellow at Mannheim University, Germany from Nov. 2009 to Oct. 2010, and as a research fellow at Kyoto University, Japan from Feb. 2007 to Jan. 2009. His research interests cover ubiquitous computing and HCI.

Xingshe Zhou is a professor in the School of Computer Science, Northwestern Polytechnical University, P.R. China. His research interests include distributed computing, embedded computing, and sensor networks.
