EchoLock: Towards Low Effort Mobile User Identification

20 March 2020
Yilin Yang, Chen Wang, Yingying Chen, Yan Wang
arXiv: 2003.09061
Abstract

User identification plays a pivotal role in how we interact with our mobile devices. Many existing authentication approaches require active input from the user or specialized sensing hardware, and studies on mobile device usage show significant interest in more convenient procedures. In this paper, we propose EchoLock, a low-effort identification scheme that validates the user by sensing hand geometry via commodity microphones and speakers. The emitted acoustic signals produce distinct structure-borne sound reflections when they contact the user's hand, which can be used to distinguish different people based on how they hold their mobile devices. We process these reflections to derive unique acoustic features in both the time and frequency domains, which can effectively represent physiological and behavioral traits such as hand contours, finger sizes, holding strength, and gestures. Furthermore, learning-based algorithms are developed to robustly identify the user under various environments and conditions. We conduct extensive experiments with 20 participants using different hardware setups in key use case scenarios, and study various attack models to demonstrate the performance of our proposed system. Our results show that EchoLock is capable of verifying users with over 90% accuracy, without requiring any active input from the user.
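To make the pipeline described above concrete, the sketch below illustrates the general idea of deriving time- and frequency-domain features from a captured echo segment and feeding them to a learning-based classifier. This is a minimal illustration, not the authors' implementation: the specific features (RMS energy, zero-crossing rate, spectral centroid, band energies), the assumed 48 kHz sampling rate, and the SVM classifier are assumptions chosen for clarity.

```python
# Minimal sketch of acoustic feature extraction and learning-based user
# identification in the spirit of EchoLock. NOT the authors' implementation:
# the feature set and classifier below are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 48_000  # assumed microphone sampling rate (Hz)

def extract_features(echo: np.ndarray) -> np.ndarray:
    """Derive simple time- and frequency-domain features from one echo segment."""
    # Time domain: signal energy and zero-crossing rate.
    rms = np.sqrt(np.mean(echo ** 2))
    zcr = np.mean(np.abs(np.diff(np.signbit(echo).astype(float))))

    # Frequency domain: spectral centroid and coarse band energies
    # computed from the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(echo))
    freqs = np.fft.rfftfreq(len(echo), d=1.0 / FS)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    band_energy = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
                   for lo, hi in [(0, 4e3), (4e3, 8e3), (8e3, 16e3), (16e3, 24e3)]]

    return np.array([rms, zcr, centroid, *band_energy])

def train_identifier(echoes: list[np.ndarray], user_ids: list[int]):
    """Fit a classifier that maps echo features to user identities."""
    X = np.stack([extract_features(e) for e in echoes])
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    model.fit(X, user_ids)
    return model

# Usage: given labeled training echoes, identify the holder of a new echo.
# model = train_identifier(train_echoes, train_labels)
# predicted_user = model.predict(extract_features(new_echo).reshape(1, -1))[0]
```

In practice the paper's feature set would be derived from the structure-borne reflections themselves and tuned to capture hand geometry and holding behavior; the sketch only shows the shape of such a feature-extraction and classification pipeline.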
