TY - GEN
T1 - Behaviors Speak More
T2 - 22nd ACM Conference on Embedded Networked Sensor Systems, SenSys 2024
AU - Jiang, Chenxu
AU - Yu, Sihan
AU - Fu, Jingjing
AU - Lin, Chun Chih
AU - Zhu, Huadi
AU - Ma, Xiaolong
AU - Li, Ming
AU - Guo, Linke
N1 - Publisher Copyright:
© 2024 Copyright is held by the owner/author(s).
PY - 2024/11/4
Y1 - 2024/11/4
N2 - Human faces have been widely adopted in many applications and systems requiring a high security standard. Although face authentication is deemed mature nowadays, many existing works have demonstrated not only the privacy leakage of facial information but also successful spoofing attacks on face biometrics. The critical reason behind this is the failure of liveness detection in biometrics. This work advances most biometric-based user authentication schemes by exploring dynamic biometrics (human facial activities) rather than traditional static biometrics (human faces). Inspired by observations from psychology, we propose mmFaceID, which leverages humans' dynamic facial activities during word reading to achieve robust, highly accurate, and effective user authentication via mmWave sensing. By addressing a series of technical challenges in capturing micro-level facial muscle movements with a mmWave sensor, we build a neural network that reconstructs facial activities via estimated expression parameters. Unique features can then be extracted to enable robust user authentication regardless of relative distance and orientation. We conduct comprehensive experiments on 23 participants to evaluate mmFaceID across distances/orientations, word-list length, occlusion, and language backgrounds, demonstrating an authentication accuracy of 94.7%. We also extend our evaluation to a real IoT scenario: when users speak real IoT commands, the average authentication accuracy reaches up to 92.28%.
AB - Human faces have been widely adopted in many applications and systems requiring a high security standard. Although face authentication is deemed mature nowadays, many existing works have demonstrated not only the privacy leakage of facial information but also successful spoofing attacks on face biometrics. The critical reason behind this is the failure of liveness detection in biometrics. This work advances most biometric-based user authentication schemes by exploring dynamic biometrics (human facial activities) rather than traditional static biometrics (human faces). Inspired by observations from psychology, we propose mmFaceID, which leverages humans' dynamic facial activities during word reading to achieve robust, highly accurate, and effective user authentication via mmWave sensing. By addressing a series of technical challenges in capturing micro-level facial muscle movements with a mmWave sensor, we build a neural network that reconstructs facial activities via estimated expression parameters. Unique features can then be extracted to enable robust user authentication regardless of relative distance and orientation. We conduct comprehensive experiments on 23 participants to evaluate mmFaceID across distances/orientations, word-list length, occlusion, and language backgrounds, demonstrating an authentication accuracy of 94.7%. We also extend our evaluation to a real IoT scenario: when users speak real IoT commands, the average authentication accuracy reaches up to 92.28%.
KW - biometrics
KW - facial activity
KW - mmWave
KW - user authentication
UR - http://www.scopus.com/inward/record.url?scp=85211806017&partnerID=8YFLogxK
U2 - 10.1145/3666025.3699330
DO - 10.1145/3666025.3699330
M3 - Conference contribution
AN - SCOPUS:85211806017
T3 - SenSys 2024 - Proceedings of the 2024 ACM Conference on Embedded Networked Sensor Systems
SP - 169
EP - 183
BT - SenSys 2024 - Proceedings of the 2024 ACM Conference on Embedded Networked Sensor Systems
Y2 - 4 November 2024 through 7 November 2024
ER -