LOOKINE
Facial Expression and Head Movement Recognition System
Python, C++
May 2017 - July 2017
Professor: Jia Jia, Department of Computer Science and Technology
Human-Computer Speech Interaction Research Group, Tsinghua University
Beijing, China
Project Introduction
Nonverbal visual information plays a significant role in fundamental social communication, yet blind people cannot access it. We therefore propose Lookine, a social-assistance system that helps them overcome this limitation. The system applies facial expression recognition, facial action recognition, and head pose estimation, following "barrier-free" design principles. In our experiments, the algorithm evaluation and a user study show that the system achieves promising accuracy, good real-time performance, and an excellent user experience.
My Contribution
Designed a cross-platform system that helps blind people perceive nonverbal cues, including facial expressions and head movements, during social communication.
Implemented real-time recognition of facial Action Units by extending the open-source toolkit OpenFace, converting its offline calibration algorithm into an online one to meet real-time requirements.
Proposed a novel head movement recognition algorithm using a finite state machine over accurate head pose estimates, which outperformed a traditional SVM baseline in both recognition accuracy and speed.
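The FSM-based head movement recognition described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the project's actual implementation: it assumes a per-frame stream of pitch angles (in degrees) from a head-pose estimator, and the threshold and swing-count values are illustrative assumptions.

```python
# Hypothetical sketch of an FSM-based head-nod detector. Assumes the head
# pose estimator yields one pitch angle (degrees) per frame; the threshold
# and minimum swing count are assumed values, not the project's parameters.

NOD_THRESH = 8.0  # pitch deviation (deg) counted as a deliberate movement


def detect_nod(pitch_stream, thresh=NOD_THRESH, min_swings=2):
    """Return True if the pitch sequence contains a nod, i.e. at least
    `min_swings` alternating up/down swings beyond `thresh`."""
    state = "NEUTRAL"  # FSM states: NEUTRAL -> DOWN <-> UP
    swings = 0
    for pitch in pitch_stream:
        if state == "NEUTRAL":
            if pitch > thresh:
                state = "DOWN"
            elif pitch < -thresh:
                state = "UP"
        elif state == "DOWN" and pitch < -thresh:
            state = "UP"
            swings += 1  # completed a down-to-up swing
        elif state == "UP" and pitch > thresh:
            state = "DOWN"
            swings += 1  # completed an up-to-down swing
        if swings >= min_swings:
            return True
    return False
```

An analogous machine over yaw angles would detect head shakes; running small per-gesture machines frame by frame is what makes this approach cheaper than classifying whole windows with an SVM.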
Collaborators
Yaohua Bu, Tianyu Gao, Xuan Zhang