Tangible User Interface
By
Our project focuses on determining and responding to a user's emotional state. Prior sensing systems have focused on facial analysis, seated posture, and heart rate. The purpose of this project is to design a real-time tangible device that uses when and how a user moves to infer whether the user is feeling nervous, anxious, or bored. The device takes the form of an anklet containing an IR emitter, paired with a chair module that contains an accelerometer, an IR sensor, and a Bluetooth radio for transmitting movement data. Training data was collected and used to fit a K-Nearest Neighbors classifier that recognizes leg fidgeting. In the future the system will be integrated with the lab's assistive robots so that they can evaluate a user's emotional state and assist with tasks such as suggesting when to take a break. Additionally, while our device can detect fidgeting, further research is still needed on how fidgeting relates to the user's emotional state.
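To illustrate the classification step, the sketch below shows a minimal K-Nearest Neighbors classifier over accelerometer windows. The feature set (mean and variance of acceleration magnitude per window), the toy training values, and the function names are illustrative assumptions, not the project's actual pipeline.

```python
import math
from collections import Counter

def features(window):
    """Summarize a window of (x, y, z) accelerometer samples.

    Assumed feature set: mean and variance of acceleration magnitude.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, var)

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote of its k nearest neighbors."""
    dists = sorted((math.dist(feat, query), label) for feat, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy labeled data: (feature vector, label) pairs for "still" vs. "fidgeting".
# Fidgeting windows show higher variance in acceleration magnitude.
train = [
    ((1.0, 0.01), "still"),
    ((1.0, 0.02), "still"),
    ((1.1, 0.01), "still"),
    ((1.3, 0.30), "fidgeting"),
    ((1.4, 0.25), "fidgeting"),
    ((1.2, 0.35), "fidgeting"),
]

print(knn_predict(train, (1.05, 0.015)))  # low variance -> "still"
print(knn_predict(train, (1.35, 0.28)))   # high variance -> "fidgeting"
```

In practice the real system would extract such features from streamed chair-module accelerometer data and use a library implementation (e.g. scikit-learn's `KNeighborsClassifier`) rather than this hand-rolled version.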