Optimizing Human-Robot Team Performance
using Affect and Activity-Aware Robots (ICCRE 2019)
ROLE: Robotic & Human Systems Engineer
SKILLS UTILIZED: Sensor Data Collection, Data Processing, Data Analysis, Coding (C++, Python)
SOFTWARE: ROS, ROS 2, Pupil Capture and Player, Consensys Pro, Artinis OxySoft, Numenta NuPIC (HTM Systems), HBP Neurorobotics Platform
HARDWARE: Pupil Labs Pupil eye tracker, Artinis OCTAMON fNIRS Headset, Shimmer Consensys Sensor Kit, Laipac LookWatch, DJI Ryze Tello
Team communication is essential for effective performance; however, it is typically limited to externally observable behavior. Less explicit properties, such as a team member's affective state, provide an additional layer of information that is usually unknown to other team members. We propose EMPATH, a human-robot team framework that uses physiological sensors to detect human affect and activity states and adapts robot behavior within the team accordingly. An experiment implementing the EMPATH framework, using the Pupil Labs Pupil eye tracker, Shimmer Consensys sensor kit, and Laipac LookWatch for state detection together with several UAVs, demonstrated improved human-robot team performance on an environmental search task.
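The core adaptation loop can be illustrated with a minimal sketch: physiological readings are mapped to a discrete affect state, which in turn scales robot behavior. The thresholds, state labels, and function names below are hypothetical illustrations, not the classifier or control policy used in the published experiment.

```python
from dataclasses import dataclass
from enum import Enum


class Affect(Enum):
    CALM = "calm"
    STRESSED = "stressed"


@dataclass
class PhysioSample:
    heart_rate: float      # bpm, e.g. from a wrist-worn sensor
    pupil_diameter: float  # mm, e.g. from an eye tracker


def classify_affect(sample: PhysioSample,
                    hr_threshold: float = 100.0,
                    pupil_threshold: float = 5.0) -> Affect:
    """Toy threshold-based affect detector (thresholds are illustrative)."""
    if sample.heart_rate > hr_threshold or sample.pupil_diameter > pupil_threshold:
        return Affect.STRESSED
    return Affect.CALM


def adapt_search_speed(affect: Affect, base_speed: float = 1.0) -> float:
    """Example adaptation: slow the UAV search pattern when the operator is stressed."""
    return base_speed * (0.5 if affect is Affect.STRESSED else 1.0)


if __name__ == "__main__":
    sample = PhysioSample(heart_rate=112.0, pupil_diameter=4.2)
    state = classify_affect(sample)
    print(state.value, adapt_search_speed(state))  # stressed operator -> reduced speed
```

In the actual framework this mapping would run online inside the robot's control loop (e.g. as a ROS node), with the sensor streams replacing the hard-coded sample.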