We developed a machine learning model that estimates a user's level of emotional stability from changes in body movement and pupil size. Building on this model, we researched and developed an HMI system that adapts its interactions to the user's state, along with software that visualizes the model's decisions.
When the vehicle transitions from automated to manual driving mode, the interaction that informs the driver of the surrounding conditions switches smoothly, based on the driver's estimated level of emotional stability.
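The handover logic described above can be sketched as a small decision function. This is a hypothetical illustration, not the actual system: the mode names, the stability threshold, and the interaction labels (`gradual_briefing`, `brief_summary`) are all assumptions; in the real system the stability score would come from the machine learning model driven by body-movement and pupil features.

```python
from dataclasses import dataclass
from enum import Enum


class Mode(Enum):
    AUTO = "auto"
    MANUAL = "manual"


@dataclass
class DriverState:
    # Estimated emotional stability in [0, 1]; assumed to be produced
    # by the ML model from body-movement and pupil features.
    emotional_stability: float


def choose_interaction(prev_mode: Mode, next_mode: Mode, state: DriverState) -> str:
    """Pick how to inform the driver of surrounding conditions at handover.

    Only the auto-to-manual transition changes the interaction; the
    0.4 threshold and the interaction names are illustrative.
    """
    if prev_mode is Mode.AUTO and next_mode is Mode.MANUAL:
        if state.emotional_stability < 0.4:
            # Less stable driver: calmer, step-by-step briefing of the
            # surroundings before handing over control.
            return "gradual_briefing"
        # Stable driver: a concise summary of the surroundings suffices.
        return "brief_summary"
    return "no_change"
```

A usage example: `choose_interaction(Mode.AUTO, Mode.MANUAL, DriverState(0.2))` selects the gradual briefing, while a stability score of 0.8 yields the brief summary.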