Comfortability and a first step toward integrating humanoid robots into a live interview
This new column of the online magazine IIT OpenTalk will try to show how important, and how hard, it is to build robots capable of interacting with humans on a daily basis, and above all, it will do so through a set of real interviews conducted by the humanoid robot iCub (acting as the interviewer).
We have been working closely with the researchers of the CONTACT – COgNiTive Architecture for Collaborative Technologies Research Line and the RBCS – Robotics, Brain and Cognitive Sciences Unit of IIT – Istituto Italiano di Tecnologia, trying to understand the limitations of human-machine interaction in an interview scenario, for example. We followed part of Maria Elena Lechuga Redondo’s current PhD programme at IIT. Elena is a PhD Student Fellow under the supervision of Alessandra Sciutti, Radoslaw Niewiadomski, Francesco Rea and Giulio Sandini. She is a computer engineer who, since the beginning of her PhD, has been studying how to teach the iCub robot to “interact in a social way” and, ultimately, how to teach robots to recognize and adapt to their partners’ Comfortability level.
In fact, one of the initial steps to enhance current humanoid robots as social partners during an interview focuses on social intelligence and “comfortability”: understanding what comfortability means for a human being when interacting with other people; how comfortability compares with other emotions; and how a humanoid robot can affect someone’s comfortability in a realistic setting (a live interview, for example).
The aim of the column is therefore to understand how the non-verbal expressions of comfortability work during a robot-guided interview, what reactions they bring about, and whether robots elicit them in the same way humans do or differently. Through a set of interviews led by the iCub robot, the column narrates the research and goals of the researchers interviewed and, at the same time, explores their perception of “being interviewed by a robot”.
At the end of the column we may be able to answer our opening question: can robots impact human comfortability during a live interview? This is an important first step toward enabling future robots to realize when someone is feeling uncomfortable in an interaction with them. Such a skill will be essential for the design of more considerate and respectful machines. Above all, we might end up discussing whether using robots as interviewers can help humans train and learn how to react better during a live interview.
References
- M.E.L. Redondo, R. Niewiadomski, F. Rea, S. Incao, and A. Sciutti. 2021. Can Robots Impact Human Comfortability During a Live Interview? In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’21 Companion), March 8–11, 2021. https://doi.org/10.1145/3434074.3447156
- M.E.L. Redondo, A. Vignolo, R. Niewiadomski, F. Rea, and A. Sciutti. 2020. Can Robots Elicit Different Comfortability Levels? In Wagner A.R. et al. (eds) Social Robotics. ICSR 2020. Lecture Notes in Computer Science, Vol. 12483. Springer, 664–675. https://doi.org/10.1007/978-3-030-62056-1_55