Pipitone, A., Cataldo, G., Corvaia, S., Chella, A. (2025). The robot and the nurse prepare for surgery: early insights into the impact of robot’s inner speech. INTELLIGENT SERVICE ROBOTICS, 19(1) [10.1007/s11370-025-00656-4].
The robot and the nurse prepare for surgery: early insights into the impact of robot’s inner speech
Pipitone, Arianna; Cataldo, Giovanna; Corvaia, Sophia; Chella, Antonio
2025-01-01
Abstract
The robot’s inner speech has recently come under investigation. Inner speech is a form of internalized, self-directed dialogue: talking to the self. Initial intriguing findings, concerning the robot’s transparency, robustness, and trustworthiness, emerged from basic collaborative tasks in Human-Robot Interaction (HRI), such as setting up a lunch table. This pilot study extends the investigation to a high-stakes medical scenario where errors can have significant consequences: preparing a virtual surgical table for a vascular intervention. The study involved both expert and non-expert participants who collaborated with the robot to correctly place and organize surgical instruments. Data were collected through a post-session questionnaire combining closed-ended Likert-scale items and open-ended questions. Interaction dynamics and learning outcomes were assessed through key variables, including transparency, trust, reassurance, and perceived learning, allowing for both quantitative and qualitative analysis. Results from this pilot indicate that the robot’s inner speech enhances participants’ understanding of task instructions, provides reassurance during collaboration, and improves perceived learning quality, evaluated in terms of the Quality of Learning (QoL) framework, with effects varying according to prior expertise. These findings highlight the potential of inner speech to facilitate transparent, trustworthy, and effective training in risk-sensitive domains, offering a foundation for further exploration of robotic tutoring systems in healthcare.
| File | Size | Format |
|---|---|---|
| s11370-025-00656-4.pdf (open access, publisher's version) | 1.08 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


