In the last twenty years, robotics has been applied in many heterogeneous contexts. Among them, the use of humanoid robots during musical concerts has been proposed and investigated by many authors. In this paper, we propose a contribution in the area of robotics applications in music: a system for conveying audience emotions during a live musical exhibition by means of a humanoid robot. In particular, we provide all spectators with a mobile app through which they can select a specific color while listening to a piece of music (an "act"). Each color is mapped to an emotion, and the audience preferences are then processed to select the next act to be played. This decision, based on the overall emotion felt by the audience, is communicated by the robot to the orchestra through body gestures. Our first results show that spectators enjoy this kind of interactive musical performance, and they encourage further investigation.
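The selection step described above can be sketched as a simple majority vote over the spectators' color picks. The color palette, emotion labels, and act names below are illustrative assumptions, not mappings taken from the paper:

```python
from collections import Counter

# Hypothetical color-to-emotion mapping (the paper's actual palette
# and emotion set are not specified here).
COLOR_TO_EMOTION = {
    "red": "excitement",
    "blue": "calm",
    "yellow": "joy",
    "black": "sadness",
}

# Hypothetical emotion-to-act lookup: which piece the orchestra plays next.
EMOTION_TO_ACT = {
    "excitement": "Act II - Allegro",
    "calm": "Act III - Adagio",
    "joy": "Act IV - Scherzo",
    "sadness": "Act V - Lamento",
}

def select_next_act(votes):
    """Aggregate the audience's color picks into the dominant emotion,
    then return the act mapped to that emotion (None if no valid votes)."""
    emotions = [COLOR_TO_EMOTION[c] for c in votes if c in COLOR_TO_EMOTION]
    if not emotions:
        return None
    dominant, _count = Counter(emotions).most_common(1)[0]
    return EMOTION_TO_ACT[dominant]
```

In the actual system, the resulting decision would then be conveyed to the orchestra by the humanoid robot's gestures rather than returned as a string.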
Giardina, M., Tramonte, S., Gentile, V., Vinanzi, S., Chella, A., Sorce, S., et al. (2017). Conveying Audience Emotions through Humanoid Robot Gestures to an Orchestra during a Live Musical Exhibition. In Proceedings of the 11th International Conference on Complex, Intelligent, and Software Intensive Systems (CISIS-2017) (pp. 249-261). doi: 10.1007/978-3-319-61566-0_24
Conveying Audience Emotions through Humanoid Robot Gestures to an Orchestra during a Live Musical Exhibition
Giardina, Marcello Emanuele; Tramonte, Salvatore; Gentile, Vito; Chella, Antonio; Sorce, Salvatore; Sorbello, Rosario
2017-07-05
Files

| File | Description | Type | Size | Format | Access |
|---|---|---|---|---|---|
| CISIS_2017_HRI_CR.pdf | Main article | Publisher's version | 1.67 MB | Adobe PDF | Open access |
| CISIS2017_Front Cover + TOC.pdf | Front cover + table of contents | | 324.66 kB | Adobe PDF | Open access |
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.