Developing hand-worn input and haptic support for real-world target finding

Andolina S.
2019-01-01

Abstract

Locating places in cities is typically facilitated by handheld mobile devices, which draw the user's visual attention to the screen of the device instead of to the surroundings. In this research, we aim to strengthen the connection between people and their surroundings by enabling mid-air gestural interaction with real-world landmarks and delivering information through audio to retain users' visual attention on the scene. Recent research on gesture-based and haptic techniques for such purposes has mainly considered handheld devices that eventually direct users' attention back to the devices. We contribute a hand-worn, mid-air gestural interaction design with directional vibrotactile guidance for finding points of interest (POIs). Through three design iterations, we address aspects of (1) sensing technologies and the placement of actuators considering users' instinctive postures, (2) the feasibility of finding and fetching information regarding landmarks without visual feedback, and (3) the benefits of such interaction in a tourist application. In a final evaluation, participants located POIs and fetched information by pointing and following directional guidance, thus realising a vision in which they found and experienced real-world landmarks while keeping their visual attention on the scene. The results show that the interaction technique has comparable performance to a visual baseline, enables high mobility, and facilitates keeping visual attention on the surroundings.
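The abstract describes the interaction only at a high level. As an illustrative sketch only, and not the authors' implementation, the following Python snippet shows one plausible way such directional guidance could work: compare the bearing in which the hand is pointing against the bearing from the user to a POI, and map the angular error to a left or right vibrotactile cue whose intensity grows with the error. All names and parameters here (bearing_to_target, guidance, tolerance_deg, the left/right actuator layout) are assumptions made for illustration, not details taken from the paper.

    import math

    # Hypothetical sketch (not the paper's implementation): directional
    # vibrotactile guidance that steers a pointing hand toward a POI.
    # Assumes the hand's compass bearing is readable from a wearable
    # sensor; bearings are in degrees, with 0 = north.

    def bearing_to_target(user_lat, user_lon, poi_lat, poi_lon):
        """Initial great-circle bearing from the user to the POI, in degrees."""
        phi1, phi2 = math.radians(user_lat), math.radians(poi_lat)
        dlon = math.radians(poi_lon - user_lon)
        x = math.sin(dlon) * math.cos(phi2)
        y = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(x, y)) % 360

    def guidance(hand_bearing_deg, target_bearing_deg, tolerance_deg=10.0):
        """Map the angular error to a (side, intensity) vibrotactile cue.

        Returns ('on_target', 0.0) when the hand points within the
        tolerance; otherwise the side of the hand to vibrate and an
        intensity in (0, 1] that grows with the angular error.
        """
        # Wrap the error into [-180, 180) so we always turn the short way.
        error = (target_bearing_deg - hand_bearing_deg + 180) % 360 - 180
        if abs(error) <= tolerance_deg:
            return 'on_target', 0.0
        side = 'right' if error > 0 else 'left'
        return side, min(abs(error) / 90.0, 1.0)

    # Example: hand points due north, POI lies to the north-east.
    print(guidance(0.0, bearing_to_target(60.17, 24.94, 60.18, 24.96)))

In this sketch the cue intensity saturates at an angular error of 90 degrees; a real system would tune the tolerance and the intensity mapping to the actuators used and to user testing.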
Discipline: INF/01 - Computer Science
Discipline: ING-INF/05 - Information Processing Systems
Hsieh Y.-T., Jylhä A., Orso V., Andolina S., Hoggan E., Gamberini L., et al. (2019). Developing hand-worn input and haptic support for real-world target finding. Personal and Ubiquitous Computing, 23(1), 117-132. doi:10.1007/s00779-018-1180-z
Files in this record:
  • File: [2019][PUC] Developing hand-worn input and haptic support for real-world target finding.pdf
  • Access: archive administrators only
  • Type: publisher's version
  • Size: 2.76 MB
  • Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/390692
Citations
  • PMC: not available
  • Scopus: 3
  • Web of Science: 2