Sensor Fusion Localization and Navigation for Visually Impaired People

Galioto, G.; Tinnirello, I.; Croce, D.; Inderst, F.; Pascucci, F.; Giarre, L.
2018-01-01

Abstract

In this paper, we present an innovative cyber-physical system for indoor and outdoor localization and navigation, based on the joint use of dead-reckoning and computer-vision techniques in a smartphone-centric tracking system. The system is explicitly designed for visually impaired people, but it can easily be generalized to other users, and it is built under the assumption that special reference signals, such as colored tapes, painted lines, or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Unlike previous work on localization, which focuses only on the inertial sensors integrated into smartphones, we exploit the smartphone camera as an additional sensor that, on the one hand, helps the visually impaired user identify the paths and, on the other hand, provides direction estimates to the tracking system. We demonstrate the effectiveness of our approach by means of experimental tests performed in a real outdoor installation and in a controlled indoor environment.
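
As a purely illustrative sketch of the sensor-fusion idea outlined in the abstract (not the authors' actual implementation), the following Python fragment blends an inertial dead-reckoning heading with a camera-derived heading (e.g., the orientation of the detected guide line in the image) through a simple complementary filter, and then advances the estimated position by one step along the fused direction. The weighting factor, step length, and function names are assumptions introduced only for illustration.

import math

# Illustrative sketch, not the authors' implementation: fuse a
# dead-reckoning heading (gyroscope integration) with a camera-derived
# heading (angle of the detected guide line) via a complementary filter,
# then advance the 2-D position at each detected step.
# ALPHA and STEP_LENGTH are assumed values for illustration only.

ALPHA = 0.9          # weight given to the inertial heading (assumed)
STEP_LENGTH = 0.7    # average step length in meters (assumed)

def fuse_heading(theta_inertial, theta_camera, alpha=ALPHA):
    """Blend inertial and camera headings (radians), handling angle wrap-around."""
    diff = math.atan2(math.sin(theta_camera - theta_inertial),
                      math.cos(theta_camera - theta_inertial))
    return theta_inertial + (1.0 - alpha) * diff

def dead_reckoning_update(x, y, theta, step_length=STEP_LENGTH):
    """Advance the position by one step along the fused heading."""
    return x + step_length * math.cos(theta), y + step_length * math.sin(theta)

# Example: the gyroscope-based heading is 10 deg, the camera sees the
# guide line at 20 deg; the fused heading leans toward the inertial value.
theta = fuse_heading(math.radians(10), math.radians(20))
x, y = dead_reckoning_update(0.0, 0.0, theta)
print(f"fused heading = {math.degrees(theta):.1f} deg, position = ({x:.2f}, {y:.2f})")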
2018
Subject area: ING-INF/03 - Telecommunications
ISBN: 9783952426982
Galioto, G., Tinnirello, I., Croce, D., Inderst, F., Pascucci, F., Giarre, L. (2018). Sensor Fusion Localization and Navigation for Visually Impaired People. In 2018 European Control Conference, ECC 2018 (pp. 3191-3196). Institute of Electrical and Electronics Engineers Inc. [10.23919/ECC.2018.8550373].

Use this identifier to cite or link to this item: https://hdl.handle.net/10447/344417
Citations
  • Scopus: 7
  • ISI (Web of Science): 7