Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/18335
Full metadata record
dc.contributor.author: ROVELO RUIZ, Gustavo
dc.contributor.author: Abat, M.C. Juan Francisco
dc.contributor.author: Camahort, Emilio
dc.date.accessioned: 2015-02-13T10:19:37Z
dc.date.available: 2015-02-13T10:19:37Z
dc.date.issued: 2015
dc.identifier.citation: Proceedings of the International Conference on Computer Graphics Theory and Applications (GRAPP2015), p. 438-445
dc.identifier.isbn: 9789897580871
dc.identifier.uri: http://hdl.handle.net/1942/18335
dc.description.abstract: The widespread use of mobile devices, together with their computational capabilities, enables novel interaction techniques that improve user performance in traditional mobile applications. Navigation assistance is an important area in the mobile domain, and Google Maps is probably its most popular example. This type of application places high demands on the user's attention, especially on the visual channel. Tactile and auditory feedback have been studied as alternatives to visual feedback for navigation assistance to reduce this dependency. However, there is still room for improvement, and more research is needed to understand, for example, how the three feedback modalities complement each other, especially with the appearance of new technology such as smartwatches and new displays such as Google Glass. The goal of our work is to study how users perceive multimodal feedback when their route is augmented with directional cues. Our results show that tactile guidance cues produced the worst user performance, both objectively and subjectively; participants reported that vibration patterns were hard to decode. However, tactile feedback was an unobtrusive way to inform participants when to look at the mobile screen or listen to the spoken directions. The results show that combining feedback modalities produces good user performance.
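The paper itself is not reproduced in this record, but as a rough illustration of the kind of multimodal cueing the abstract describes, here is a minimal Python sketch of a dispatcher that uses a short vibration purely as an unobtrusive "attention" signal and delivers the actual instruction on the audio channel, in line with the abstract's finding that direction-encoding vibration patterns were hard to decode. Everything here (the `vibrate`/`speak` stand-ins, `cue_user`, the pattern durations) is hypothetical and not taken from the paper.

```python
# Hypothetical sketch (not from the paper): multimodal directional cues,
# with vibration as an unobtrusive alert and speech carrying the direction.
import time

# Hypothetical vibration patterns in milliseconds: vibrate, pause, vibrate...
TACTILE_PATTERNS = {
    "left": [100, 50, 100],   # two short pulses (hard to decode, per the study)
    "right": [300],           # one long pulse
    "attention": [50],        # single tick: "check the screen / listen"
}

def vibrate(pattern_ms):
    """Stand-in for a platform vibration API (e.g. Android's Vibrator)."""
    for i, duration in enumerate(pattern_ms):
        state = "vibrate" if i % 2 == 0 else "pause"
        print(f"[tactile] {state} {duration} ms")
        time.sleep(duration / 1000)

def speak(text):
    """Stand-in for a text-to-speech engine."""
    print(f'[audio] "{text}"')

def cue_user(direction, distance_m):
    # Rather than encoding the direction in the vibration pattern itself,
    # vibrate only to grab attention, then speak the instruction.
    vibrate(TACTILE_PATTERNS["attention"])
    speak(f"In {distance_m} meters, turn {direction}.")

cue_user("left", 50)
```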
dc.language.iso: en
dc.publisher: SciTePress
dc.rights: Copyright © 2015 SCITEPRESS (Science and Technology Publications, Lda.)
dc.subject.other: multimodal interface; user evaluation; mobile augmented reality
dc.title: Studying the User Experience with a Multimodal Pedestrian Navigation Assistant
dc.type: Proceedings Paper
local.bibliographicCitation.conferencedate: March 11-14, 2015
local.bibliographicCitation.conferencename: International Conference on Computer Graphics Theory and Applications (GRAPP2015)
local.bibliographicCitation.conferenceplace: Berlin
dc.identifier.epage: 445
dc.identifier.spage: 438
local.format.pages: 8
local.bibliographicCitation.jcat: C1
dc.description.notes: To appear in Proceedings of the International Conference on Computer Graphics Theory and Applications (GRAPP) 2015, March 11-14, Berlin, Germany. http://www.grapp.visigrapp.org/
local.type.refereed: Refereed
local.type.specified: Proceedings Paper
local.identifier.vabb: c:vabb:394448
dc.identifier.doi: 10.5220/0005297504380445
dc.identifier.url: http://www.grapp.visigrapp.org/
local.bibliographicCitation.btitle: Proceedings of the International Conference on Computer Graphics Theory and Applications (GRAPP2015)
item.fulltext: With Fulltext
item.accessRights: Open Access
item.validation: vabb 2018
item.contributor: ROVELO RUIZ, Gustavo
item.contributor: Abat, M.C. Juan Francisco
item.contributor: Camahort, Emilio
item.fullcitation: ROVELO RUIZ, Gustavo; Abat, M.C. Juan Francisco & Camahort, Emilio (2015) Studying the User Experience with a Multimodal Pedestrian Navigation Assistant. In: Proceedings of the International Conference on Computer Graphics Theory and Applications (GRAPP2015), p. 438-445.
Appears in Collections: Research publications
Files in This Item:
File                          Description        Size       Format     Access
RoveloEtAl_GRAPP2015 (1).pdf                     437.92 kB  Adobe PDF  Open
vabb22524.pdf                 Published version  535.04 kB  Adobe PDF  Restricted