VR, Gaze, and Visual Impairment: An Exploratory Study of the Perception of Eye Contact across different Sensory Modalities for People with Visual Impairments in Virtual Reality
M. Wieland, M. Sedlmair, and T. Machulla. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1–6. New York, NY, USA, Association for Computing Machinery, (Apr 19, 2023)
DOI: 10.1145/3544549.3585726
Abstract
As social virtual reality (VR) becomes more popular, avatars are being designed with realistic behaviors incorporating non-verbal cues like eye contact. However, perceiving eye contact during a conversation can be challenging for people with visual impairments. VR presents an opportunity to display eye contact cues in alternative ways, making them perceivable for people with visual impairments. We performed an exploratory study to gain initial insights on designing eye contact cues for people with visual impairments, including a focus group for a deeper understanding of the topic. We implemented eye contact cues via visual, auditory, and tactile sensory modalities in VR and tested these approaches with eleven participants with visual impairments and collected qualitative feedback. The results show that visual cues indicating the gaze direction were preferred, but auditory and tactile cues were also prevalent as they do not superimpose additional visual information.
@inproceedings{Wieland2023,
abstract = {As social virtual reality (VR) becomes more popular, avatars are being designed with realistic behaviors incorporating non-verbal cues like eye contact. However, perceiving eye contact during a conversation can be challenging for people with visual impairments. VR presents an opportunity to display eye contact cues in alternative ways, making them perceivable for people with visual impairments. We performed an exploratory study to gain initial insights on designing eye contact cues for people with visual impairments, including a focus group for a deeper understanding of the topic. We implemented eye contact cues via visual, auditory, and tactile sensory modalities in VR and tested these approaches with eleven participants with visual impairments and collected qualitative feedback. The results show that visual cues indicating the gaze direction were preferred, but auditory and tactile cues were also prevalent as they do not superimpose additional visual information.},
address = {New York, NY, USA},
author = {Wieland, Markus and Sedlmair, Michael and Machulla, Tonja-Katrin},
booktitle = {Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems},
day = 19,
doi = {10.1145/3544549.3585726},
isbn = {9781450394222},
keywords = {PN7 PN7-1.3 EXC2075 selected},
location = {Hamburg, Germany},
month = apr,
pages = {1--6},
publisher = {Association for Computing Machinery},
series = {CHI EA '23},
title = {VR, Gaze, and Visual Impairment: An Exploratory Study of the Perception of Eye Contact across different Sensory Modalities for People with Visual Impairments in Virtual Reality},
url = {https://doi.org/10.1145/3544549.3585726},
year = 2023
}