P. Müller, E. Sood, and A. Bulling. ACM Symposium on Eye Tracking Research and Applications, pages 1–10. New York, NY, USA, Association for Computing Machinery, (June 2, 2020)
DOI: 10.1145/3379155.3391332
Abstract
We present the first method to anticipate averted gaze in natural dyadic interactions.
The task of anticipating averted gaze, i.e. that a person will not make eye contact
in the near future, remains unsolved despite its importance for human social encounters
as well as a number of applications, including human-robot interaction or conversational
agents. Our multimodal method is based on a long short-term memory (LSTM) network
that analyses non-verbal facial cues and speaking behaviour. We empirically evaluate
our method for different future time horizons on a novel dataset of 121 YouTube videos
of dyadic video conferences (74 hours in total). We investigate person-specific and
person-independent performance and demonstrate that our method clearly outperforms
baselines in both settings. As such, our work sheds light on the tight interplay between
eye contact and other non-verbal signals and underlines the potential of computational
modelling and anticipation of averted gaze for interactive applications.
@inproceedings{Müller2020,
abstract = {We present the first method to anticipate averted gaze in natural dyadic interactions.
The task of anticipating averted gaze, i.e. that a person will not make eye contact
in the near future, remains unsolved despite its importance for human social encounters
as well as a number of applications, including human-robot interaction or conversational
agents. Our multimodal method is based on a long short-term memory (LSTM) network
that analyses non-verbal facial cues and speaking behaviour. We empirically evaluate
our method for different future time horizons on a novel dataset of 121 YouTube videos
of dyadic video conferences (74 hours in total). We investigate person-specific and
person-independent performance and demonstrate that our method clearly outperforms
baselines in both settings. As such, our work sheds light on the tight interplay between
eye contact and other non-verbal signals and underlines the potential of computational
modelling and anticipation of averted gaze for interactive applications.},
added-at = {2021-12-08T17:10:50.000+0100},
address = {New York, NY, USA},
author = {Müller, Philipp and Sood, Ekta and Bulling, Andreas},
biburl = {https://puma.ub.uni-stuttgart.de/bibtex/2f97b9935cacc514b3e94fb2318b6f61e/simtech},
booktitle = {ACM Symposium on Eye Tracking Research and Applications},
day = 2,
doi = {10.1145/3379155.3391332},
interhash = {b088d82b86d04188aded80d640e5a7cb},
intrahash = {f97b9935cacc514b3e94fb2318b6f61e},
isbn = {9781450371339},
keywords = {EXC2075 PN7},
location = {Stuttgart, Germany},
month = jun,
pages = {1--10},
publisher = {Association for Computing Machinery},
series = {ETRA '20 Full Papers},
timestamp = {2023-07-31T05:40:46.000+0200},
title = {Anticipating Averted Gaze in Dyadic Interactions},
url = {https://doi.org/10.1145/3379155.3391332},
year = 2020
}