
Anticipating Averted Gaze in Dyadic Interactions

ACM Symposium on Eye Tracking Research and Applications, pages 1–10. Association for Computing Machinery, New York, NY, USA, June 2, 2020.
DOI: 10.1145/3379155.3391332

Abstract

We present the first method to anticipate averted gaze in natural dyadic interactions. The task of anticipating averted gaze, i.e. that a person will not make eye contact in the near future, remains unsolved despite its importance for human social encounters as well as a number of applications, including human-robot interaction or conversational agents. Our multimodal method is based on a long short-term memory (LSTM) network that analyses non-verbal facial cues and speaking behaviour. We empirically evaluate our method for different future time horizons on a novel dataset of 121 YouTube videos of dyadic video conferences (74 hours in total). We investigate person-specific and person-independent performance and demonstrate that our method clearly outperforms baselines in both settings. As such, our work sheds light on the tight interplay between eye contact and other non-verbal signals and underlines the potential of computational modelling and anticipation of averted gaze for interactive applications.
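The abstract describes an LSTM that consumes per-frame non-verbal facial cues together with speaking behaviour and predicts whether gaze will be averted within a given future time horizon. The sketch below is purely illustrative and is not the authors' implementation: the class name, feature dimensions, window length, and horizon are hypothetical assumptions chosen to show the general shape of such a multimodal sequence classifier in PyTorch.

```python
# Illustrative sketch only: a minimal LSTM sequence classifier for anticipating
# averted gaze from multimodal per-frame features. All dimensions and the
# prediction horizon below are assumptions, not the paper's configuration.
import torch
import torch.nn as nn

class GazeAversionAnticipator(nn.Module):
    def __init__(self, facial_dim=17, speaking_dim=1, hidden_dim=64):
        super().__init__()
        # Per-frame facial cues concatenated with a binary speaking indicator
        # form the multimodal input vector.
        self.lstm = nn.LSTM(facial_dim + speaking_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # logit: gaze averted within the horizon?

    def forward(self, features):
        # features: (batch, time, facial_dim + speaking_dim)
        _, (h_n, _) = self.lstm(features)
        return self.head(h_n[-1]).squeeze(-1)  # (batch,) logits

# Hypothetical usage: 3-second observation windows at 30 fps, predicting
# whether gaze will be averted within the next second.
model = GazeAversionAnticipator()
window = torch.randn(8, 90, 18)             # batch of 8 observation windows
labels = torch.randint(0, 2, (8,)).float()  # 1 = gaze averted in the horizon
loss = nn.BCEWithLogitsLoss()(model(window), labels)
loss.backward()
```

In this kind of setup the model would be trained once over all speakers for the person-independent setting, or fine-tuned on data from a single person for the person-specific setting, with a separate model (or label definition) per future time horizon.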


Tags


  • @simtech
  • @ektasood
  • @katharinafuchs