Dataset for "How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis"
M. Koch, N. Pathmanathan, D. Weiskopf, and K. Kurzhals. Dataset, (2024).
DOI: 10.18419/darus-4141
Abstract
This dataset was recorded in an AR environment comprising three physical and three virtual scene objects. Four participants were instructed to gaze at the six objects from different depth levels (50 cm, 150 cm, 300 cm) in two orders (left-to-right, right-to-left). There are seven trials for each of the six depth-order conditions, which yields 42 recordings in total. More details can be found in the README.md.
Related to: Maurice Koch, Nelusa Pathmanathan, Daniel Weiskopf, and Kuno Kurzhals. 2024. How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis. In 2024 Symposium on Eye Tracking Research and Applications (ETRA ’24), June 04–07, 2024, Glasgow, United Kingdom. ACM, New York, NY, USA, 7 pages. doi: 10.1145/3649902.3653349
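
The total of 42 recordings follows from the three depth levels and two orders (six conditions) with seven trials each. The short sketch below merely enumerates these combinations as stated in the abstract; the variable names are illustrative and not part of the dataset:

from itertools import product

# Experimental factors as described in the abstract
depths_cm = [50, 150, 300]                    # depth levels
orders = ["left-to-right", "right-to-left"]   # gaze orders
trials_per_condition = 7

conditions = list(product(depths_cm, orders))            # 3 depths x 2 orders = 6 conditions
total_recordings = len(conditions) * trials_per_condition
print(total_recordings)                                  # 42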
%0 Generic
%1 koch2024dataset
%A Koch, Maurice
%A Pathmanathan, Nelusa
%A Weiskopf, Daniel
%A Kurzhals, Kuno
%D 2024
%I DaRUS
%K rp28 data
%R 10.18419/darus-4141
%T Dataset for "How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis"
%X This dataset was recorded in an AR environment comprising three physical and three virtual scene objects. Four participants were instructed to gaze at the six objects from different depth levels (50 cm, 150 cm, 300 cm) in two orders (left-to-right, right-to-left). There are seven trials per condition, which yields 42 recordings in total. More details can be found in the README.md.
@dataset{koch2024dataset,
abstract = {This dataset was recorded in an AR environment comprising three physical and three virtual scene objects. Four participants were instructed to gaze at the six objects from different depth levels (50 cm, 150 cm, 300 cm) in two orders (left-to-right, right-to-left). There are seven trials per condition, which yields 42 recordings in total. More details can be found in the README.md.},
added-at = {2024-05-27T12:26:27.000+0200},
affiliation = {Koch, Maurice/Universität Stuttgart, Pathmanathan, Nelusa/Universität Stuttgart, Weiskopf, Daniel/Universität Stuttgart, Kurzhals, Kuno/Universität Stuttgart},
author = {Koch, Maurice and Pathmanathan, Nelusa and Weiskopf, Daniel and Kurzhals, Kuno},
biburl = {https://puma.ub.uni-stuttgart.de/bibtex/2d80875c4f6cc1ea60b2398d1ccb26624/intcdc},
doi = {10.18419/darus-4141},
howpublished = {Dataset},
interhash = {a13e787f9a0621206adbca661b22f140},
intrahash = {d80875c4f6cc1ea60b2398d1ccb26624},
keywords = {rp28 data},
note = {Related to: Maurice Koch, Nelusa Pathmanathan, Daniel Weiskopf, and Kuno Kurzhals. 2024. How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis. In 2024 Symposium on Eye Tracking Research and Applications (ETRA ’24), June 04–07, 2024, Glasgow, United Kingdom. ACM, New York, NY, USA, 7 pages. doi: 10.1145/3649902.3653349},
orcid-numbers = {Koch, Maurice/0000-0003-0469-8971, Pathmanathan, Nelusa/0000-0002-6848-8554, Weiskopf, Daniel/0000-0003-1174-1026, Kurzhals, Kuno/0000-0003-4919-4582},
publisher = {DaRUS},
timestamp = {2024-05-27T12:26:27.000+0200},
title = {Dataset for "How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis"},
year = 2024
}