%0 Generic
%1 oney2023replication
%A Öney, Seyda
%A Pathmanathan, Nelusa
%A Becher, Michael
%A Sedlmair, Michael
%A Weiskopf, Daniel
%A Kurzhals, Kuno
%D 2023
%I DaRUS
%K data rp10 rp28
%R 10.18419/darus-3384
%T Replication Data for: Visual Gaze Labeling for Augmented Reality Studies
%X This .zip file contains the Unity project of the visualization tool described in the paper "Visual Gaze Labeling for Augmented Reality Studies", which was accepted at the EuroVis 2023 conference. The tool can be used to annotate gaze data from Augmented Reality (AR) scenarios to perform AOI-based eye-tracking analysis. The visualization tool consists of a gaze replay and a timeline visualization, linked together to provide spatial and image-based annotation. The project includes a dataset that we collected in an AR pilot study. Please install Unity 2020.3.24 to run the visualization tool. The Unity project contains two scenes (located in Assets/Scenes): TimelineVisualization and GazeReplay. First, drag both scenes into the hierarchy window, then unload GazeReplay. In File/Build Settings, the order of the Scenes in Build should be as follows: TimelineVisualization 0, GazeReplay 1. When you start the active scene (TimelineVisualization), the GazeReplay scene will be loaded as well. You can work with the visualization tool in the game view. The project consists of an Assets, Packages, and ProjectSettings folder. Please check the GitHub page for the latest version.
@dataset{oney2023replication,
abstract = {This .zip file contains the Unity project of the visualization tool described in the paper "Visual Gaze Labeling for Augmented Reality Studies", which was accepted at the EuroVis 2023 conference. The tool can be used to annotate gaze data from Augmented Reality (AR) scenarios to perform AOI-based eye-tracking analysis. The visualization tool consists of a gaze replay and a timeline visualization, linked together to provide spatial and image-based annotation. The project includes a dataset that we collected in an AR pilot study. Please install Unity 2020.3.24 to run the visualization tool. The Unity project contains two scenes (located in Assets/Scenes): TimelineVisualization and GazeReplay. First, drag both scenes into the hierarchy window, then unload GazeReplay. In File/Build Settings, the order of the Scenes in Build should be as follows: TimelineVisualization 0, GazeReplay 1. When you start the active scene (TimelineVisualization), the GazeReplay scene will be loaded as well. You can work with the visualization tool in the game view. The project consists of an Assets, Packages, and ProjectSettings folder. Please check the GitHub page for the latest version.},
added-at = {2023-07-04T09:09:48.000+0200},
affiliation = {Öney, Seyda/Universität Stuttgart, Pathmanathan, Nelusa/Universität Stuttgart, Becher, Michael/Universität Stuttgart, Sedlmair, Michael/Universität Stuttgart, Weiskopf, Daniel/Universität Stuttgart, Kurzhals, Kuno/Universität Stuttgart},
author = {Öney, Seyda and Pathmanathan, Nelusa and Becher, Michael and Sedlmair, Michael and Weiskopf, Daniel and Kurzhals, Kuno},
biburl = {https://puma.ub.uni-stuttgart.de/bibtex/2c66767f02072c6eaffed56bc4409c8f2/intcdc},
doi = {10.18419/darus-3384},
howpublished = {Software},
interhash = {d9bd0913ca54b5333f94ff4c920c8803},
intrahash = {c66767f02072c6eaffed56bc4409c8f2},
keywords = {data rp10 rp28},
note = {Related to: Öney, Seyda, et al. "Visual Gaze Labeling for Augmented Reality Studies." Computer Graphics Forum. Vol. 42. No. 3. 2023. doi: 10.1111/cgf.14837},
orcid-numbers = {Öney, Seyda/0000-0002-5785-6788, Pathmanathan, Nelusa/0000-0002-6848-8554, Becher, Michael/0000-0002-0072-1655, Sedlmair, Michael/0000-0001-7048-9292, Weiskopf, Daniel/0000-0003-1174-1026, Kurzhals, Kuno/0000-0003-4919-4582},
publisher = {DaRUS},
timestamp = {2024-03-14T14:19:24.000+0100},
title = {Replication Data for: Visual Gaze Labeling for Augmented Reality Studies},
year = 2023
}