Saliency3D: A 3D Saliency Dataset Collected on Screen (Dataset and Experiment Application)
Y. Wang and A. Bulling. Software, (2024). Related to: Y. Wang, Q. Dai, M. Bâce, K. Klein, A. Bulling. "Saliency3D: A 3D Saliency Dataset Collected on Screen", in Proceedings of the ACM Symposium on Eye Tracking Research & Applications (ETRA '24). doi: 10.1145/3649902.3653350.
DOI: 10.18419/darus-4101
Abstract
While visual saliency has recently been studied in 3D, the experimental setup for collecting 3D saliency data can be expensive and cumbersome. To address this challenge, we propose a novel experimental design that utilizes an eye tracker on a screen to collect 3D saliency data. Our experimental design reduces the cost and complexity of 3D saliency dataset collection. We first collect gaze data on a screen, then we map them to 3D saliency data through perspective transformation. Using this method, we collect a 3D saliency dataset (49,276 fixations) comprising 10 participants looking at 16 objects. Moreover, we examine the viewing preferences for objects and discuss our findings in this study. Our results indicate potential preferred viewing directions and a correlation between salient features and the variation in viewing directions. The files of this dataset are documented in README.md.
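The abstract describes mapping on-screen gaze data to 3D saliency through a perspective transformation. As a rough illustration only, and not the dataset's actual pipeline, back-projecting a screen fixation into camera space under a pinhole camera model could be sketched as follows; all parameter names (`fx`, `fy`, `cx`, `cy`, `depth`) are illustrative assumptions:

```python
# Hedged sketch: back-project a 2D on-screen fixation to a 3D point
# using a pinhole perspective model. Parameter names are illustrative
# assumptions, not the authors' actual implementation.

def backproject_fixation(u, v, depth, fx, fy, cx, cy):
    """Map a screen fixation (u, v) in pixels to camera-space 3D coordinates.

    fx, fy: focal lengths in pixels; cx, cy: principal point in pixels;
    depth:  distance along the optical axis to the viewed surface.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A fixation at the principal point maps straight down the optical axis.
print(backproject_fixation(320.0, 240.0, 2.0, 500.0, 500.0, 320.0, 240.0))
```

In a setup like the one described, `depth` would come from the rendered object's depth buffer at the fixated pixel, so each 2D fixation lands on the 3D object surface.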
%0 Generic
%1 wang2024saliency3d
%A Wang, Yao
%A Bulling, Andreas
%D 2024
%K darus mult ubs_10005 ubs_10018 ubs_20008 ubs_20024 ubs_30086 ubs_30200 ubs_40336 unibibliografie
%R 10.18419/darus-4101
%T Saliency3D: A 3D Saliency Dataset Collected on Screen (Dataset and Experiment Application)
%X While visual saliency has recently been studied in 3D, the experimental setup for collecting 3D saliency data can be expensive and cumbersome. To address this challenge, we propose a novel experimental design that utilizes an eye tracker on a screen to collect 3D saliency data. Our experimental design reduces the cost and complexity of 3D saliency dataset collection. We first collect gaze data on a screen, then we map them to 3D saliency data through perspective transformation. Using this method, we collect a 3D saliency dataset (49,276 fixations) comprising 10 participants looking at 16 objects. Moreover, we examine the viewing preferences for objects and discuss our findings in this study. Our results indicate potential preferred viewing directions and a correlation between salient features and the variation in viewing directions. The files of this dataset are documented in README.md.
@misc{wang2024saliency3d,
abstract = {While visual saliency has recently been studied in 3D, the experimental setup for collecting 3D saliency data can be expensive and cumbersome. To address this challenge, we propose a novel experimental design that utilizes an eye tracker on a screen to collect 3D saliency data. Our experimental design reduces the cost and complexity of 3D saliency dataset collection. We first collect gaze data on a screen, then we map them to 3D saliency data through perspective transformation. Using this method, we collect a 3D saliency dataset (49,276 fixations) comprising 10 participants looking at 16 objects. Moreover, we examine the viewing preferences for objects and discuss our findings in this study. Our results indicate potential preferred viewing directions and a correlation between salient features and the variation in viewing directions. The files of this dataset are documented in README.md.},
added-at = {2024-03-25T15:12:44.000+0100},
affiliation = {Wang, Yao/Universität Stuttgart, Bulling, Andreas/Universität Stuttgart},
author = {Wang, Yao and Bulling, Andreas},
biburl = {https://puma.ub.uni-stuttgart.de/bibtex/2e00b0cb8692e3ffcdb0b6ae3d9b570f9/unibiblio},
doi = {10.18419/darus-4101},
howpublished = {Software},
interhash = {5bd8d63db6f9dd0f1e54624acecf6963},
intrahash = {e00b0cb8692e3ffcdb0b6ae3d9b570f9},
keywords = {darus mult ubs_10005 ubs_10018 ubs_20008 ubs_20024 ubs_30086 ubs_30200 ubs_40336 unibibliografie},
note = {Related to: Y. Wang, Q. Dai, M. Bâce, K. Klein, A. Bulling. "Saliency3D: A 3D Saliency Dataset Collected on Screen", in Proceedings of the ACM Symposium on Eye Tracking Research & Applications (ETRA '24). doi: 10.1145/3649902.3653350},
orcid-numbers = {Wang, Yao/0000-0002-3633-8623, Bulling, Andreas/0000-0001-6317-7303},
timestamp = {2024-03-25T15:12:44.000+0100},
title = {Saliency3D: A 3D Saliency Dataset Collected on Screen (Dataset and Experiment Application)},
year = 2024
}