
Stimulus Data for "Comparative Study on the Perception of Direction in Animated Map Transitions Using Different Map Projections"

Dataset (2023)
DOI: 10.18419/darus-3463

Abstract

We compare how well participants can determine the geographical direction of an animated map transition. In our between-subjects online study, each of three groups is shown map transitions in one map projection: Mercator, azimuthal equidistant projection, or two-point equidistant projection. The distance between the start and end points is varied. Map transitions zoom out and pan toward the midpoint, then zoom in and continue panning, following the recommendations by Van Wijk and Nuij (IEEE InfoVis, 2003). We measure response time and accuracy in the task. We evaluate the results by the sample means per participant, using interval estimation with 95% confidence intervals. We construct the confidence intervals using BCa bootstrapping.

The study is pre-registered on OSF.io, but due to file size limitations, we were not able to submit the video stimuli there. Instead, we provide them here. This repository contains, in the videos/ folder, the MPEG-4 video files that were shown to the participants. These are numbered from 0 to 1199 for each of the three map projections, which are also stated in the file names, for a total of 3,600 video stimuli. An additional 3×6 example stimuli are also included. For each video stimulus, a JSON file with the same prefix file name (projection + number) is located in the metadata/ folder. These files contain the ground-truth metadata for the respective stimulus. The stimuli shown for teaching the participants the task are located, with the same structure, under the examples/ folder.

The entire source code for the study is also available in the related publication. The related repository includes:

  • The code for generating the individual PNG frames and JSON metadata for each stimulus.
  • The server and front-end code for the online study itself.
  • The Python and R code for evaluating the study results.
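The evaluation described above builds 95% BCa (bias-corrected and accelerated) bootstrap confidence intervals over per-participant sample means. The following is a minimal, standard-library-only Python sketch of that interval construction — an illustration of the general BCa technique, not the authors' evaluation code; the function name and parameters are our own:

```python
# Illustrative BCa bootstrap confidence interval for a sample mean.
# Hypothetical helper, not taken from the study's repository.
import random
from statistics import NormalDist, mean

def bca_ci(data, n_boot=2000, level=0.95, rng=None):
    """Return a (low, high) BCa bootstrap CI for the mean of `data`."""
    rng = rng or random.Random(0)
    n = len(data)
    theta_hat = mean(data)

    # Bootstrap replicates of the statistic (here: the mean).
    boot = sorted(mean(rng.choices(data, k=n)) for _ in range(n_boot))

    # Bias correction z0: where theta_hat falls in the replicate distribution.
    nd = NormalDist()
    prop = sum(b < theta_hat for b in boot) / n_boot
    z0 = nd.inv_cdf(prop)

    # Acceleration a from jackknife (leave-one-out) estimates.
    jack = [mean(data[:i] + data[i + 1:]) for i in range(n)]
    jbar = mean(jack)
    num = sum((jbar - j) ** 3 for j in jack)
    den = 6 * sum((jbar - j) ** 2 for j in jack) ** 1.5
    a = num / den if den else 0.0

    # Adjusted percentile levels for the lower and upper bounds.
    z_lo = nd.inv_cdf((1 - level) / 2)
    z_hi = nd.inv_cdf(1 - (1 - level) / 2)
    a_lo = nd.cdf(z0 + (z0 + z_lo) / (1 - a * (z0 + z_lo)))
    a_hi = nd.cdf(z0 + (z0 + z_hi) / (1 - a * (z0 + z_hi)))

    lo = boot[min(n_boot - 1, max(0, int(a_lo * n_boot)))]
    hi = boot[min(n_boot - 1, max(0, int(a_hi * n_boot)))]
    return lo, hi

# Example with synthetic per-participant means (not study data):
rng = random.Random(42)
sample = [rng.gauss(1.0, 0.5) for _ in range(40)]
low, high = bca_ci(sample, rng=random.Random(1))
```

In practice, library routines such as SciPy's `bootstrap` with `method='BCa'` or R's `boot.ci` perform the same construction with more robust handling of edge cases.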
