PUMA publications for /user/frankheyenhttps://puma.ub.uni-stuttgart.de/user/frankheyenPUMA RSS feed for /user/frankheyen2024-03-19T02:45:12+01:00Visual Overviews for Sheet Music Structurehttps://puma.ub.uni-stuttgart.de/bibtex/21a5a2f9c38c1681813ec4543f0020088/frankheyenfrankheyen2024-01-05T17:11:56+01:002023 cybervalley myown peerreviewed sfbtrr161 vis(us) visus visus:heyenfk visus:ngoqh visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/11380e23419fe4dd89f3a9f38d4831cd3/author/0"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Quynh Quang Ngo" itemprop="url" href="/person/11380e23419fe4dd89f3a9f38d4831cd3/author/1"><span itemprop="name">Q. Ngo</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/11380e23419fe4dd89f3a9f38d4831cd3/author/2"><span itemprop="name">M. Sedlmair</span></a></span></span>. </span><span class="additional-entrytype-information"><span itemtype="http://schema.org/Book" itemscope="itemscope" itemprop="isPartOf"><em><span itemprop="name">Proceedings of the 24th International Society for Music Information Retrieval Conference (ISMIR)</span>, </em></span><em>page <span itemprop="pagination">692-699</span>. 
</em><em><span itemprop="publisher">ISMIR</span>, </em>(<em><span>December 2023<meta content="December 2023" itemprop="datePublished"/></span></em>)</span>Fri Jan 05 17:11:56 CET 2024Proceedings of the 24th International Society for Music Information Retrieval Conference (ISMIR)dec692-699Visual Overviews for Sheet Music Structure20232023 cybervalley myown peerreviewed sfbtrr161 vis(us) visus visus:heyenfk visus:ngoqh visus:sedlmaml We propose different methods for alternative representation and visual augmentation of sheet music that help users gain an overview of general structure, repeating patterns, and the similarity of segments. To this end, we explored mapping the overall similarity between sections or bars to colors. For these mappings, we use dimensionality reduction or clustering to assign similar segments to similar colors and vice versa. To provide a better overview, we further designed simplified music notation representations, including hierarchical and compressed encodings. These overviews allow users to display whole pieces more compactly on a single screen without clutter and to find and navigate to distant segments more quickly. Our preliminary evaluation with guitarists and tablature shows that our design supports users in tasks such as analyzing structure, finding repetitions, and determining the similarity of specific segments to others.More materials here: https://ismir2023program.ismir.net/poster_216.html
and here: https://visvar.github.io/pub/heyen2023visual.htmlAugmented Reality Visualization for Musical Instrument Learninghttps://puma.ub.uni-stuttgart.de/bibtex/2638a2d8035200275395844e08dfd303e/frankheyenfrankheyen2023-05-25T12:27:46+02:00cybervalley from:frankheyen myown vis(us) visus visus:heyenfk visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/170bd3606f5afaaac25e194d3bae332f2/author/0"><span itemprop="name">F. Heyen</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/170bd3606f5afaaac25e194d3bae332f2/author/1"><span itemprop="name">M. Sedlmair</span></a></span></span>. </span><span class="additional-entrytype-information"><em>International Society for Music Information Retrieval Conference 2022 Late-Breaking Demo, </em>(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Thu May 25 12:27:46 CEST 2023International Society for Music Information Retrieval Conference 2022 Late-Breaking DemoAugmented Reality Visualization for Musical Instrument Learning2022cybervalley from:frankheyen myown vis(us) visus visus:heyenfk visus:sedlmaml We contribute two design studies for augmented reality visualizations that support learning musical instruments. First, we designed simple, glanceable encodings for drum kits, which we display through a projector. As the second instrument, we chose the guitar and designed visualizations to be displayed either on a screen as an augmented mirror or on an optical see-through AR headset. These modalities allow us to also show information around the instrument and in 3D.
We evaluated our prototypes through case studies; the results demonstrate their general effectiveness and reveal design-related and technical limitations.A Web-Based MIDI Controller for Music Live Codinghttps://puma.ub.uni-stuttgart.de/bibtex/206522384b647489d8635e5fc1fac05e6/frankheyenfrankheyen2023-05-25T12:21:25+02:00cybervalley from:frankheyen myown vis(us) visus visus:heyenfk visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/10517945c7d658bab11c41c76071d6bfb/author/0"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Dilara Aygün" itemprop="url" href="/person/10517945c7d658bab11c41c76071d6bfb/author/1"><span itemprop="name">D. Aygün</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/10517945c7d658bab11c41c76071d6bfb/author/2"><span itemprop="name">M. Sedlmair</span></a></span></span>. </span><span class="additional-entrytype-information"><em>International Society for Music Information Retrieval Conference 2022 Late-Breaking Demo, </em>(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Thu May 25 12:21:25 CEST 2023International Society for Music Information Retrieval Conference 2022 Late-Breaking Demo A Web-Based MIDI Controller for Music Live Coding2022cybervalley from:frankheyen myown vis(us) visus visus:heyenfk visus:sedlmaml We contribute an interactive visual frontend to live coding environments, which allows live coders and performers to influence the behavior of their code more quickly and efficiently. Users can trigger actions and change parameters via instruments, buttons, and sliders, instead of only inside the code.
For instance, toggling a loop or controlling a fading effect through mouse or touch interaction on a screen is faster than editing code. While this kind of control has already been possible with hardware MIDI devices, we provide a more accessible, easy-to-use, and customizable alternative that only requires a web browser. With examples, we show how users perform live-coded music faster and more easily with our design compared to using pure code.Touching data with PropellerHandhttps://puma.ub.uni-stuttgart.de/bibtex/23ee24270cbfeeeda24d47639eb429e10/frankheyenfrankheyen2022-08-05T10:32:29+02:00from:frankheyen intcdc myown peerreviewed rp4 visus visus:achberar visus:heyenfk visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Alexander Achberger" itemprop="url" href="/person/1e69f77a4eb5dfbd499f83cc13f62f6b8/author/0"><span itemprop="name">A. Achberger</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/1e69f77a4eb5dfbd499f83cc13f62f6b8/author/1"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Kresimir Vidackovic" itemprop="url" href="/person/1e69f77a4eb5dfbd499f83cc13f62f6b8/author/2"><span itemprop="name">K. Vidackovic</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/1e69f77a4eb5dfbd499f83cc13f62f6b8/author/3"><span itemprop="name">M. Sedlmair</span></a></span></span>. 
</span><span class="additional-entrytype-information"><span itemtype="http://schema.org/PublicationIssue" itemscope="itemscope" itemprop="isPartOf"><em><span itemprop="journal">Journal of Visualization</span>, </em> </span>(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Fri Aug 05 10:32:29 CEST 2022Journal of VisualizationTouching data with PropellerHand2022from:frankheyen intcdc myown peerreviewed rp4 visus visus:achberar visus:heyenfk visus:sedlmaml Immersive analytics often takes place in virtual environments that promise users immersion. To fulfill this promise, sensory feedback, such as haptics, is an important component, which, however, is not yet well supported. Existing haptic devices are often expensive, stationary, or occupy the user’s hand, preventing them from grasping objects or using a controller. We propose PropellerHand, an ungrounded hand-mounted haptic device with two rotatable propellers that allows exerting forces on the hand without obstructing hand use. PropellerHand can simulate feedback such as weight and torque by generating thrust of up to 11 N in 2-DOF and a torque of 1.87 Nm in 2-DOF. Its design builds on our experience from quantitative and qualitative experiments with different form factors and parts. We evaluated our prototype through a qualitative user study in various VR scenarios that required participants to manipulate virtual objects in different ways, while changing between torques and directional forces. Results show that PropellerHand improves users’ immersion in virtual reality.
Additionally, we conducted a second user study in the field of immersive visualization to investigate the potential benefits of PropellerHand there.AR Hero: Generating Interactive Augmented Reality Guitar Tutorialshttps://puma.ub.uni-stuttgart.de/bibtex/2213ed3da050b9778b01b6c34e9c0690c/frankheyenfrankheyen2022-03-27T18:06:49+02:00exc2075 from:frankheyen myown peerreviewed visus:heyenfk visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Lucchas Ribeiro Skreinig" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/0"><span itemprop="name">L. Skreinig</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Ana Stanescu" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/1"><span itemprop="name">A. Stanescu</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Shohei Mori" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/2"><span itemprop="name">S. Mori</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/3"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Peter Mohr" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/4"><span itemprop="name">P. Mohr</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/5"><span itemprop="name">M. 
Sedlmair</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Dieter Schmalstieg" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/6"><span itemprop="name">D. Schmalstieg</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Denis Kalkofen" itemprop="url" href="/person/1642477c3aabc971a9d5db8bf800f1fc8/author/7"><span itemprop="name">D. Kalkofen</span></a></span></span>. </span><span class="additional-entrytype-information">(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Sun Mar 27 18:06:49 CEST 20222022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)395-401AR Hero: Generating Interactive Augmented Reality Guitar Tutorials2022exc2075 from:frankheyen myown peerreviewed visus:heyenfk visus:sedlmaml We introduce a system capable of generating interactive Augmented Reality guitar tutorials by parsing common digital guitar tablature and by capturing the performance of an expert using a multi-camera array. Instructions are presented to the user in an Augmented Reality application using either an abstract visualization, a 3D virtual hand, or a 3D video. 
To support individual users at different skill levels, the system provides full control of the playback of a tutorial, including its speed and looping behavior, while delivering live feedback on the user’s performance.Data-Driven Visual Reflection on Music Instrument Practicehttps://puma.ub.uni-stuttgart.de/bibtex/2971bca6b6f9e2a5babc80a465c6d4d28/frankheyenfrankheyen2022-03-27T17:58:05+02:00cybervalley exc2075 from:frankheyen myown peerreviewed vis(us) visus visus:heyenfk visus:kurzhako visus:ngoqh visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/12ff37d82b95304fdc262405d0619d58a/author/0"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Quynh Quang Ngo" itemprop="url" href="/person/12ff37d82b95304fdc262405d0619d58a/author/1"><span itemprop="name">Q. Ngo</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Kuno Kurzhals" itemprop="url" href="/person/12ff37d82b95304fdc262405d0619d58a/author/2"><span itemprop="name">K. Kurzhals</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/12ff37d82b95304fdc262405d0619d58a/author/3"><span itemprop="name">M. Sedlmair</span></a></span></span>. 
</span><span class="additional-entrytype-information"><span itemtype="http://schema.org/Book" itemscope="itemscope" itemprop="isPartOf"><em><span itemprop="name">ACM CHI Workshop on Intelligent Music Interfaces (IMI)</span>, </em></span><em><span itemprop="publisher">ACM</span>, </em>(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Sun Mar 27 17:58:05 CEST 2022ACM CHI Workshop on Intelligent Music Interfaces (IMI)Data-Driven Visual Reflection on Music Instrument Practice2022cybervalley exc2075 from:frankheyen myown peerreviewed vis(us) visus visus:heyenfk visus:kurzhako visus:ngoqh visus:sedlmaml We propose a data-driven approach to music instrument practice that allows studying patterns and long-term trends through visualization. Inspired by life logging and fitness tracking, we imagine musicians to record their practice sessions over the span of months or years. The resulting data in the form of MIDI or audio recordings can then be analyzed sporadically to track progress and guide decisions. Toward this vision, we started exploring various visualization designs together with a group of nine guitarists, who provided us with data and feedback over the course of three months.Immersive Visual Analysis of Cello Bow Movementshttps://puma.ub.uni-stuttgart.de/bibtex/2bcd2098cb3222919467008a20016396a/frankheyenfrankheyen2022-03-25T18:51:01+01:00cybervalley exc2075 from:frankheyen myown peerreviewed vis(us) visus visus:heyenfk visus:riglinsn visus:sedlmaml <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/181b3bbb33cf7cd8d64b6dca2e164be14/author/0"><span itemprop="name">F. 
Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Yannik Kohler" itemprop="url" href="/person/181b3bbb33cf7cd8d64b6dca2e164be14/author/1"><span itemprop="name">Y. Kohler</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Sebastian Triebener" itemprop="url" href="/person/181b3bbb33cf7cd8d64b6dca2e164be14/author/2"><span itemprop="name">S. Triebener</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Sebastian Rigling" itemprop="url" href="/person/181b3bbb33cf7cd8d64b6dca2e164be14/author/3"><span itemprop="name">S. Rigling</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/181b3bbb33cf7cd8d64b6dca2e164be14/author/4"><span itemprop="name">M. Sedlmair</span></a></span></span>. </span><span class="additional-entrytype-information">(<em><span>2022<meta content="2022" itemprop="datePublished"/></span></em>)</span>Fri Mar 25 18:51:01 CET 2022Immersive Visual Analysis of Cello Bow Movements2022cybervalley exc2075 from:frankheyen myown peerreviewed vis(us) visus visus:heyenfk visus:riglinsn visus:sedlmaml We propose a 3D immersive visualization environment for analyzing the right hand movements of a cello player. To achieve this, we track the position and orientation of the cello bow and record audio. As movements mostly occur in a shallow volume and the motion is therefore mostly two-dimensional, we use the third dimension to encode time. Our concept further explores various mappings from motion and audio data to spatial and other visual attributes. 
We work in close cooperation with a cellist and plan to evaluate our prototype through a user study with a group of cellists in the near future.ClaVis: An Interactive Visual Comparison System for Classifiershttps://puma.ub.uni-stuttgart.de/bibtex/2b94a323bb1481f5b322255b9bf245115/frankheyenfrankheyen2020-10-20T13:38:55+02:00EXC2075 from:frankheyen myown peerReviewed sfbtrr161 simtech vis(us) visus visus:heyenfk visus:munzta visus:sedlmaml visus:weiskopf <span data-person-type="author" class="authorEditorList "><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Frank Heyen" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/0"><span itemprop="name">F. Heyen</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Tanja Munz" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/1"><span itemprop="name">T. Munz</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Neumann" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/2"><span itemprop="name">M. Neumann</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Daniel Ortega" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/3"><span itemprop="name">D. Ortega</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Ngoc Thang Vu" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/4"><span itemprop="name">N. Vu</span></a></span>, </span><span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Daniel Weiskopf" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/5"><span itemprop="name">D. 
Weiskopf</span></a></span>, </span> and <span><span itemtype="http://schema.org/Person" itemscope="itemscope" itemprop="author"><a title="Michael Sedlmair" itemprop="url" href="/person/15bcaeedc1ee8e009a637fa403fc78357/author/6"><span itemprop="name">M. Sedlmair</span></a></span></span>. </span><span class="additional-entrytype-information"><span itemtype="http://schema.org/Book" itemscope="itemscope" itemprop="isPartOf"><em><span itemprop="name">Proceedings of the International Conference on Advanced Visual Interfaces</span>, </em></span><em>New York, NY, USA, </em><em><span itemprop="publisher">Association for Computing Machinery</span>, </em>(<em><span>2020<meta content="2020" itemprop="datePublished"/></span></em>)</span>Tue Oct 20 13:38:55 CEST 2020New York, NY, USAProceedings of the International Conference on Advanced Visual InterfacesAVI '20ClaVis: An Interactive Visual Comparison System for Classifiers2020EXC2075 from:frankheyen myown peerReviewed sfbtrr161 simtech vis(us) visus visus:heyenfk visus:munzta visus:sedlmaml visus:weiskopf We propose ClaVis, a visual analytics system for comparative analysis of classification models. ClaVis allows users to visually compare the performance and behavior of tens to hundreds of classifiers trained with different hyperparameter configurations. Our approach is plugin-based and classifier-agnostic and allows users to add their own datasets and classifier implementations. It provides multiple visualizations, including a multivariate ranking, a similarity map, a scatterplot that reveals correlations between parameters and scores, and a training history chart. We demonstrate the effectiveness of our approach in multiple case studies for training classification models in the domain of natural language processing.
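The ClaVis abstract mentions a multivariate ranking and a similarity map among its views. As a rough illustration of the kind of computation behind such views (not the paper's actual implementation — classifier names, predictions, and measures below are invented for the sketch), one can rank classifiers by a test score and derive a distance matrix from pairwise prediction disagreement, which could then be projected into a 2D similarity map:

```python
# Toy sketch: ranking and pairwise disagreement for classifier comparison.
# All classifier names and prediction vectors are hypothetical.

# Ground-truth labels of a shared test set.
y_true = [0, 1, 1, 0, 1, 0, 1, 1]

# Predictions of classifiers trained with different hyperparameters.
predictions = {
    "svm_C=0.1": [0, 1, 0, 0, 1, 0, 1, 1],
    "svm_C=10":  [0, 1, 1, 0, 1, 1, 1, 1],
    "tree_d=3":  [1, 1, 1, 0, 1, 0, 0, 1],
}

def accuracy(y_pred, y_true):
    """Fraction of test samples classified correctly."""
    return sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)

def disagreement(a, b):
    """Fraction of test samples on which two classifiers differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Ranking view: classifiers ordered by test accuracy (best first).
ranking = sorted(predictions,
                 key=lambda name: accuracy(predictions[name], y_true),
                 reverse=True)

# Distance matrix for a similarity map (e.g., as input to MDS or t-SNE).
names = list(predictions)
distances = {(a, b): disagreement(predictions[a], predictions[b])
             for a in names for b in names}

print(ranking)                               # worst classifier ranks last
print(distances[("svm_C=0.1", "svm_C=10")])  # 2 of 8 predictions differ
```

Disagreement treats two classifiers as similar when they make the same decisions, regardless of whether those decisions are correct, so two models with identical accuracy can still land far apart on the map.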