D. Holzmüller. Software (2021). Related to: David Holzmüller. On the Universality of the Double Descent Peak in Ridgeless Regression. International Conference on Learning Representations, 2021. arXiv: 2010.01851.
D. Holzmüller. Software (2022). Related to: David Holzmüller and Dirk Pflüger. Fast Sparse Grid Operations using the Unidirectional Principle: A Generalized and Unified Framework. Sparse Grids and Applications - Munich 2018 (2021). doi: 10.1007/978-3-030-81362-8_4.
D. Holzmüller and I. Steinwart. Software (2022). Related to: David Holzmüller and Ingo Steinwart. Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, 2020. arXiv: 2002.04861.
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software (2023). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2023. arXiv: 2203.09410.
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software (2022). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2022. arXiv: 2203.09410.
S. Höpfl. Dataset (2024). Related to: S. Höpfl, M. Albadry, U. Dahmen, K.-H. Herrmann, E. M. Kindler, M. König, J. R. Reichenbach, H.-M. Tautenhahn, W. Wei, W.-T. Zhao, and N. E. Radde. "Bayesian modeling of time series data (BayModTS) - a FAIR workflow to process sparse and highly variable data." In: Bioinformatics (Oxford, England) (2024). doi: 10.1093/bioinformatics/btae312.