M. Hansmann, M. Kohler, and H. Walk. Annals of the Institute of Statistical Mathematics, (2019). Correction to: https://doi.org/10.1007/s10463-018-0674-9.
K. Herrmann, W. Zhao, S. Höpfl, and J. Reichenbach. Dataset, (2024). Related to: S. Höpfl, M. Albadry, U. Dahmen, K.-H. Herrmann, E. M. Kindler, M. König, J. R. Reichenbach, H.-M. Tautenhahn, W. Wei, W.-T. Zhao, and N. E. Radde. “Bayesian modeling of time series data (BayModTS) - a FAIR workflow to process sparse and highly variable data.” In: Bioinformatics (Oxford, England) (2024). doi: 10.1093/bioinformatics/btae312.
D. Holzmüller. Software, (2021). Related to: David Holzmüller. On the Universality of the Double Descent Peak in Ridgeless Regression. International Conference on Learning Representations, 2021. arXiv: 2010.01851.
D. Holzmüller. Software, (2022). Related to: David Holzmüller and Dirk Pflüger. Fast Sparse Grid Operations using the Unidirectional Principle: A Generalized and Unified Framework. Sparse Grids and Applications - Munich 2018 (2021). doi: 10.1007/978-3-030-81362-8_4.
D. Holzmüller and I. Steinwart. Software, (2022). Related to: David Holzmüller and Ingo Steinwart. Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, 2020. arXiv: 2002.04861.
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software, (2023). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2023. arXiv: 2203.09410.
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software, (2022). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2022. arXiv: 2203.09410.
S. Höpfl. Dataset, (2024). Related to: S. Höpfl, M. Albadry, U. Dahmen, K.-H. Herrmann, E. M. Kindler, M. König, J. R. Reichenbach, H.-M. Tautenhahn, W. Wei, W.-T. Zhao, and N. E. Radde. “Bayesian modeling of time series data (BayModTS) - a FAIR workflow to process sparse and highly variable data.” In: Bioinformatics (Oxford, England) (2024). doi: 10.1093/bioinformatics/btae312.
M. Nonnenmacher, D. Reeb, and I. Steinwart. Machine Learning and Knowledge Discovery in Databases: Research Track, volume 3 of Lecture Notes in Computer Science, pages 87–102. Berlin, Springer, (2021).
I. Steinwart and A. Christmann. Advances in Neural Information Processing Systems 21: 22nd Annual Conference on Neural Information Processing Systems 2008, 3, pages 1569–1576. Red Hook, NY, Curran Associates Inc., (2009).
I. Steinwart and A. Christmann. Advances in Neural Information Processing Systems 22: 23rd Annual Conference on Neural Information Processing Systems 2009, 3, pages 1768–1776. Red Hook, NY, Curran, (2010).
I. Steinwart, J. Theiler, and D. Llamocca. 2010 IEEE International Geoscience and Remote Sensing Symposium, 5, pages 3732–3735. Piscataway, NJ, IEEE, (2010).
P. Thomann, I. Blaschzyk, M. Meister, and I. Steinwart. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, 54, pages 1329–1337. Red Hook, NY, Curran, (2017).
V. Wagner, S. Höpfl, V. Klingel, M. Pop, and N. Radde. 9th IFAC Conference on Foundations of Systems Biology in Engineering FOSBE 2022, 55(23), pages 86–91. Elsevier, (2022).
V. Zaverkin, D. Holzmüller, L. Bonfirraro, and J. Kästner. Dataset, (2023). Related to: Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, and Johannes Kästner. Transfer learning for chemically accurate interatomic neural network potentials, Phys. Chem. Chem. Phys., 2023, 25, 5383–5396. doi: 10.1039/D2CP05793J.
V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner. Software, (2021). Related to: V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” J. Chem. Theory Comput. 17, 6658–6670 (2021). doi: 10.1021/acs.jctc.1c00527.