D. Holzmüller, L. Grinsztajn, and I. Steinwart. Software, (2024). Related to: David Holzmüller, Léo Grinsztajn, and Ingo Steinwart. Better by Default: Strong Pre-Tuned MLPs and Boosted Trees on Tabular Data, 2024. arXiv: 2407.04491.
M. Haas, D. Holzmüller, U. von Luxburg, and I. Steinwart. NIPS '23: Proceedings of the 37th International Conference on Neural Information Processing Systems, pages 20763-20826. Association for Computing Machinery, (2024).
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software, (2023). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2023. arXiv: 2203.09410.
D. Holzmüller and I. Steinwart. Software, (2022). Related to: David Holzmüller and Ingo Steinwart. Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, 2020. arXiv: 2002.04861.
D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart. Software, (2022). Related to: David Holzmüller, Viktor Zaverkin, Johannes Kästner, and Ingo Steinwart. A Framework and Benchmark for Deep Batch Active Learning for Regression, 2022. arXiv: 2203.09410.