
Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

David Holzmüller and Ingo Steinwart. Software, 2022. Related to: David Holzmüller and Ingo Steinwart. Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent, 2020. arXiv: 2002.04861.
DOI: 10.18419/darus-2978

Abstract

This data set contains code used to generate figures and tables in our paper "Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent". The code is also available on GitHub. Information on the code and installation instructions can be found in the file README.md.
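The data set itself only packages the figure- and table-generating code; the sketch below is not part of that code. Purely as an illustration of the kind of model named in the title, it trains a two-layer ReLU network with full-batch gradient descent on a toy 1D regression problem in plain NumPy. All hyperparameters, the initialization, and the target function are made-up choices, not taken from the paper.

```python
# Illustrative only (not the published code): a two-layer ReLU network
# f(x) = sum_k a_k * relu(w_k * x + b_k) trained by full-batch gradient
# descent on a toy 1D least-squares regression problem.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n inputs in [-1, 1] with a smooth target function.
n = 64
X = rng.uniform(-1.0, 1.0, size=n)
y = np.sin(np.pi * X)

# Network width and parameter initialization (illustrative choices).
m = 50
w = rng.normal(size=m)          # first-layer weights
b = rng.normal(size=m)          # first-layer biases
a = rng.normal(size=m) / m      # second-layer weights

lr, steps = 0.05, 2000
for step in range(steps):
    pre = np.outer(X, w) + b            # (n, m) pre-activations
    act = np.maximum(pre, 0.0)          # ReLU activations
    pred = act @ a                      # network outputs, shape (n,)
    resid = pred - y                    # residuals of the squared loss

    # Gradients of the loss 0.5/n * sum(resid**2).
    grad_a = act.T @ resid / n
    d_act = np.outer(resid, a) * (pre > 0.0)   # backprop through the ReLU
    grad_w = (d_act * X[:, None]).sum(axis=0) / n
    grad_b = d_act.sum(axis=0) / n

    # Full-batch gradient descent update.
    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

    if step % 500 == 0:
        print(f"step {step:4d}  mse {np.mean(resid**2):.5f}")
```

For the experiments actually reported in the paper, refer to the packaged code and its README.md rather than this sketch.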
