Hierarchical Gradient-Based Optimization with B-Splines on Sparse Grids
J. Valentin and D. Pflüger. Sparse Grids and Applications - Stuttgart 2014, volume 109 of Lecture Notes in Computational Science and Engineering, pages 315--336. Springer (March 2016)
DOI: 10.1007/978-3-319-28262-6_13
Abstract
Optimization algorithms typically perform a series of function evaluations to
find an approximation of an optimal point of the objective function.
Evaluations can be expensive, e.g., if they depend on the results of a complex
simulation. When dealing with higher-dimensional functions, the curse of
dimensionality increases the difficulty of the problem rapidly and prohibits a
regular sampling. Instead of directly optimizing the objective function, we
replace it with a sparse grid interpolant, saving valuable function
evaluations. We generalize the standard piecewise linear basis to hierarchical
B-splines, making the sparse grid surrogate smooth enough to enable
gradient-based optimization methods. Also, we use an uncommon refinement
criterion due to Novak and Ritter to generate an appropriate sparse grid
adaptively. Finally, we evaluate the new method for various artificial and
real-world examples.
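The core idea of the abstract — replace an expensive objective with a smooth B-spline interpolant and run a gradient-based optimizer on the cheap surrogate — can be sketched in one dimension with SciPy. This is a simplified stand-in, not the paper's method: the paper uses hierarchical B-splines on adaptively refined sparse grids (with the Novak–Ritter criterion), whereas this toy uses a regular 1D grid; the `objective` function and all names below are illustrative.

```python
import numpy as np
from scipy.interpolate import make_interp_spline
from scipy.optimize import minimize

# Toy stand-in for an expensive objective (e.g., a costly simulation).
def objective(x):
    return (x - 0.3)**2 + 0.1 * np.sin(10 * x)

# Sample the objective on a coarse regular grid. The paper instead builds
# an adaptive sparse grid to avoid the curse of dimensionality.
xs = np.linspace(0.0, 1.0, 17)
ys = objective(xs)

# Cubic B-spline interpolant: a smooth surrogate with exact derivatives,
# which is what makes gradient-based optimization applicable.
surrogate = make_interp_spline(xs, ys, k=3)
gradient = surrogate.derivative()

# Optimize the surrogate (cheap), not the objective (expensive).
res = minimize(lambda x: float(surrogate(x[0])),
               x0=[0.5],
               jac=lambda x: [float(gradient(x[0]))],
               bounds=[(0.0, 1.0)],
               method="L-BFGS-B")
print(res.x)  # approximate minimizer of the surrogate
```

Every evaluation inside the optimizer loop hits only the spline, so the number of true objective evaluations is fixed by the grid size, independent of how many iterations the optimizer takes.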
@inproceedings{valentin2016hierarchical,
abstract = {Optimization algorithms typically perform a series of function evaluations to
find an approximation of an optimal point of the objective function.
Evaluations can be expensive, e.g., if they depend on the results of a complex
simulation. When dealing with higher-dimensional functions, the curse of
dimensionality increases the difficulty of the problem rapidly and prohibits a
regular sampling. Instead of directly optimizing the objective function, we
replace it with a sparse grid interpolant, saving valuable function
evaluations. We generalize the standard piecewise linear basis to hierarchical
B-splines, making the sparse grid surrogate smooth enough to enable
gradient-based optimization methods. Also, we use an uncommon refinement
criterion due to Novak and Ritter to generate an appropriate sparse grid
adaptively. Finally, we evaluate the new method for various artificial and
real-world examples.},
author = {Valentin, Julian and Pfl{\"u}ger, Dirk},
biburl = {https://puma.ub.uni-stuttgart.de/bibtex/214dfe28fc34bcddc44b60b4a97e6da64/ipvs-sc},
booktitle = {Sparse Grids and Applications - Stuttgart 2014},
cr-category = {G.1.6 Numerical Analysis Optimization},
department = {Universit{\"a}t Stuttgart, Institut f{\"u}r Parallele und Verteilte Systeme, Simulation gro{\ss}er Systeme},
doi = {10.1007/978-3-319-28262-6_13},
editor = {Garcke, Jochen and Pfl{\"u}ger, Dirk},
ee = {ftp://ftp.informatik.uni-stuttgart.de/pub/library/ncstrl.ustuttgart_fi/INPROC-2016-12/INPROC-2016-12.pdf,
https://dx.doi.org/10.1007/978-3-319-28262-6_13},
institution = {Universit{\"a}t Stuttgart, Fakult{\"a}t Informatik, Elektrotechnik und Informationstechnik, Germany},
  keywords = {sparse grids, optimization, B-splines},
  language = {English},
  month = {March},
pages = {315--336},
publisher = {Springer},
series = {Lecture Notes in Computational Science and Engineering},
title = {{Hierarchical Gradient-Based Optimization with B-Splines on Sparse Grids}},
  type = {Conference Paper},
url = {http://www2.informatik.uni-stuttgart.de/cgi-bin/NCSTRL/NCSTRL_view.pl?id=INPROC-2016-12&engl=0},
volume = 109,
year = 2016
}