Wednesday, April 21, 2021, 14:00, via Zoom (video conference)


Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes

A talk in the Bielefeld Stochastic Afternoon series by Steffen Dereich (Münster)

Abstract: We consider convergence of stochastic gradient descent (SGD) schemes under weak assumptions on the underlying landscape. More explicitly, we show that, on the event that the SGD stays local, the SGD converges if there are only countably many critical points or if the target function satisfies Łojasiewicz inequalities around all critical levels, as all analytic functions do. In particular, we show that for neural networks with an analytic activation function, such as softplus, sigmoid, or the hyperbolic tangent, SGD converges on the event of staying local if the random variables modelling the signal and response in the training are compactly supported.
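For orientation, the following is a minimal illustrative sketch (not taken from the talk) of one common form of the Łojasiewicz gradient inequality and of a generic SGD recursion; the symbols f, x*, C, beta, U, X_n, gamma_n and D_{n+1} are placeholder notation, not necessarily the speaker's.

  % Łojasiewicz gradient inequality near a critical point x* of f:
  % there exist C > 0, an exponent beta in [1/2, 1) and a
  % neighbourhood U of x* such that
  \[ |f(x) - f(x^*)|^{\beta} \le C \, \| \nabla f(x) \| \qquad \text{for all } x \in U. \]

  % Generic SGD recursion with step sizes gamma_n and zero-mean
  % gradient noise D_{n+1} (illustrative form only):
  \[ X_{n+1} = X_n - \gamma_n \bigl( \nabla f(X_n) + D_{n+1} \bigr). \]

Real analytic functions satisfy such an inequality around every critical point, which is how the analytic activation functions named in the abstract enter the result.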

Please contact stochana@math.uni-bielefeld.de for the meeting ID and password. (New meeting details since April 1!)


