Publication · Report · Preprint · 2021

The convergence of the Stochastic Gradient Descent (SGD): a self-contained proof

Turinici, Gabriel
Open Access English
  • Published: 26 Mar 2021
  • Publisher: Zenodo
  • Country: France
Abstract
We give here a proof of the convergence of the Stochastic Gradient Descent (SGD) in a self-contained manner.
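To make the setting concrete, here is a minimal numerical sketch of SGD (not taken from the paper) on a simple quadratic objective f(x) = (1/N) Σᵢ (x − aᵢ)²/2, whose minimizer is the mean of the data. The decreasing step size 1/k satisfies the classical Robbins–Monro conditions (Σ ηₖ = ∞, Σ ηₖ² < ∞) under which convergence proofs of this kind typically operate; the data values and seed are illustrative choices, not the paper's.

```python
import random

# Illustrative SGD sketch (assumed example, not the paper's setting).
# Objective: f(x) = (1/N) * sum_i (x - a_i)^2 / 2, minimized at mean(a).
# Each step uses the gradient of one randomly drawn term: x - a_i.
random.seed(0)
data = [1.0, 2.0, 3.0, 4.0]   # minimizer is mean(data) = 2.5
x = 0.0
for k in range(1, 5001):
    a = random.choice(data)    # draw one sample uniformly
    lr = 1.0 / k               # step sizes: sum lr = inf, sum lr^2 < inf
    x -= lr * (x - a)          # stochastic gradient step
print(x)                       # iterate ends up near 2.5
```

With the 1/k schedule the iterate here reduces to the running average of the drawn samples, so it concentrates around the minimizer 2.5 as the number of steps grows.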
Subjects
arXiv: Computer Science::Machine Learning; Statistics::Machine Learning
free text keywords: Stochastic Gradient Descent, Neural Network, SGD, Adam, RMSprop, Gabriel TURINICI, [MATH.MATH-NA] Mathematics [math]/Numerical Analysis [math.NA], Statistics - Machine Learning, Computer Science - Machine Learning, Mathematics - Probability
References

[1] Imen Ayadi and Gabriel Turinici. Stochastic Runge-Kutta methods and adaptive SGD-G2 stochastic gradient descent, 2020. arXiv:2002.09304; ICPR 2020 paper.

[2] Gabriel Turinici. Stochastic learning control of inhomogeneous quantum ensembles. Phys. Rev. A, 100:053403, Nov 2019.

[3] Gabriel Turinici. Convergence dynamics of generative adversarial networks: the dual metric flows, 2020. arXiv:2012.10410; CADL-ICPR 2020 workshop paper.

[4] Gabriel Turinici. X-Ray Sobolev Variational Auto-Encoders, 2020. arXiv:1911.13135.
