
Stochastic optimization algorithms have become indispensable in modern machine learning. The development of theories and algorithms in modern optimization also requires tools from diverse mathematical branches, such as algebraic and differential geometry. In this dissertation, we answer several problems in stochastic optimization using a wide range of such tools. We disprove the noncommutative arithmetic-geometric mean inequality using results from noncommutative polynomial optimization. We propose new, simpler, and more efficient models and algorithms for optimization over Grassmannian and flag manifolds. We study the problem of statistical inference in gradient-free optimization and contextual bandit optimization, proving central limit theorems that allow us to construct confidence intervals. We also present several versions of the Grothendieck inequality over the skew field of quaternions.
FOS: Mathematics
