Non-asymptotic analysis of online noisy stochastic gradient descent

Riddhiman Bhattacharya & Tiefeng Jiang

Advances in Applied Probability · 2026 · https://doi.org/10.1017/apr.2026.10056

Abstract

Past research indicates that the covariance of the gradient noise in minibatch stochastic gradient descent (SGD) plays a critical role in determining its regularization properties and its escape from low-potential points. Motivated by recent work in this area, we prove universality results showing that noise classes with the same mean and covariance structure as minibatch SGD have similar properties. We mainly consider the SGD algorithm with multiplicative noise introduced in previous work (Wu et al. (2016), Int. Conf. on Machine Learning, PMLR, pp. 10367–10376), which admits a much more general noise class than minibatch SGD. We establish non-asymptotic bounds for the multiplicative SGD algorithm in the Wasserstein distance, and we show that the error term of the algorithm is approximately a scaled mean-zero Gaussian distribution at any fixed point.
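To make the setting concrete, the following is a minimal sketch of minibatch SGD on a toy least-squares problem. Everything here (the quadratic objective, dimensions, step-size schedule) is an illustrative assumption, not the paper's setup; it only shows why minibatch gradient noise is "multiplicative": its covariance depends on the current iterate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy objective f(x) = (1/2n) * ||A x - b||^2; the paper's
# setting is far more general than this quadratic example.
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

def minibatch_grad(x, batch_size=10):
    # Minibatch gradient: an unbiased estimate of the full gradient whose
    # noise covariance varies with x (i.e., multiplicative noise).
    idx = rng.choice(n, size=batch_size, replace=False)
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ x - bb) / batch_size

# Online SGD with a decaying step size (schedule chosen for illustration).
x = np.zeros(d)
for k in range(1, 2001):
    eta = 1.0 / (10 + k)
    x = x - eta * minibatch_grad(x)

# The iterate should land near the least-squares solution; the paper's
# results quantify the law of the (rescaled) error x - x_star.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_star))
```

In this quadratic example the gradient-noise covariance at x is determined by the residuals A x - b, so noise classes that match its mean and covariance structure are exactly the objects the paper's universality results compare.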


Cite this paper

https://doi.org/10.1017/apr.2026.10056


@article{bhattacharya2026,
  title        = {{Non-asymptotic analysis of online noisy stochastic gradient descent}},
  author       = {Bhattacharya, Riddhiman and Jiang, Tiefeng},
  journal      = {Advances in Applied Probability},
  year         = {2026},
  doi          = {10.1017/apr.2026.10056},
}


