Convergence and Inference of Stream Stochastic Gradient Descent, with Applications to Queueing Systems and Inventory Control
X. Z. Li et al.
Abstract
Many online optimization problems in operations research rely on data generated by Markovian systems whose dynamics depend on the decision parameters, creating both statistical dependence and biased gradient information. This paper develops a unified theory for stream stochastic gradient descent (SGD), a sample-efficient method that uses just one observation per iteration. Using Poisson-equation techniques, the authors quantify and control gradient bias and dependence, proving an optimal [Formula: see text] convergence rate and a state-of-the-art O(log T) regret bound. Beyond optimization performance, the paper introduces an online inference framework for uncertainty quantification and establishes a functional central limit theorem that underpins valid asymptotic inference. A new Wasserstein-type divergence yields verifiable conditions via coupling arguments tailored to operations-research models. Applications to queueing and inventory-control problems demonstrate how the theory translates into practical, scalable algorithms.
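The one-observation-per-iteration idea in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm: it uses a toy quadratic loss and an AR(1) chain as a stand-in for the Markovian data stream, so consecutive gradients are statistically dependent, and it maintains a Polyak-Ruppert average of the kind commonly used for asymptotic inference. All names and model choices here are illustrative assumptions.

```python
import random

def stream_sgd(steps=20000, theta0=0.0, rho=0.5, target=2.0, seed=0):
    """Minimal stream-SGD sketch (illustrative, not the paper's method):
    one observation per iteration, drawn from a Markovian (AR(1)) stream,
    so successive stochastic gradients are dependent."""
    rng = random.Random(seed)
    theta = theta0
    x = target  # current state of the data-generating Markov chain
    avg = 0.0   # Polyak-Ruppert running average of the iterates
    for t in range(1, steps + 1):
        # One transition of the underlying chain: AR(1) around `target`.
        x = target + rho * (x - target) + rng.gauss(0.0, 1.0)
        grad = theta - x           # gradient of the toy loss 0.5 * (theta - x)^2
        theta -= (1.0 / t) * grad  # O(1/t) step size, as in strongly convex SGD theory
        avg += (theta - avg) / t   # running average, used for inference in practice
    return theta, avg
```

Under the stationary distribution of the chain the gradient is unbiased at the optimum, so both the last iterate and the averaged iterate drift toward `target`; the dependence across iterations is exactly what the paper's Poisson-equation analysis is designed to control.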