Online Randomized Distributionally Robust Forecast Combination for Dependent Data
Tao Wang
Abstract
We develop an online framework for forecast combination that integrates distributional robustness with randomized weights in dependent time‐series settings. Unlike deterministic approaches that update a single weight vector, our approach models the weights as random draws from parametric families, with parameters updated sequentially to minimize a worst‐case expected loss over a Wasserstein ambiguity set centered at the empirical joint distribution of forecasts and realizations. Randomization facilitates adaptive exploration of alternative weight configurations and improves calibration by accounting for combination uncertainty, while distributional robustness provides protection against heavy tails and model misspecification. We establish finite‐sample concentration bounds for mixing processes, derive oracle‐type excess risk and online regret guarantees, and characterize the bias‐variance trade‐off induced by randomization. In addition, we show that a batch version of the estimator is asymptotically normal under standard identification and mixing conditions. Monte Carlo experiments across a variety of data‐generating processes demonstrate that the proposed method achieves lower worst‐case error, faster post‐break adjustment, and improved predictive coverage relative to existing forecast combination methods. An empirical application to macroeconomic forecasting further illustrates its robustness and effectiveness under dependence and evolving dynamics.
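To make the idea concrete, the sketch below illustrates the general recipe the abstract describes: weights drawn randomly from a parametric family (here, a softmax of Gaussian-perturbed logits, an assumption not taken from the paper), updated online against a robustified loss. The `RandomizedDROCombiner` class, its parameter names, and the additive Lipschitz-style penalty standing in for the Wasserstein worst-case term are all illustrative simplifications, not the paper's actual estimator.

```python
import numpy as np

rng = np.random.default_rng(0)


def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()


class RandomizedDROCombiner:
    """Illustrative online forecast combiner: randomized softmax weights
    plus a crude robustness penalty (NOT the paper's exact method)."""

    def __init__(self, n_models, lr=0.1, sigma=0.05, eps=0.1):
        self.theta = np.zeros(n_models)  # weight logits
        self.lr = lr        # online step size
        self.sigma = sigma  # randomization scale for the weight draw
        self.eps = eps      # ambiguity radius (penalty strength)

    def predict(self, forecasts):
        # Randomized weight draw: Gaussian perturbation of the logits.
        z = self.theta + self.sigma * rng.standard_normal(self.theta.size)
        self.w = softmax(z)
        return float(self.w @ forecasts)

    def update(self, forecasts, y):
        # Robust surrogate: squared error plus eps * |error| as a
        # stand-in for a Wasserstein worst-case (Lipschitz) correction.
        e = self.w @ forecasts - y
        grad_w = 2.0 * e * forecasts + self.eps * np.sign(e) * forecasts
        # Chain rule through the softmax: J = diag(w) - w w^T.
        J = np.diag(self.w) - np.outer(self.w, self.w)
        self.theta -= self.lr * (J @ grad_w)
```

A quick usage pattern: feed the combiner one accurate and one biased forecaster; over time the logit of the accurate model rises relative to the biased one, while the softmax Jacobian keeps the logits sum-centered.

```python
comb = RandomizedDROCombiner(n_models=2)
for t in range(300):
    y = np.sin(t / 10.0)
    f = np.array([y, y + 1.0])  # model 2 carries a unit bias
    comb.predict(f)
    comb.update(f, y)
# comb.theta[0] > comb.theta[1]: weight shifts toward the unbiased model
```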