Stochastic Maintenance for a Large Fleet of Structures
Alain Bensoussan et al.
Abstract
This paper explores stochastic maintenance for a large fleet of structures, focusing on the example of bridges. The approach replaces the degradation state of each facility with a probability distribution over the states. Even though the degradation state of each individual structure may be available, it is often more convenient to work with the proportions of structures in each degradation state. This is particularly useful when incorporating constraints such as limited maintenance budgets or desired quality levels for the overall fleet. Probability distributions are commonly used when the initial state is unknown or when degradation is observed with error (i.e., under partial observation). Here, however, the same methodology is applied in a different context: full information may be available, but using it directly can be too complex, so partial information is used to reduce that complexity. The theory of Markov Decision Processes (MDPs) provides a framework for many applications in Operations Research and Management Science, and stochastic maintenance is one of them. When the framework operates on probability distributions over states rather than on individual states, it is referred to as a mean-field MDP. In this setting, the dynamic programming methodology for MDPs is extended to the mean-field case, tailored to fleets of structures. Both value iteration and policy iteration algorithms are considered to characterize the value function and determine the optimal (randomized) control policy.
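To make the setting concrete, the following is a minimal sketch, not the paper's actual model: value iteration on a small per-structure maintenance MDP, followed by the mean-field reading in which a fleet is summarized by the vector of proportions of structures in each degradation state. All numbers (transition matrices, costs, fleet proportions) are illustrative assumptions; the paper's mean-field formulation additionally couples structures through fleet-level constraints such as a maintenance budget, which this sketch omits.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper):
# 4 degradation states (0 = like new, 3 = severely degraded),
# 2 actions (0 = do nothing, 1 = repair).
n_states, n_actions, gamma = 4, 2, 0.95

# Transition matrices P[a][s, s'] (made-up numbers).
P = np.zeros((n_actions, n_states, n_states))
P[0] = [[0.9, 0.1, 0.0, 0.0],   # do nothing: structures degrade
        [0.0, 0.8, 0.2, 0.0],
        [0.0, 0.0, 0.7, 0.3],
        [0.0, 0.0, 0.0, 1.0]]
P[1] = [[1.0, 0.0, 0.0, 0.0],   # repair: mostly reset toward new
        [0.9, 0.1, 0.0, 0.0],
        [0.8, 0.2, 0.0, 0.0],
        [0.7, 0.3, 0.0, 0.0]]

# cost[s, a]: holding cost grows with degradation; repair has a fixed cost.
cost = np.array([[0.0, 5.0],
                 [1.0, 5.0],
                 [3.0, 5.0],
                 [10.0, 5.0]])

# Value iteration: V(s) = min_a [ cost(s,a) + gamma * sum_s' P[a][s,s'] V(s') ]
V = np.zeros(n_states)
for _ in range(10_000):
    Q = cost + gamma * np.einsum('asj,j->sa', P, V)  # Q[s, a]
    V_new = Q.min(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new
policy = Q.argmin(axis=1)  # deterministic greedy policy per degradation state

# Mean-field view: the fleet is summarized by proportions mu over states;
# the expected per-structure cost of the fleet is the expectation mu @ V.
mu = np.array([0.5, 0.3, 0.15, 0.05])
fleet_cost = mu @ V
print("policy:", policy, "fleet cost:", round(float(fleet_cost), 2))
```

In the unconstrained case the mean-field value is simply linear in the proportions, as above; the interest of the mean-field MDP formulation is precisely that budget or quality constraints on the whole fleet break this decoupling and can make randomized policies optimal.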