Random-key optimizer and linearization for the quadratic multiple constraints variable-sized bin packing problem
Natalia Alves Santos et al.
Abstract
This paper addresses the Quadratic Multiple Constraints Variable-Sized Bin Packing Problem (QMC-VSBPP), a challenging combinatorial optimization problem that generalizes the classical bin packing problem by incorporating multiple capacity dimensions, heterogeneous bin types, and quadratic interaction costs between items. We propose two complementary methods that advance the current state-of-the-art. First, a linearized mathematical model is introduced to eliminate quadratic terms, enabling the use of exact solvers such as Gurobi to compute strong lower bounds, reported here for the first time for this problem. Second, we develop RKO-ACO, a continuous-domain Ant Colony Optimization algorithm within the Random-Key Optimizer framework, enhanced with adaptive Q-learning parameter control and efficient local search. Extensive computational experiments on benchmark instances show that the proposed linearized model produces significantly tighter lower bounds than the original quadratic model, while RKO-ACO consistently matches or improves upon all best-known solutions in the literature, establishing new upper bounds for large-scale instances. These results provide new reference values for future studies and demonstrate the effectiveness of evolutionary and random-key approaches for solving complex quadratic packing problems.
Highlights
• Advances the state-of-the-art for the QMC-VSBPP with exact and metaheuristic methods.
• Proposes a new linearized model, eliminating complex quadratic terms.
• Introduces lower bounds via exact solutions of the linearized model using Gurobi.
• Develops RKO-ACO: an adaptation of continuous ACO for the Random-Key Optimizer.
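In the Random-Key Optimizer framework, candidate solutions are encoded as vectors of keys in [0, 1) that a problem-specific decoder maps to feasible packings, so the metaheuristic searches a continuous space. The abstract does not detail the paper's decoder; the sketch below is a minimal illustration only, assuming a first-fit decoder that packs items in key order, with a single capacity dimension and identical bins (the actual QMC-VSBPP involves multiple capacity dimensions, heterogeneous bin types, and quadratic inter-item costs):

```python
def decode(keys, sizes, capacity):
    """Hypothetical random-key decoder: sort items by their keys,
    then pack them first-fit into identical single-dimension bins.

    keys     -- one key in [0, 1) per item (the continuous encoding)
    sizes    -- item sizes (single capacity dimension assumed)
    capacity -- capacity of every bin (identical bins assumed)
    Returns a list of bins, each a list of item indices.
    """
    # The key vector only determines the packing order of the items.
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    bins = []  # each entry: [remaining_capacity, [item indices]]
    for i in order:
        for b in bins:
            if b[0] >= sizes[i]:       # first bin with room wins
                b[0] -= sizes[i]
                b[1].append(i)
                break
        else:                          # no open bin fits: open a new one
            bins.append([capacity - sizes[i], [i]])
    return [items for _, items in bins]

# Example: keys in ascending order pack items 0,1,2 in sequence.
packing = decode([0.1, 0.2, 0.3], sizes=[2, 3, 4], capacity=5)
```

Because any key vector decodes to a feasible packing, the continuous ACO can perturb keys freely; evaluating a vector under the paper's setting would additionally sum the quadratic interaction costs of items sharing a bin.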