Impossibility of depth reduction in explainable clustering

Chengyuan Deng et al.

Information and Computation · 2026 · article · https://doi.org/10.1016/j.ic.2026.105430
ABDC rating: B
Weight: 0.50

Abstract

Over the last few years, explainable clustering has attracted considerable attention. Dasgupta et al. [ICML’20] initiated the study of the explainable k-means and k-median clustering problems, where the explanation is captured by a threshold decision tree that partitions the space at each node using axis-parallel hyperplanes. Recently, Laber et al. [Pattern Recognition’23] made a case for considering the depth of the decision tree as an additional complexity measure of interest. In this work, we prove that even when the input points lie in the Euclidean plane, any depth reduction in the explanation incurs unbounded loss in the k-means and k-median cost. Formally, we show that there exists a data set X ⊆ ℝ² for which there is a decision tree of depth k − 1 whose k-means/k-median cost matches the optimal clustering cost of X, but every decision tree of depth less than k − 1 has unbounded cost w.r.t. the optimal clustering cost. We extend our results to the k-center objective as well, albeit with weaker guarantees.
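The threshold-tree model the abstract describes can be made concrete with a small sketch. This is not the paper’s lower-bound construction, only an illustration of the model: a “chain” tree in which each node makes one axis-parallel cut, its left child is a leaf, and its right child is cut again, so k − 1 cuts (depth k − 1) yield k clusters. The function names and the toy data are our own.

```python
def chain_tree_clusters(points, thresholds, axis=0):
    """Partition 2-D points with a depth-(k-1) chain of axis-parallel cuts.

    Each internal node tests p[axis] <= t; its left child is a leaf (one
    cluster) and its right child is cut again, so k-1 cuts give k leaves.
    """
    clusters, rest = [], list(points)
    for t in thresholds:
        clusters.append([p for p in rest if p[axis] <= t])
        rest = [p for p in rest if p[axis] > t]
    clusters.append(rest)
    return clusters


def kmeans_cost(clusters):
    """Sum over clusters of squared distances to the cluster centroid."""
    cost = 0.0
    for c in clusters:
        if not c:
            continue
        cx = sum(p[0] for p in c) / len(c)
        cy = sum(p[1] for p in c) / len(c)
        cost += sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in c)
    return cost


# Three well-separated pairs on the x-axis: k = 3, so depth k - 1 = 2 cuts.
pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 0.0), (5.1, 0.0), (10.0, 0.0), (10.1, 0.0)]
clusters = chain_tree_clusters(pts, thresholds=[2.5, 7.5])
print(kmeans_cost(clusters))  # ≈ 0.015; here the tree recovers the optimal 3-clustering
```

On this easy instance a depth-2 chain tree matches the optimal 3-means clustering; the paper’s result is that for some planar data sets, no tree of depth less than k − 1 can stay within any bounded factor of the optimum.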


Cite this paper

https://doi.org/10.1016/j.ic.2026.105430

Or copy a formatted citation

@article{chengyuan2026,
  title        = {{Impossibility of depth reduction in explainable clustering}},
  author       = {Deng, Chengyuan and others},
  journal      = {Information and Computation},
  year         = {2026},
  doi          = {10.1016/j.ic.2026.105430},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight: 0.50

Balanced mode weights: F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact: 0.50 × 0.40 = 0.20
M · momentum: 0.50 × 0.15 = 0.07
V · venue signal: 0.50 × 0.05 = 0.03
R · text relevance †: 0.50 × 0.40 = 0.20

† Text relevance is shown at the placeholder value 0.50 on this detail page; for your query’s actual relevance score, open this paper from a search result.
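Assuming the breakdown above is a plain weighted sum (the per-component products shown do sum to the displayed 0.50), the score can be sketched as follows; the function name and signature are hypothetical, not part of the Arbiter system:

```python
def evidence_weight(f, m, v, r, weights=(0.40, 0.15, 0.05, 0.40)):
    """Weighted sum of the four component scores: F (citation impact),
    M (momentum), V (venue signal), R (text relevance)."""
    wf, wm, wv, wr = weights
    return wf * f + wm * m + wv * v + wr * r


# With every component at the detail-page default of 0.50:
print(evidence_weight(0.50, 0.50, 0.50, 0.50))  # 0.50, as displayed above
```

Because the weights sum to 1.00, setting every component to 0.50 yields exactly the 0.50 shown; a query-specific R would shift the total by 0.40 × (R − 0.50).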