Fuzzy Decision Trees for Explainable Brain Tumor Classification: A Comparative Study with Deep Neural Networks and Classical Binary Decision Trees

Pietro Ducange et al.

Information Systems Frontiers · 2026 · article
https://doi.org/10.1007/s10796-025-10683-2
AJG 3 · ABDC A

Abstract

Brain Tumor Classification (BTC) using Magnetic Resonance Imaging (MRI) has achieved remarkable progress through Deep Learning (DL) models, particularly Convolutional Neural Networks (CNNs). However, the opaque nature of these models raises concerns regarding explainability, which is critical in clinical decision support. To address this, most research has focused on post-hoc Explainable AI (XAI) methods that provide after-the-fact interpretations of CNN predictions. In contrast, this work investigates an inherently explainable alternative based on Fuzzy Decision Trees (FDTs), which combine the interpretability of rule-based reasoning with the expressiveness of fuzzy logic. Moreover, we enhance model transparency by integrating radiomic features that capture clinically meaningful tumor characteristics such as shape, texture, and intensity. To the best of our knowledge, this is among the first studies to apply FDTs to brain tumor classification from MRI, explicitly coupling radiomics with multi-way FDT architectures. We perform a comprehensive evaluation comparing FDTs against four state-of-the-art CNNs, namely ConvNeXt, ResNet18, ResNet50, and EfficientNetB0, as well as classical binary Decision Trees (DTs). We provide an explicit analysis of the trade-off between accuracy, complexity, and interpretability of the models. Results show that FDTs achieve competitive performance (overall F1-score ≈ 0.84) compared to the best CNN baseline (ResNet50, F1-score ≈ 0.86), while offering substantially higher explainability and interpretability. Overall, this study demonstrates that FDTs can bridge the gap between accuracy and explainability, offering a viable explainable-by-design alternative to deep learning in medical imaging. Future work will focus on validating this generalizability across different imaging domains and dataset variations.

1 citation


Cite this paper

https://doi.org/10.1007/s10796-025-10683-2

Or copy a formatted citation

@article{pietro2026,
  title        = {{Fuzzy Decision Trees for Explainable Brain Tumor Classification: A Comparative Study with Deep Neural Networks and Classical Binary Decision Trees}},
  author       = {Ducange, Pietro and others},
  journal      = {Information Systems Frontiers},
  year         = {2026},
  doi          = {10.1007/s10796-025-10683-2},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight

0.37

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact: 0.16 × 0.40 = 0.06
M · momentum: 0.53 × 0.15 = 0.08
V · venue signal: 0.50 × 0.05 = 0.03
R · text relevance †: 0.50 × 0.40 = 0.20

† Text relevance is a fixed estimate of 0.50 on the detail page; to see the actual relevance score for your query, open this paper from a search result.
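The breakdown above suggests the evidence weight is a simple weighted sum of the four component scores under the "Balanced mode" weights. A minimal sketch of that arithmetic, assuming a plain linear combination (the function name and dictionaries are illustrative, not part of any published Arbiter API):

```python
# Hypothetical recomputation of the "Balanced mode" evidence weight.
# Component scores and weights are copied from the panel above; the
# linear weighted-sum form is an assumption based on the displayed math.
WEIGHTS = {"F": 0.40, "M": 0.15, "V": 0.05, "R": 0.40}   # Balanced mode
SCORES  = {"F": 0.16, "M": 0.53, "V": 0.50, "R": 0.50}   # this paper

def evidence_weight(scores, weights):
    """Weighted sum of component scores, rounded to two decimals."""
    return round(sum(scores[k] * weights[k] for k in weights), 2)

print(evidence_weight(SCORES, WEIGHTS))  # → 0.37
```

The unrounded sum is 0.3685, which matches the displayed weight of 0.37 once rounded; note the per-component contributions shown on the page (0.06, 0.08, 0.03, 0.20) are themselves rounded before display.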