Evaluating Bibliometrics Reviews: A Practical Guide for Peer Review and Critical Reading
Anh-Duc Hoang
Abstract
This paper addresses the growing need for comprehensive guidelines for evaluating bibliometric research by providing systematic frameworks for both peer reviewers and readers, while also discussing the limitations and potential biases of bibliometric analyses. Although numerous publications offer guidance on implementing bibliometric methods, frameworks for assessing such research are notably lacking, particularly for performance analysis and science mapping. Drawing on an extensive review of bibliometric practices and methodological literature, the paper develops structured evaluation frameworks that address the complexity of modern bibliometric analysis and introduces the VALOR framework (Verification, Alignment, Logging, Overview, Reproducibility) for assessing multi-source bibliometric studies. Its key contributions include comprehensive guidelines for evaluating data selection, cleaning, and analysis processes; specific criteria for assessing conceptual, intellectual, and social structure analyses; and practical guidance for integrating performance analysis with science mapping results. By providing structured frameworks for reviewers and practical guidelines for readers to interpret and apply bibliometric insights, this work strengthens the rigor of bibliometric research evaluation while supporting more effective peer review and research planning. The paper also discusses areas for further development, including the integration of qualitative analysis with bibliometric data and the advancement of field-normalized metrics, ultimately aiming to help authors, reviewers, and readers navigate the complexities of bibliometrics and enhance the meaningfulness of bibliometric research.