MEDS: Methodology for Evaluation in Design Science

Richard Baskerville et al.

European Journal of Information Systems · 2026 · article
https://doi.org/10.1080/0960085x.2026.2627280

AJG 4 · ABDC A*
Weight
0.50

Abstract

Design Science Research (DSR) is a paradigm that centres on the development and evaluation of artefacts. While central, a proper scientific evaluation can significantly increase the work required to complete a DSR project. The high costs, workload, and delays caused by evaluations have been shown to lead to poor or absent evaluations. These findings present two paradoxes: (1) evaluation is considered essential, yet it is commonly done poorly or not at all; (2) DSR promises to improve research relevance through timely artefact development and reliability through scientific evaluation, yet poor and lengthy evaluations cause DSR to often fail to deliver on its promise. To address these paradoxes, this paper presents MEDS (Methods for Evaluation in Design Science), a four-step evaluation method that includes three component methods new to the evaluation guidance literature: MuSCoW (Must, Should, Could, Won’t), the DSR Evaluation Selection Framework, and Short-Scoping of evaluation methods. MEDS aims to address the above paradoxes and guide DSR researchers, especially novices, to achieve a programme of effective and resource-efficient DSR evaluations as an on-going component of existing DSR methodologies. By carefully planning, orchestrating, and scoping evaluation activities, MEDS delivers a rigorous evaluation programme without overwhelming researchers with a massive evaluation burden.


Cite this paper

https://doi.org/10.1080/0960085x.2026.2627280

Or copy a formatted citation

@article{baskerville2026meds,
  title   = {{MEDS: Methodology for Evaluation in Design Science}},
  author  = {Baskerville, Richard and others},
  journal = {European Journal of Information Systems},
  year    = {2026},
  doi     = {10.1080/0960085x.2026.2627280},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight

0.50

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact    0.50 × 0.40 = 0.20
M · momentum           0.50 × 0.15 = 0.07
V · venue signal       0.50 × 0.05 = 0.03
R · text relevance †   0.50 × 0.40 = 0.20

† Text relevance is estimated at 0.50 on the detail page — for your query’s actual relevance score, open this paper from a search result.
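The breakdown above can be read as a simple weighted sum of the four factor scores. As a minimal sketch, assuming the score is exactly that weighted sum (the function name `evidence_weight` and the factor keys are hypothetical labels for illustration, taken from this page's "Balanced mode" line):

```python
# Hypothetical reconstruction of the evidence-weight arithmetic shown above:
# weight = F*0.40 + M*0.15 + V*0.05 + R*0.40, with each factor scored in [0, 1].

BALANCED_WEIGHTS = {"F": 0.40, "M": 0.15, "V": 0.05, "R": 0.40}

def evidence_weight(scores: dict, weights: dict = BALANCED_WEIGHTS) -> float:
    """Weighted sum of per-factor scores (citation impact, momentum,
    venue signal, text relevance)."""
    return sum(weights[k] * scores[k] for k in weights)

# On this detail page all four factors are 0.50, giving 0.20 + 0.075 + 0.025 + 0.20:
scores = {"F": 0.50, "M": 0.50, "V": 0.50, "R": 0.50}
print(round(evidence_weight(scores), 2))  # → 0.5
```

Note that the M and V contributions (0.075 and 0.025) are displayed truncated to two decimals on the page, while the total still sums exactly to 0.50.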