Increasing Take‐Up of Social Benefits: A Meta‐Analysis of Field Experiments

Karl‐Emil Bendtsen

Journal of Policy Analysis and Management · 2026 · Article
https://doi.org/10.1002/pam.70085
AJG 3 · ABDC A

Abstract

Can reducing administrative burdens increase the take‐up of social benefits? This meta‐analysis reviews 51 field experimental studies reporting 187 treatment effect sizes. Using the administrative burden framework to compare interventions, I reclassify each intervention by the stage at which its outcome is measured (application vs. final receipt) and by whether it reduces learning demands by providing information or compliance demands by providing assistance. The results indicate that it is significantly easier to increase application rates than actual take‐up rates. On average, estimated treatment effects are about twice as large when outcomes are measured at the application stage as when they are measured at final benefit receipt. The most effective interventions are those reducing compliance demands, which are estimated to increase actual take‐up by 8.31 percentage points on average. Interventions reducing learning demands are estimated to increase actual take‐up by 3.39 percentage points on average. These findings consolidate the field experimental evidence on how to improve take‐up rates and highlight the need for further research on application stages, treatment compliance, and variation across welfare regimes.
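The abstract reports treatment effects averaged across 187 estimates. The paper's actual estimator is not described on this page; as a generic, hypothetical illustration only, a fixed-effect (inverse-variance weighted) meta-analytic average pools study-level effects by weighting each one by the inverse of its squared standard error:

```python
import math

def pooled_effect(effects, std_errors):
    """Fixed-effect (inverse-variance weighted) pooled estimate.

    Each study's effect is weighted by 1/SE^2, so more precise
    studies contribute more to the pooled mean. Returns the
    pooled estimate and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    total = sum(weights)
    estimate = sum(w * e for w, e in zip(weights, effects)) / total
    return estimate, math.sqrt(1.0 / total)

# Illustrative, made-up effect sizes in percentage points (NOT from the paper):
est, se = pooled_effect([6.0, 9.0, 10.0], [2.0, 1.0, 4.0])
```

This is a sketch of the textbook fixed-effect estimator; meta-analyses of heterogeneous field experiments typically use random-effects models instead, which add a between-study variance component to each weight.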


Cite this paper

https://doi.org/10.1002/pam.70085


@article{bendtsen2026,
  title        = {{Increasing Take-Up of Social Benefits: A Meta-Analysis of Field Experiments}},
  author       = {Bendtsen, Karl-Emil},
  journal      = {Journal of Policy Analysis and Management},
  year         = {2026},
  doi          = {10.1002/pam.70085},
}




Evidence weight: 0.50

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact: 0.50 × 0.40 = 0.20
M · momentum: 0.50 × 0.15 = 0.07
V · venue signal: 0.50 × 0.05 = 0.03
R · text relevance †: 0.50 × 0.40 = 0.20

† Text relevance is estimated at 0.50 on the detail page; for your query's actual relevance score, open this paper from a search result.
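The component lines above suggest the composite weight is a weighted sum of the four scores. Assuming that formula (the page does not state it explicitly), a minimal sketch of the Balanced-mode computation:

```python
# Balanced-mode component weights from the page:
# F = citation impact, M = momentum, V = venue signal, R = text relevance.
WEIGHTS = {"F": 0.40, "M": 0.15, "V": 0.05, "R": 0.40}

def evidence_weight(scores: dict[str, float]) -> float:
    """Combine per-component scores (each in [0, 1]) into the
    composite evidence weight, assuming a simple weighted sum."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# With every component at the detail-page placeholder of 0.50,
# the composite equals 0.50, matching the value shown above:
print(round(evidence_weight({"F": 0.5, "M": 0.5, "V": 0.5, "R": 0.5}), 2))  # 0.5
```

Because the weights sum to 1.0, a uniform component score passes through unchanged, which is why every displayed product (0.20 + 0.07 + 0.03 + 0.20) totals the headline 0.50.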