Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation

Published: Jul 31, 2017
Publisher: Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Analytic Technical Assistance and Development
Authors

Peter Z. Schochet

Charles Tilley

This study investigates how much practical difference it makes to use design-based methods rather than more conventional model-based methods to estimate impacts in education randomized controlled trials (RCTs). The study re-analyzes data from nine past education RCTs covering a wide range of evaluation designs, estimating impacts with design-based, hierarchical linear modeling (HLM), and robust standard error methods. The study finds that the design- and model-based methods yield very similar impact estimates and levels of statistical significance, especially when the underlying analytic assumptions (such as the weights used to aggregate clusters and blocks) are aligned. Moreover, the differences between the design- and model-based methods are no greater than the differences between the two model-based methods considered.
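To make the comparison concrete, the sketch below (not the authors' code) illustrates the two model-based approaches the study considers for a clustered RCT: an HLM with a random intercept for each randomized cluster, and an ordinary least squares model with cluster-robust standard errors. The data, variable names, and effect sizes are simulated assumptions for illustration only.

```python
"""Hypothetical sketch: HLM vs. cluster-robust OLS impact estimates on simulated data."""
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate a clustered RCT: schools are randomized, students are nested in schools.
n_schools, students_per_school = 40, 25
school = np.repeat(np.arange(n_schools), students_per_school)
treated = np.repeat(rng.integers(0, 2, n_schools), students_per_school)
school_effect = np.repeat(rng.normal(0, 0.3, n_schools), students_per_school)
y = 0.2 * treated + school_effect + rng.normal(0, 1, n_schools * students_per_school)
df = pd.DataFrame({"y": y, "treated": treated, "school": school})

# Model-based approach 1: HLM (random intercept for school).
hlm = smf.mixedlm("y ~ treated", df, groups=df["school"]).fit()
print("HLM impact estimate:", hlm.params["treated"], "SE:", hlm.bse["treated"])

# Model-based approach 2: OLS with cluster-robust standard errors.
ols = smf.ols("y ~ treated", df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]}
)
print("Robust-SE impact estimate:", ols.params["treated"], "SE:", ols.bse["treated"])
```

In this simulated setting the two model-based estimates typically agree closely, which parallels the study's finding that the choice among these methods (and between them and design-based estimators) usually makes little practical difference when their underlying assumptions are aligned.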