Comparing Impact Findings from Design-Based and Model-Based Methods: An Empirical Investigation
Publisher: Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Analytic Technical Assistance and Development
Jul 31, 2017
This study investigates how much practical difference it makes to use design-based methods rather than more conventional model-based methods. The study re-analyzes data from nine past education randomized controlled trials (RCTs) covering a wide range of evaluation designs, estimating impacts with design-based, hierarchical linear model (HLM), and robust standard error methods. The study finds that the design- and model-based methods yield very similar impact estimates and levels of statistical significance, especially when the underlying analytic assumptions (such as the weights used to aggregate clusters and blocks) are aligned. Moreover, the differences between the design- and model-based methods are no greater than the differences between the two model-based methods considered.
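To illustrate the weighting issue the abstract raises, the sketch below simulates a hypothetical cluster-randomized trial (it does not use the study's data or its exact estimators). A "design-based"-style estimate that weights each cluster equally is compared with a "model-based"-style pooled estimate that implicitly weights by cluster size; with aligned assumptions the two are close. All numbers (cluster counts, sizes, variance components, the 0.20 true impact) are invented for the example.

```python
import random
from statistics import mean

random.seed(0)

# Illustrative simulation only: a cluster-randomized trial with 20 schools
# of unequal enrollment and a true treatment impact of 0.20 SD.
TRUE_IMPACT = 0.20
clusters = []
for c in range(20):
    treated = c < 10                          # first 10 schools treated
    school_effect = random.gauss(0, 0.15)     # between-school variation
    size = random.randint(15, 45)             # unequal enrollments
    outcomes = [school_effect
                + (TRUE_IMPACT if treated else 0.0)
                + random.gauss(0, 1.0)
                for _ in range(size)]
    clusters.append((treated, outcomes))

# "Design-based" style estimate: difference in equally weighted school means
# (one choice of weights for aggregating clusters, as the abstract notes).
design_based = (mean(mean(y) for t, y in clusters if t)
                - mean(mean(y) for t, y in clusters if not t))

# "Model-based" style estimate: difference in pooled student-level means,
# equivalent here to OLS on a treatment indicator, weighting by enrollment.
def pooled(flag):
    return [v for t, y in clusters if t == flag for v in y]

model_based = mean(pooled(True)) - mean(pooled(False))

print(f"design-based impact: {design_based:+.3f}")
print(f"model-based  impact: {model_based:+.3f}")
```

Because the two estimators differ only in how cluster means are weighted, they produce similar impact estimates here, echoing the study's finding that discrepancies shrink when the aggregation weights are aligned.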