Using Value-Added Growth Models to Track Teacher and School Performance
Value-added methods (sometimes described as student growth models) measure school and teacher effectiveness as the contribution of a school or teacher to students’ academic growth. The methods account for students’ prior achievement levels and other background characteristics. Mathematica works with school districts, charter management organizations, state education agencies, and policymakers to develop value-added methods. Beyond implementing these methods on the ground in schools and districts, we conduct value-added research to inform policy issues such as the distribution of teacher quality and the design of teacher incentives and performance measures.
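The basic logic can be illustrated with a simple regression sketch. This is an intentionally simplified illustration with made-up data, not Mathematica's actual model: regress students' current test scores on their prior scores and a background characteristic, then treat the average residual among a teacher's students as that teacher's value-added estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 teachers with 25 students each (illustrative sizes).
n_teachers, n_students = 40, 25
teacher = np.repeat(np.arange(n_teachers), n_students)
true_effect = rng.normal(0, 0.2, n_teachers)                # true teacher contributions
prior = rng.normal(0, 1, n_teachers * n_students)           # prior-year scores
low_income = rng.binomial(1, 0.4, n_teachers * n_students)  # background covariate

# Current score = persistence of prior achievement + background
#                 + teacher contribution + student-level noise.
score = (0.7 * prior - 0.1 * low_income + true_effect[teacher]
         + rng.normal(0, 0.5, n_teachers * n_students))

# Step 1: regress current scores on prior scores and the covariate (OLS).
X = np.column_stack([np.ones_like(prior), prior, low_income])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
residual = score - X @ beta

# Step 2: a teacher's value-added estimate is the mean residual of her students.
value_added = np.array([residual[teacher == t].mean()
                        for t in range(n_teachers)])

# In this simulation the estimates track the true effects closely.
print(round(float(np.corrcoef(value_added, true_effect)[0, 1]), 2))
```

Production models are considerably richer (multiple prior years, shrinkage toward the mean, classroom-level terms), but the two-step structure above conveys why the measures isolate growth rather than achievement levels.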
Education stakeholders also use value-added measures to identify promising approaches for improving student achievement. The measures help identify high-performing schools and teachers whose students make strong gains during a school year, whether those students begin with low or high achievement. For example, using value-added data provided by Mathematica, New Leaders for New Schools studied high-performing teachers within high-performing schools to document the instructional practices these teachers use.
Because value-added measures are used to judge the relative effectiveness of schools and teachers, their statistical foundations must be solid, clearly documented, and able to withstand public scrutiny. Further, they must be communicated clearly to educators so that the findings are perceived as accurate and fair. Our value-added measures are designed to be technically sound, objective, transparent, and responsive to stakeholders’ concerns. We are known for our ability to address the methodological challenges inherent in modeling student achievement and to communicate the results with language and graphics that resonate with educators.
Recognizing the trade-offs inherent in developing value-added measures, we work with local stakeholders and do not insist on a single “correct” approach. At the same time, our experience can help inform educators about the consequences of modeling decisions. We use existing data on student test scores and demographics as much as possible, with additional district-level data incorporated as needed.
Partnering with School Districts and States
Our researchers have assessed the effectiveness of schools and teachers in public and charter school settings. In the Pittsburgh Public Schools, we have developed school performance measures and helped the district use them to identify ways to improve performance. For the District of Columbia Public Schools, we work within a tight timeline to help measure the effectiveness of schools and teachers as part of the IMPACT evaluation system. We have also provided value-added measures to the Achievement First network of charter schools, which the network uses to assess teacher effectiveness as new data become available.
Improving Value-Added Methods
Mathematica researchers have made significant contributions to methodological advances in value-added methods. For instance, a Mathematica report to the U.S. Department of Education analyzed how the amount of data used in value-added models affects the accuracy with which these models can identify high- and low-performing educators. This analysis provides a clear framework by which policymakers can determine the number of years of data on which teachers’ and schools’ evaluations should be based. In a brief for the U.S. Department of Education, we estimated teachers’ value-added and measured students’ access to the highest-performing teachers in high- and low-income schools.
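The trade-off behind the years-of-data question can be illustrated with a short simulation. This is an illustrative sketch with assumed parameters, not the report's actual analysis: if each teacher's true effect is stable and each year's measure adds independent noise, averaging over more years makes the estimate track the true effect more closely.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: stable true effects plus independent annual estimation noise.
n_teachers = 2000
true_effect = rng.normal(0.0, 0.2, n_teachers)  # sd of true effects (assumed)
noise_sd = 0.3                                  # sd of a single year's noise (assumed)

for years in (1, 3, 5):
    # One noisy measure per teacher per year, then the multi-year average.
    yearly = true_effect[:, None] + rng.normal(0, noise_sd, (n_teachers, years))
    estimate = yearly.mean(axis=1)
    # Correlation with the true effect rises as more years are averaged in.
    r = np.corrcoef(estimate, true_effect)[0, 1]
    print(years, round(float(r), 2))
```

Under these assumptions the noise variance of the average falls in proportion to the number of years, which is why evaluations based on several years of data misclassify fewer educators than single-year measures.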
Use of Value-Added Methods for National Evaluations