
Mathematica Designs Value-Added Model for the DC Public Schools

Mathematica Policy Research has worked with the District of Columbia Public Schools (DCPS) since February 2009 to develop measures of school and teacher effectiveness, based on student test score growth, using a method known as a value-added model.

What is a value-added model and why is it used to assess schools and teachers?
A value-added model is a statistical model designed to capture students’ test score growth attributable to a school or teacher, compared to the progress students would have made at the average school or with the average teacher. This method is used to measure the performance of schools, teachers, or both in many school districts, including Chicago, Dallas, Milwaukee, Minneapolis, and New York City.

A value-added model measures teachers’ contributions to students’ achievement growth and typically accounts for the effect of student background characteristics on that growth. Because a value-added model focuses on growth and accounts for students’ initial performance, it allows schools or teachers to be identified as high performers regardless of whether their students were high- or low-performing at baseline. Value-added models provide a better measure of school or teacher effectiveness than alternative measures, such as those that rely on gains in the proportion of students achieving proficiency.
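
The core idea above can be sketched in a few lines of code. This is a deliberately simplified illustration with hypothetical data, not the DCPS model itself: it predicts each student’s current score from the prior-year score, then treats a teacher’s value added as the average residual (actual minus predicted score) across that teacher’s students.

```python
# Minimal value-added sketch on hypothetical data. The real DCPS model
# also adjusts for student background characteristics, mobility, and
# co-teaching; this shows only the growth-versus-expectation logic.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# (teacher, prior-year score, current score) for hypothetical students
records = [
    ("A", 50, 58), ("A", 60, 67), ("A", 70, 78),
    ("B", 50, 53), ("B", 60, 62), ("B", 80, 83),
]

prior = [r[1] for r in records]
current = [r[2] for r in records]
a, b = fit_line(prior, current)

# A teacher's value added: average gap between actual and expected scores.
value_added = {}
for teacher in sorted({r[0] for r in records}):
    resids = [y - (a + b * x) for t, x, y in records if t == teacher]
    value_added[teacher] = sum(resids) / len(resids)

print(value_added)
```

Note that both teachers here serve students with the same range of prior scores; the model credits teacher A for above-expected growth rather than for starting with stronger students.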

What challenges are associated with using value-added methods?
Although the basic concepts associated with using a value-added model to measure school or teacher performance are straightforward, complexities arise when applying the model to data. Among these challenges are addressing student mobility across schools, accounting for co-teaching, generating measures based on a small sample of students, factoring in measurement error in the test, and comparing value-added measures across grades. To account for student mobility and co-teaching, we adapted a standard value-added model and incorporated enrollment and roster data linking students to teachers. We applied statistical techniques to address problems of small samples and measurement error. To compare teachers of different grades, we adjusted teachers’ value-added scores so that the average teacher in each grade received the same value-added score. In addition, we multiplied each teacher’s score by a grade-specific conversion factor to ensure that the dispersion of teacher value-added scores by grade was similar.
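The grade-level adjustment described above can be sketched with hypothetical numbers: center each grade’s value-added scores so the average teacher in every grade receives the same score, then apply a grade-specific conversion factor so the dispersion of scores is similar across grades (here, a common standard deviation of 1).

```python
# Hypothetical sketch of the cross-grade adjustment: center each grade's
# raw value-added scores on zero, then rescale to a common dispersion.
import statistics

scores_by_grade = {  # hypothetical raw value-added scores, by grade
    4: [1.0, 2.0, 3.0, 6.0],
    5: [10.0, 10.5, 11.0, 12.5],
}

adjusted = {}
for grade, scores in scores_by_grade.items():
    mean = statistics.mean(scores)
    sd = statistics.pstdev(scores)
    # 1/sd is the grade-specific conversion factor for the dispersion
    adjusted[grade] = [(s - mean) / sd for s in scores]

for grade, scores in adjusted.items():
    print(grade, [round(s, 2) for s in scores])
```

After the adjustment, a score of 1.0 means roughly the same thing in grade 4 as in grade 5: one standard deviation above the average teacher in that grade.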

It is important to recognize the limitations of any performance measures, including those generated by a value-added model like the one we created for DCPS. For example, as with any statistical model, there is uncertainty in the estimates produced; therefore, two teachers with similar value-added estimates may be “statistically indistinguishable” from one another. We quantified the precision with which the measures are estimated by reporting the upper and lower bounds of a confidence interval of performance for each teacher. In addition, because value-added estimates measure not only the effectiveness of the teacher but also the combined effect of all factors that affect student achievement in the classroom, some caution should be applied when comparing teachers across schools. Finally, if student assignment to teachers was based on unobservable factors—for example, pairing difficult-to-teach students with teachers who have succeeded with similar students in the past—a value-added model might unfairly penalize these teachers because it cannot statistically account for factors that cannot be measured. There is a debate among value-added researchers about how important this caveat is in practice. A recent paper by Tom Kane of Harvard University and Douglas Staiger of Dartmouth College offers some evidence suggesting that unobservable student characteristics based on student assignment do not play a large role in determining value-added scores.
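
The precision reporting described above can be illustrated with hypothetical numbers: each teacher’s estimate comes with a standard error, and a 95% confidence interval conveys the range of plausible performance. When two teachers’ intervals overlap substantially, their estimates cannot be reliably ranked against each other.

```python
# Hypothetical sketch of confidence-interval reporting for value-added
# estimates. The estimates and standard errors below are illustrative,
# not DCPS figures.

def confidence_interval(estimate, std_error, z=1.96):
    """95% confidence interval under a normal approximation."""
    return estimate - z * std_error, estimate + z * std_error

teacher_x = confidence_interval(2.0, 1.5)  # higher point estimate
teacher_y = confidence_interval(1.0, 1.5)  # lower point estimate

# The intervals overlap, so despite different point estimates the two
# teachers are statistically indistinguishable at this precision.
overlap = teacher_x[0] < teacher_y[1] and teacher_y[0] < teacher_x[1]
print(teacher_x, teacher_y, overlap)
```

With more students per teacher the standard errors shrink, the intervals narrow, and more pairs of teachers become distinguishable; this is one reason small samples are listed among the challenges above.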

What is the relationship between Mathematica Policy Research and DCPS?
DCPS aimed to incorporate measures of school and teacher effectiveness, based on student test score growth, as components of IMPACT, a new teacher assessment system. To support these efforts, DCPS asked Mathematica to design a value-added model to measure school and teacher performance in the district.

DCPS sought an objective, fair, and transparent value-added model. We developed such a model in accord with these principles. It was reviewed by two independent value-added experts, Eric Hanushek of the Hoover Institution at Stanford University and Tim Sass of Florida State University.

Because we feel there is no “one-size-fits-all” value-added model for every context, we worked with DCPS to construct a model appropriate for its district. In the course of making decisions about the value-added model, we presented DCPS with policy options that were informed by the best available research and empirical results using DCPS data. DCPS weighed the trade-offs associated with these options in the context of its goals and circumstances.

"Design of Value-Added Models for IMPACT and TEAM in DC Public Schools, 2010-2011 School Year" (May 2011)
"Measuring School and Teacher Value Added for IMPACT and TEAM in DC Public Schools" (August 2010)

More on our work on value-added models.

About Mathematica: Mathematica Policy Research, a nonpartisan research firm, provides a full range of research and data collection services, including program evaluation and policy research, survey design and data collection, research assessment and interpretation, and program performance/data management, to improve public well-being. Its clients include federal and state governments, foundations, and private-sector and international organizations. The employee-owned company, with offices in Princeton, N.J., Ann Arbor, Mich., Cambridge, Mass., Chicago, Ill., Oakland, Calif., and Washington, D.C., has conducted some of the most important studies of education, health care, international, disability, family support, employment, nutrition, and early childhood policies and programs.