Education

Education Policy Research

Scientifically based methods are the hallmark of our work evaluating education programs and studying education policy issues. Our studies cover early learning as well as education in the K-12 grades and college years, and they have provided important counsel to policymakers seeking ideas for improving American education. We have also played an important role in advancing the state of the science in education research. Read more about our work on specific education topics.


What's New


  • Highlights
  • Latest Work
  • Newsroom
  • Conferences
  • Multimedia
  • Testimony

New Education Studies Focus on Teacher Effectiveness and Access

Two new studies examine teacher effectiveness and student access to effective teaching. The first, an evaluation of the Talent Transfer Initiative, found that a $20,000 incentive for high-performing teachers to move to low-performing elementary schools helped raise math and reading test scores by 4 to 10 percentile points. The second, a study of access to effective teaching, found that disadvantaged students received less effective teaching, on average, than other students in the 29 districts examined.

Turning Around Low-Performing Schools: New Findings on School Improvement Grants

This evaluation brief, which examines school turnaround efforts funded by School Improvement Grants (SIG), finds that schools implementing a SIG-funded intervention model were more likely than schools not implementing one to report having primary responsibility for setting professional development requirements and determining the length of the school day. Fact sheet | Press release



  • "Kauffman School Evaluation Long-Term Outcomes Report: Year 2." Matthew Johnson, Eric Lundquist, Cleo Jacobs Johnson, and Claudia Gentile, March 2014. The Kauffman School is a charter school in Kansas City, Missouri that opened in 2011 to serve middle and high school students from the city’s most economically disadvantaged neighborhoods . This report evaluates the effectiveness of the School at improving student achievement, attendance, and discipline outcomes during its first two years of operation.
  • "Strategic Data Project and Education Pioneers Year 1 Report: Laying the Groundwork for Data-Driven Decision Making." Kristin Hallgren, Cassie Pickens Jewell, Celina Kamer, Jacob Hartog, Andrew Gothro, October 2013. The Bill & Melinda Gates Foundation contracted with Mathematica to conduct an implementation study of the Strategic Data Project (SDP) and Education Pioneers (EP) programs, which aim to enhance the capacity of school districts and other education agencies to collect, manage, analyze, and use data through the support, training, and placement of additional staff. This report presents findings from the study's first year (2012–2013), when partner agencies in the study began working with the SDP or EP program.
  • "Companion Document for the Strategic Data Project and Education Pioneers Year 1 Report: Laying the Groundwork for Data-Driven Decision Making." Kristin Hallgren, Cassie Pickens Jewell, Celina Kamler, Jacob Hartog, and Andrew Gothro, October 2013. The Bill & Melinda Gates Foundation contracted with Mathematica to conduct an implementation study of the Strategic Data Project (SDP) and Education Pioneers (EP) programs, which aim to enhance the capacity of school districts and other education agencies to collect, manage, analyze, and use data through the support, training, and placement of additional staff. This companion document to the full report presents seven profiles, one for each agency in the study, which describe the experience of each agency and were developed using data collected from interviews that addressed agencies' data collection, use, analysis, and reporting efforts.
  • "The Teacher-Student Data Link Project: First-Year Implementation." Kristin Hallgren, Cassie Pickens Jewell, and Celina Kamler, February 2013. The Bill & Melinda Gates Foundation contracted with Mathematica to conduct an implementation study of the Teacher-Student Data Link project, which aims to support states in developing best practices for linking student and teacher data to improve educational outcomes. This report presents findings from 2011, the project's first year of implementation.
  • "Transfer Incentives for High-Performing Teachers: Final Results from a Multisite Randomized Experiment." Steven Glazerman, Ali Protik, Bing-ru Teh, Julie Bruch, and Jeffrey Max, November 2013. This report, a multisite randomized experiment looks at the Talent Transfer Initiative, which offers a $20,000 incentive to high-performing teachers to move to low-performing schools. The intervention had positive effects on student test scores in math and reading in elementary schools—the equivalent of a 4 to 10 percentile point increase. Executive summary.
  • "Access to Effective Teaching for Disadvantaged Students." Eric Isenberg, Jeffrey Max, Philip Gleason, Liz Potamites, Robert Santillano, Heinrich Hock, and Michael Hansen, November 2013. This study explores the disparity in access to effective teachers in 29 school districts across the country, revealing that disadvantaged students receive poorer-quality instruction, on average, compared with other students. Mathematica conducted the studies for the Institute of Education Sciences. Executive summary.
  • "Measuring Teacher Value Added in DC, 2012-2013 School Year." Eric Isenberg and Elias Walsh, September 2013. This report updates the approach to estimating value-added models of teacher effectiveness in the District of Columbia Public Schools (DCPS) and eligible DC charter schools participating in Race to the Top during the 2012–2013 school year.
  • "Using Alternative Student Growth Measures for Evaluating Teacher Performance: What the Literature Says." Brian Gill, Julie Bruch, and Kevin Booker, September 2013. This report summarizes the evidence on measures of student achievement growth used in teacher evaluation that do not rely on traditional annual state assessments, and that instead use commercially available assessments, locally developed common assessments, and teacher-developed student learning objectives. Executive summary.
  • "Impacts of Five Expeditionary Learning Middle Schools on Academic Achievement." Ira Nichols-Barrer and Joshua Haimson, July 2013. In the first rigorous study of the impacts of Expeditionary Learning (EL) model schools, Mathematica found that EL middle school students perform better in reading and math than their counterparts in other public schools. Fact sheet.
  • "The Effectiveness of Secondary Math Teachers from Teach For America and the Teaching Fellows Programs." Melissa A. Clark, Hanley S. Chiang, Tim Silva, Sheena McConnell, Kathy Sonnenfeld, Anastasia Erbe, and Michael Puma, September 2013. The first large-scale, random assignment study of the effects of secondary school math teachers from Teach For America and the Teaching Fellows programs found they were as effective as, and in some cases more effective than, teachers receiving traditional certification. The study was sponsored by the U.S. Department of Education's Institute of Education Sciences. Read the brief.
  • "Improving Post-High School Outcomes for Transition-Age Students with Disabilities: An Evidence Review." R. Brian Cobb, Stephen Lipscomb, Jennifer Wolgemuth, Theresa Schulte, Abigail Veliquette, Morgen Alwell, Keriu Batchelder, Robert Bernard, Paul Hernandez, Helen Holmquist-Johnson, Rebecca Orsi, Laura Sample McMeeking, Jun Wang, and Andrea Welnberg, August 2013. This report uses evidence-based standards developed by the What Works Clearinghouse to review the research literature on programs that help students with disabilities make transitions after high school. Community-based work programs had mixed effects on employment and potentially positive effects on postsecondary education. Functional life skills development programs had potentially positive effects on independent living although the extent of evidence was small. Executive summary.
  • "Impacts of Five Expeditionary Learning Middle Schools on Academic Achievement." Ira Nichols-Barrer and Joshua Haimson, July 2013.
  • "Charter School Authorizers and Student Achievement.." Ron Zimmer, Brian Gill, Jonathon Attridge, and Kaitlin Obenauf. Education Finance and Policy, January 2014 (subscription required). This article uses data from Ohio, a state that allows a wide range of organizations to authorize charter schools, to examine the relationship between type of authorizer and charter-school effectiveness.
  • "Do KIPP Schools Boost Student Achievement?" Philip M. Gleason, Christina Clark Tuttle, Brian Gill, Ira Nichols-Barrer, and Bing-ru Teh. Education Finance and Policy, January 2014 (subscription required). This article measures the achievement impacts of 41 Knowledge Is Power Program (KIPP) charter middle schools nationwide and found consistently positive and statistically significant test-score effects in reading, math, science, and social studies.
  • "Borrowing Constraints, College Enrollment, and Delayed Entry." Matthew T. Johnson. Journal of Labor Economics, October 2013 (subscription required). This article specifies a dynamic model of education, borrowing, and work decisions of high school graduates to ascertain how increasing the amount students are permitted to borrow through government-sponsored loan programs would affect educational attainment.
  • "Statistical Power for School-Based RCTs with Binary Outcomes." Peter Z. Schochet. Journal of Research on Educational Effectiveness, June 2013 (subscription required). This article develops a new approach for calculating appropriate sample sizes for school-based randomized controlled trials (RCTs) with binary outcomes using logit models with and without baseline covariates. The theoretical analysis develops sample size formulas for clustered designs in which random assignment is at the school or teacher level using generalized estimating equation methods. The key finding is that sample sizes of 40 to 60 schools that are typically included in clustered RCTs for student test score or behavioral scale outcomes will often be insufficient for binary outcomes.
  • "Funding Special Education by Total District Enrollment: Advantages, Disadvantages, and Policy Considerations." Elizabeth Dhuey and Stephen Lipscomb. Education Finance and Policy, summer 2013 (subscription required). This policy brief aims to help policymakers, educators, and researchers better understand census funding, a special education finance model used by several states and the federal government. Under this model, aid levels are based primarily on total district enrollment and a fixed amount per student. Census funding is viewed as a cost-containment approach, but it has raised concerns about funding equity. The brief examines the key advantages and disadvantages of the model and discusses options for easing funding equity concerns. It also describes other ways in which states and districts may be able to contain special education costs while maintaining quality programs.
  • "What Are Error Rates for Classifying Teacher and School Performance Using Value-Added Models?" Peter Z. Schochet and Hanley S. Chiang. Journal of Educational and Behavioral Statistics, April 2013 (subscription required). This article addresses likely error rates for measuring teacher and school performance in the upper elementary grades using value-added models applied to student test score gain data. Using formulas based on ordinary least squares and empirical Bayes estimators, error rates for comparing a teacher’s performance to the average are likely to be about 25 percent with three years of data and 35 percent with one year of data. Corresponding error rates for overall false positive and negative errors are 10 percent and 20 percent, respectively. The results suggest that policymakers must carefully consider likely system error rates when using value-added estimates to make high-stakes decisions regarding educators.
  • "Charter High Schools' Effects on Educational Attainment and Earnings." Kevin Booker, Brian Gill, Tim Sass, and Ron Zimmer, January 2014. This issue brief discusses a new analysis, using data from Florida and Chicago, suggesting that charter high schools are not only increasing postsecondary educational attainment but may also boost students' long-run earnings.
  • “Do Disadvantaged Students Get Less Effective Teaching? Key Findings from Recent Institute of Education Sciences Studies.” Jeffrey Max and Steve Glazerman, January 2014. This IES evaluation brief helps policymakers understand emerging evidence on access to effective teaching and synthesizes findings from three peer-reviewed studies commissioned by the U.S. Department of Education’s Institute of Education Sciences. According to the brief, disadvantaged students received less effective teaching than other students, and their access to effective teaching varied across school districts. Mathematica conducted two of the studies cited in the brief, including one that measured access to the highest-performing teachers and another that measured access to effective teaching.
    Technical appendix
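
To make the sample-size caution in the Schochet power paper above concrete, the sketch below computes an approximate minimum detectable effect for a school-randomized trial with a binary outcome using the standard clustering design effect. This is an illustration only, not the paper's GEE-based derivation; the intraclass correlation, base outcome rate, and school size are assumed values.

# Minimal sketch (assumed parameters; not the GEE-based formulas in the paper):
# approximate minimum detectable effect (MDE) for a school-randomized RCT with
# a binary outcome, using the usual clustering design effect.
from scipy.stats import norm

def binary_outcome_mde(n_schools, students_per_school, p_control=0.5,
                       icc=0.15, treat_share=0.5, alpha=0.05, power=0.80):
    """Approximate MDE (difference in proportions) for school-level random assignment."""
    deff = 1 + (students_per_school - 1) * icc                    # design effect for clustering
    n_treat = n_schools * treat_share * students_per_school       # treatment-group students
    n_ctrl = n_schools * (1 - treat_share) * students_per_school  # control-group students
    var_diff = p_control * (1 - p_control) * deff * (1.0 / n_treat + 1.0 / n_ctrl)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)                  # two-sided test at the requested power
    return z * var_diff ** 0.5

for schools in (40, 60):
    mde = binary_outcome_mde(schools, students_per_school=60)
    print(schools, "schools -> MDE of roughly", round(100 * mde, 1), "percentage points")

Under these assumed values, even 60 schools leave a minimum detectable difference well above 10 percentage points, which is the kind of gap the paper cautions about for binary outcomes.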

  • "Operational Authority, Support, and Monitoring of School Turnaround." NCEE  Evaluation Brief. Rebecca Herman, Cheryl Graczewski, Susanne James-Burdumy, Matthew Murray, Irma Perez-Johnson, and Courtney Tanenbaum, December 2013. This brief focuses on the implementation of School Improvement Grants (SIG) by examining three interrelated levers for school improvement: (1) school operational authority, (2) state and district support for turnaround, and (3) state monitoring of turnaround efforts. SIG principles emphasize that school leaders should be given the autonomy to operate on matters such as staffing, calendars, and budgeting, but then also be appropriately supported and monitored by states and districts to ensure progress. Findings are based on spring 2012 survey responses from 450 school administrators and interviews with administrators in the 60 districts and 21 of the 22 states where these schools are located.

  • "After Two Years, Three Elementary Math Curricula Outperform a Fourth." NCEE Evaluation Brief. Roberto Agodini, Barbara Harris, Neil Seftor, Janine Remillard, and Melissa Thomas, September 2013. Read the final report from a large-scale, rigorous study examining how four math curricula affect achievement across two years—from first through second grades. The curricula were (1) Investigations in Number, Data, and Space; (2) Math Expressions; (3) Saxon Math; and (4) Scott Foresman-Addison Wesley Mathematics. Fact sheet.
  • "Charter High Schools' Effects on Long-Term Attainment and Earnings." Kevin Booker, Brian Gill, Tim Sass, and Ron Zimmer, January 2014.This working paper discusses a new analysis, using data from Florida and Chicago, suggesting that charter high schools are not only increasing postsecondary educational attainment but may also boost students' long-run earnings.
  • "Staffing a Low-Performing School: Behavioral Responses to Selective Teacher Transfer Incentives." Ali Protik, Steven Glazerman, Julie Bruch, and Bing-ru Teh, December 2013. This working paper examines behavioral responses to an incentive program that offers high-performing teachers in 10 school districts across the country $20,000 to transfer into the district's hardest-to-staff schools. Specifically, the paper looks at high-performing teachers' willingness to transfer and the effect of the transfer offer on the internal dynamics of receiving schools.
  • "Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables." Matthew Johnson, Stephen Lipscomb, and Brian Gill, November 2013. This working paper examines the sensitivity and precision of teacher value-added model (VAM) estimates obtained under model specifications that differ based on whether they include student-level background characteristics, peer-level background characteristics, and/or a double-lagged achievement score. It also tests two model variations not previously evaluated—replacing classroom average peer characteristics with teacher-year level averages, or allowing for variation in the relationship between current and lagged achievement scores based on student demographic characteristics—to determine whether they affect the VAM estimates.
  • "How Does a Value-Added Model Compare to the Colorado Growth Model?" Elias Walsh and Eric Isenberg, October 2013. This working paper compares teacher evaluation scores from a typical value-added model with results from the Colorado Growth Model (CGM) and finds that use of the CGM in place of a value-added model depresses the evaluation scores for teachers with more English language learner students and increases the evaluation scores for teachers of low-achieving students.
  • "Accounting for Co-Teaching: A Guide for Policymakers and Developers of Value-Added Models." Eric Isenberg and Elias Walsh, October 2013. This working paper outlines four options available to policymakers for addressing co-teaching in a value-added model: the partial credit method, the teacher team method, the full roster method, and the full roster-plus method. The authors discuss why the first two methods are impractical, and show that the last two methods are empirically similar.
  • "Elementary School Data Issues: Implications for Research Using Value-Added Models." Eric Isenberg, Bing-ru Teh, and Elias Walsh, October 2013. This working paper compares teacher/student links that have undergone a roster confirmation process—whereby teachers verify the subjects and students they taught—to teacher/student links from unconfirmed administrative data. Due to the departmentalization of instruction in math and reading/English language arts in grades 4 and 5, about one in six teachers in these grades and subjects is linked in the unconfirmed data to a subject that he or she does not teach. The authors discuss the circumstances in which using unconfirmed teacher/student links in value-added models most affect research.
  • "Shrinkage of Value-Added Estimates and Characteristics of Students with Hard-to-Predict Achievement Levels." Mariesa Herrmann, Elias Walsh, Eric Isenberg, and Alexandra Resch. April 2013. This working paper investigates how empirical Bayes shrinkage, an approach commonly used in implementing teacher accountability systems, affects the value-added estimates of teachers of students with hard-to-predict achievement levels, such as students who have low prior achievement and receive free lunch. Teachers of these students tend to have less precise value-added estimates than teachers of other types of students. Shrinkage increases their estimates’ precision and reduces the absolute value of their value-added estimates. However, this paper found shrinkage has no statistically significant effect on the relative probability that teachers of hard-to-predict students receive value-added estimates that fall in the extremes of the value-added distribution and, as a result, receive consequences in the accountability system.
  • "Does Tracking of Students Bias Value-Added Estimates for Teachers?" Ali Protik, Elias Walsh, Alexandra Resch, Eric Isenberg, and Emma Kopa, March 2013. This working paper uses urban school district data to investigate whether including track indicators or accounting for classroom characteristics in the value-added model is sufficient to eliminate potential bias resulting from the sorting of students into academic tracks. Accounting for two classroom characteristics—mean classroom achievement and the standard deviation of classroom achievement—may reduce bias for middle school math teachers, whereas track indicators help for high school reading teachers. However, including both of these measures simultaneously reduces the precision of the value-added estimates in this context. While these different specifications produce substantially different value-added estimates, they produce small changes in the tails of value-added distribution.
  • "Horseshoes, Hand Grenades, and Treatment Effects? Reassessing Bias in Nonexperimental Estimators." Kenneth Fortson, Philip Gleason, Emma Kopa, and Natalya Verbitsky-Savitz, March 2013. Nonexperimental methods, such as regression modeling or statistical matching, produce unbiased estimates if the underlying assumptions hold, but these assumptions are usually not testable. Most studies testing nonexperimental designs find that they fail to produce unbiased estimates, but these studies have examined weaker evaluation designs. This working paper addresses these limitations and finds the use of baseline data that are strongly predictive of the key outcome measures considerably reduces bias, but might not completely eliminate it.
  • "Charter School Authorizers and Student Achievement." NCSPE Occasional Paper No. 219. Ron Zimmer, Brian Gill, Jonathon Attridge, and Kaitlin Obenauf, June 2013. This paper uses individual student-level data from Ohio–which permits a wide range of organizations to authorize charter schools—to examine the relationship between type of authorizer and charter-school effectiveness, as measured by students’ achievement trajectories.
  • What Works Clearinghouse (WWC): Administered by Mathematica for the U.S. Department of Education's Institute of Education Sciences, the WWC produces a variety of reports that assess and summarize education research. WWC reports can help educators make important decisions about which curricula to implement, which products to purchase, and which methods to use in their classrooms and schools.

    See the WWC's latest releases at whatworks.ed.gov and explore available Practice Guides, Intervention Reports, and Quick Reviews or take a guided tour of the site.
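
As a companion to the shrinkage working paper above, here is a minimal, generic sketch of empirical Bayes shrinkage applied to teacher value-added estimates. It is not the model used in the Mathematica working papers; the variance components and example numbers are assumptions chosen only to show the mechanics.

# Minimal, generic sketch of empirical Bayes shrinkage (illustrative values, not
# the working paper's model): each raw value-added estimate is pulled toward the
# overall mean in proportion to its sampling noise.
import numpy as np

def shrink(raw_estimates, standard_errors, prior_mean=0.0, prior_var=None):
    raw = np.asarray(raw_estimates, dtype=float)
    noise_var = np.asarray(standard_errors, dtype=float) ** 2
    if prior_var is None:
        # crude estimate of the "true" (signal) variance: total variance minus average noise
        prior_var = max(raw.var() - noise_var.mean(), 1e-6)
    reliability = prior_var / (prior_var + noise_var)   # weight placed on each raw estimate
    return prior_mean + reliability * (raw - prior_mean)

raw = [0.30, -0.25, 0.10]   # hypothetical raw value-added estimates
se = [0.20, 0.05, 0.10]     # larger SE, e.g. teachers of hard-to-predict students
print(shrink(raw, se).round(3))   # the noisiest estimate (0.30) is shrunk the most

Because teachers of hard-to-predict students tend to have larger standard errors, their estimates get smaller weights and are pulled more strongly toward the mean, which is the mechanism the working paper examines.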

Social Solutions Pay for Less/Social Impact Bond (SIB) Webinar Series: Building a SIB-Ready Sector: Getting Ready—Webinar—May 8, 2014, 1:30-3:00 p.m. (ET). Registration is available online.
Scott Cody, Presenter

American Educational Research Association Annual Meeting: The Power of Education Research for Innovation in Practice and Policy—Philadelphia, PA—April 3-7, 2014

Annual Conference of the Association for Education Finance and Policy: New Players in Education Finance and Policy—San Antonio, TX—March 13-15, 2014

Active Living Research Annual Conference: Niche to Norm—San Diego, CA—March 9-12, 2014
Susanne James-Burdumy: "Impact of Playworks on Play, Physical Activity, and Recess: Findings from a Randomized Controlled Trial"

Society for Research on Educational Effectiveness Spring Conference: Improving Education Science and Practice: The Role of Replication—Washington, DC—March 6-8, 2014

National Center for Analysis of Longitudinal Data in Education Research Annual Conference—Washington, DC—January 23-24, 2014
Kevin Booker, Brian Gill, and Others: "Charter High Schools' Effects on Long-Term Attainment and Earnings" Working Paper

American Economic Association—Philadelphia, PA—January 3-5, 2014
Bing-ru Teh, Steven Glazerman, Ali Protik, Julie Bruch, and Jeffrey Max: "Moving High-Performing Teachers to Low Achieving Schools"

University of Pennsylvania, Graduate School of Education, Institute of Education Sciences Pre-Doctoral Training Program in Interdisciplinary Methods for Field-Based Research in Education Lecture Series—Philadelphia, PA—November 11, 2013
Philip Gleason: "Are Experiments Necessary in Education Research? Assessing the Validity of Non-Experimental Methods for Estimating Charter School Impacts"

Brookings Institution: Charter School Research: Generating the Evidence Needed for the Charter Sector to Reach Its Potential—Washington, DC—October 18, 2013
Philip Gleason: "Charter School Research: Expanding Methods for Rigorously Estimating Charter School Impacts"

Society for Research on Educational Effectiveness: Interdisciplinary Synthesis in Advancing Education Science—Washington, DC—September 26-28, 2013
Peter Schochet: "Partially Nested Designs in RCT: Theory and Practice" [Abstract]
Jill Constantine, Annalisa Mastri, and Sarah Avellar, Panelists: Systematic Reviews: Growing Up to Meet Practitioner, Policymaker, and Researcher Needs

University of Pennsylvania Workshop: Teaching Cases: Nuts and Bolts of Randomized Controlled Trials in Education—Philadelphia, PA—August 13-14, 2013
Allen Schirm, Speaker: National Program Evaluation: Upward Bound

Jacobs Foundation: Good Enough: When is Evidence-Based Intervention Ready for Dissemination?—Zürich, Switzerland—May 30-June 1, 2013
Jill Constantine: "Systematic Reviews as a Tool in Evidence-Based Decision Making: Improving Research and Informing Practice"

Mathematica and the Institute of Education Sciences of the U.S. Department of Education Forum/Webinar: Equal Access to Effective Teaching: What New Research Has to Say About the Problem and a Possible Solution—Washington, DC—December 10, 2013
Steven Glazerman and Eric Isenberg, Presenters

Institute of Education Sciences at the U.S. Department of Education Webinar: Demystifying the What Works Clearinghouse: A Webinar for Developers and Researchers—December 3, 2013
Jill Constantine, Scott Cody, Neil Seftor, and Others, Speakers

Albert Shanker Institute and the American Federation of Teachers Forum/Webinar: Quality Assessments for Educational Excellence—Washington, DC—November 13, 2013
Steven Glazerman: "Standardized Testing: Mend It, Don't End It"

Mathematica Policy Research and the Institute of Education Sciences of the U.S. Department of Education: Addressing Teacher Shortages in Disadvantaged Schools: Alternative Routes to Teacher Certification and Student Achievement—Issue Forum/Webinar—September 12, 2013
Melissa Clark, Jill Constantine, and Others, Speakers

Center for Education Policy Research at Harvard University—Webinar—December 3, 2012
Ali Protik: "Moving High-Performing Teachers: Implementation of Transfer Incentives in Seven Districts"

American Enterprise Institute for Public Policy Research: Teacher Pay Incentives: Lessons from North Carolina's Teacher Bonus Program—Washington, DC—June 28, 2011
Duncan Chaplin, Panelist (Video)

Jill Constantine, vice president, director of NJ Human Services Research, and area leader for Mathematica's education research, testified before the California Commission on Teacher Credentialing on our evaluation of teachers trained through different routes to certification. Her testimony is available as an audio file or slides.