How to Buy Smart: Creating Well-Informed Consumers of Educational Interventions

Mar 23, 2018
Teacher Discussions

Editor’s Note: This blog originally appeared on the U.S. Department of Education’s Office of Educational Technology (OET) Medium page and is being re-posted with their permission. You can follow the full series at https://medium.com/building-evaluation-capacity.

The word “procurement” can bring to mind a bureaucratic process involving lots of paperwork, but the substance of procurement is critical — how do we identify the best products and make best use of educational funds to improve outcomes for students? This is particularly important when it comes to educational technology (ed tech) products, on which districts spend billions of dollars every year. The Every Student Succeeds Act further highlights the need to have reliable information about the effectiveness of products that are purchased with federal funds.

As district staff integrate new technologies into instruction, they often struggle with the challenge of making informed choices about ed tech interventions. With thousands of products to choose from, how can district staff determine which ones are effective and will work well in their own schools with their own students? How do they know whether the information vendors provide about their products is reliable? And once the products are adopted, how can they tell whether the products are having the desired effects and are worth the resources being allocated to them?

These questions boil down to two challenges: (1) making evidence-based decisions about what ed tech products to purchase, and then (2) determining whether these products are working as intended. With the emphasis on accountability in education today, schools need better ways to assess the effect of the resources they buy.

In recognition of this problem, the Office of Educational Technology at the U.S. Department of Education funded the development of the Rapid Cycle Evaluation Coach (RCE Coach), a set of online tools and resources designed to support evidence-based decisions about procurement. This blog describes these two challenges in more detail and identifies ways in which the RCE Coach can help address them.

Challenge #1: Difficulty selecting new ed tech products due to lack of high-quality, relevant evidence of effectiveness

When purchasing educational technologies, it is hard to predict whether a product will have the desired effect. Many emerging technologies have not been around long enough to have rigorous tests of effectiveness, given the long cycles of traditional education research. When studies showing effectiveness do exist, it is not always clear whether their results can be replicated in one's own schools given differences in context, student populations, and implementation approaches. And the quality of available evidence varies widely, from testimonials to large-scale causal studies.

As such, making evidence-based decisions about new technologies is challenging. According to a 2014 study by Digital Promise, less than half of districts use rigorous evidence to make decisions about ed tech purchases, relying more heavily on peer recommendations and pilots. In addition, a survey of over 500 district staff, conducted in 2016 by an interdisciplinary team of researchers, policymakers, practitioners, and private-sector leaders, found that only 11% of respondents required peer-reviewed evidence of effectiveness to adopt a technology.

The net effect is that it is extremely difficult for a chief technology officer or director of instructional technology to be confident that a proposed intervention will “move the needle.”

How the RCE Coach can help:

  • The RCE Coach provides professional development resources, such as a guide to help districts understand the quality of evidence provided by developers. It also provides a guide for choosing technologies. These tools enable district staff to have more informed conversations with developers and ask the right questions about their evidence of effectiveness.
  • By enabling districts to run and then share quick turn-around evaluations of ed tech used in their schools, the RCE Coach has increased the availability of research on both widely used and new products. Moreover, the RCE Coach prompts users to enter information about their district environments and how the products are being used. Thus, districts can search for evaluations that are relevant for their local contexts.

Challenge #2: Making evidence-based decisions on whether to continue or expand use of an ed tech

Some districts conduct pilots to test new technologies in their classrooms, but the resulting feedback is often anecdotal. These pilots typically lack the analytic rigor to credibly measure impacts on learning. Moreover, the goals for a particular use of a technology are sometimes not well defined. Decisions to renew licenses thus are often based on subjective factors like individual feedback, academic outcomes that could be influenced by a range of other variables, institutional inertia, or intuition.

How the RCE Coach can help:

  • The RCE Coach walks users through a few simple steps to define their goals for the technology and formulate the questions they really want to answer. In other words, it facilitates a conversation among key stakeholders about what outcomes the district cares about and wants to measure.
  • The Coach makes it easy to set up and conduct evaluations that can isolate the effects of the technology by accounting for other factors that may influence student performance (or other outcomes of interest).
  • Additionally, districts can use the Coach to aggregate data about technologies across schools and districts — increasing confidence in evaluation results, and allowing for analyses of specific subgroups of students.
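To make the idea behind these steps concrete: at its simplest, an evaluation like this compares outcomes for students who used a product against a matched comparison group. The sketch below is not the RCE Coach's actual methodology, just a minimal Python illustration of estimating an effect as a difference in group means; all names and scores are hypothetical.

```python
from math import sqrt
from statistics import mean, stdev

def estimate_effect(treatment_scores, comparison_scores):
    """Estimate a product's effect as the difference in mean outcomes
    between students who used it and a matched comparison group,
    along with a rough standard error for that difference."""
    diff = mean(treatment_scores) - mean(comparison_scores)
    se = sqrt(stdev(treatment_scores) ** 2 / len(treatment_scores)
              + stdev(comparison_scores) ** 2 / len(comparison_scores))
    return diff, se

# Hypothetical post-test scores for two matched groups of students.
used_product = [78, 82, 75, 88, 91, 73, 84, 79]
comparison   = [74, 80, 71, 83, 85, 70, 78, 76]

effect, se = estimate_effect(used_product, comparison)
print(f"Estimated effect: {effect:.2f} points (SE ~ {se:.2f})")
```

A real rapid-cycle evaluation would go further, for example by matching students on prior achievement or using random assignment, which is the kind of setup the Coach walks users through.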

What does this look like in practice?

Diane Lauer, Assistant Superintendent of Priority Programs and Academic Support for Colorado’s St. Vrain Valley Schools, used the Coach to test two different products designed to improve student literacy. The findings enabled her to examine one developer’s data showing positive effects and to have a more probing conversation about the conditions required for the software to be effective.

Her colleague, Kahle Charles, the executive director of curriculum, is using the RCE Coach to help determine the effectiveness of a high school eCredit recovery pilot. He says, "With the RCE Coach, our goal is to change the evaluation and implementation method in our school district to include a process of evidence-based decision making, helping us to make choices more effectively to increase student achievement."

In North Carolina’s Wake County Public School System, Matthew Lenard, the Director of Data Strategy and Analytics, used the Coach to examine the effects of a digital literacy resource. This evaluation prompted a conversation within the district about their goals for purchasing the database, and they ended up concluding that the original outcome measure — academic achievement — did not reflect their desired theory of change.

Matthew also wanted to scale up efforts to determine whether educational technologies were having the desired effects and found that rapid cycle evaluation provided a practical way to pursue this aim. His colleague Marlo Gaddis, the district's Chief Technology Officer, observed: "Rapid cycle evaluations have become a valuable part of our district's strategic procurement, licensing, and evaluation decisions across a range of education technology products."

Rapid cycle evaluations can play a valuable role in expanding the education community’s understanding of how ed tech products work in different settings, among different student populations, and using different implementation models. By building on this base of knowledge, districts will be able to better judge the return-on-investment of technologies and justify spending on ed tech in the future.

About the Authors

Kecia Ray

Executive Director, Center for Digital Education

Rebecca Griffiths

Principal Education Researcher, SRI International Center for Technology in Learning