How to Approach Data-Driven Decisions in Education

Jun 06, 2016

“Data-driven” has become a popular term in recent years. It proudly suggests that the issues we wrestle with have been informed simply by bringing data into the process. But being driven by data requires more than good data or sophisticated systems to warehouse it. It requires a meaningful process for developing questions, conducting exploration and analysis, and, ultimately, using the data to help us arrive at solutions.

In education, the issues vary across classrooms, schools, districts, and states, but the process for using data to inform decisions can follow a common set of activities. The figure below offers a picture of the key elements of a data-driven process in education.

[Figure: Key elements of a data-driven decision-making process in education]

At the heart of the process of effectively using data in educational decision making are three sequential steps:

1. Assemble high-quality raw data. The sheer amount of potentially available education data is vast. Data can include formative or summative test results, interviews, observations, surveys, financial records, and transcripts. The decision at hand should suggest which data should be assembled.

2. Conduct analysis that ensures results are relevant and diagnostic. Educators often find themselves drowning in data rather than being driven by it—burdened by irrelevant data that will not be used and non-diagnostic data that might be used inappropriately.

Relevance depends on who receives the data, its timeliness, and its level of detail. Giving high school counselors data on how former students perform in postsecondary settings is largely irrelevant because the students are already gone and because the counselor likely has no way to identify the reasons for their success or failure. Teachers and school-level staff typically need student data that are fine-grained—at the level of individual students and specific skills—and rapidly delivered to give them useful evidence to adjust instruction. Higher-level decision makers are likely to need data on programs and staff, as well as broader data on students.

The same data can be diagnostic for some decisions and not for others. A teacher’s value-added score, for example, might be diagnostic for informing a principal’s hiring decision, but is not, in itself, diagnostic for how to improve a teacher’s practice. (That is, this data point does not pinpoint specific strengths or areas for improvement that should be addressed in order to change the score in the future.) To be diagnostic, data must be reliable and valid for informing the decision at hand. Reliability means that the data are stable—when measured repeatedly, the same results are generated. Validity means the data are appropriate; data that are improperly analyzed or interpreted can lead to invalid inferences. In other words, invalid data can cause decision makers to draw exactly the wrong conclusions. For example, raw student test scores might be valid for understanding a student’s skills but invalid for understanding a teacher’s contribution to that student’s learning, because they don’t account for the student’s underlying ability.

3. Use relevant and diagnostic data to inform instructional and operational decisions. Even the right data and clear analysis aren’t helpful if the results are filed away and forgotten!
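The reliability and validity distinction in step 2 can be made concrete with a small simulation. The sketch below is purely illustrative—the student abilities, noise levels, and the uniform "teacher effect" are invented numbers, not anything from the post. It shows that repeated administrations of a test agree closely (reliability), while raw end-of-year scores mostly track a student's prior ability rather than the teacher's contribution, so a gain score is the more valid measure for that particular question.

```python
# Illustrative only: invented numbers demonstrating reliability (stability
# across repeated measurement) vs. validity (fitness for a given decision).
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Simulated students: underlying ability, plus a uniform teacher effect later.
ability = [random.gauss(50, 10) for _ in range(500)]
teacher_effect = 5.0

# Reliability: the same test given twice yields highly correlated scores.
test_1 = [a + random.gauss(0, 3) for a in ability]
test_2 = [a + random.gauss(0, 3) for a in ability]
print(f"test-retest correlation: {pearson(test_1, test_2):.2f}")  # high: reliable

# Validity depends on the decision. Raw end-of-year scores mostly reflect
# prior ability, so they are a poor measure of the teacher's contribution;
# a gain score (post minus pre) strips out that baseline.
post = [t + teacher_effect + random.gauss(0, 3) for t in test_1]
gains = [p - t for p, t in zip(post, test_1)]
print(f"raw post-score vs. prior ability: {pearson(post, ability):.2f}")   # high
print(f"gain score vs. prior ability:     {pearson(gains, ability):.2f}")  # near zero
```

The same data (test scores) are thus reliable throughout, but valid for one inference and not another—exactly the point about value-added scores being usable for one decision and not for diagnosing practice.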

 

Key Organizational Supports

The best decisions require infrastructure, policies, and practices to support them. Three key organizational traits must be in place, as the figure shows:

1. Data infrastructure. The creation and improvement of data systems are essential for effectively collecting, transferring, and manipulating information. Establishing links between distinct databases facilitates analyses that require connections across data types. Creating low-burden data collection mechanisms and certifying and monitoring data collectors (such as principals conducting observations) also support data quality. Adjusting data access and management practices to ensure timely delivery enhances the likelihood that data will be used, and employing verification systems ensures data integrity.

2. Analytic capacity. The ability of staff to create analysis plans and make sense of findings may require support and training. Relevant training topics might include implementing data practices, accessing and analyzing data, or data management and security. Technical assistance providers, either in-house or external, can conduct such training and offer additional insights. Likewise, improving accessibility of data, such as offering web access to data or ensuring data are presented in user-friendly formats, supports capacity for data use.

3. Culture of evidence use. Strong leadership and systems of accountability facilitate successful data use. For example, organizations may have formal policies requiring and monitoring data use, provide incentives for data use, or follow a strategic plan for using data. Promoting data sharing allows staff to reflect on data together, and allocating time and resources for examining and using data encourages staff to do so.
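The "links between distinct databases" mentioned under data infrastructure amount, in practice, to joins on a shared identifier. The sketch below is a hypothetical toy example (the table names, column names, and scores are all invented): an enrollment store and an assessment store, held separately, are linked through a common student ID to answer a question neither could answer alone.

```python
# Hypothetical sketch: linking two separate data stores through a shared
# student identifier. All table/column names and values are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrollment (student_id INTEGER PRIMARY KEY, school TEXT);
    CREATE TABLE assessment (student_id INTEGER, subject TEXT, score REAL);
""")
conn.executemany("INSERT INTO enrollment VALUES (?, ?)",
                 [(1, "Lincoln MS"), (2, "Lincoln MS"), (3, "Grant MS")])
conn.executemany("INSERT INTO assessment VALUES (?, ?, ?)",
                 [(1, "math", 72.0), (2, "math", 85.0), (3, "math", 90.0)])

# The link in action: average math score by school, joined on student_id.
rows = conn.execute("""
    SELECT e.school, AVG(a.score)
    FROM enrollment e
    JOIN assessment a ON e.student_id = a.student_id
    WHERE a.subject = 'math'
    GROUP BY e.school
    ORDER BY e.school
""").fetchall()
for school, avg_score in rows:
    print(f"{school}: {avg_score:.1f}")
```

Maintaining a stable, verified identifier across systems is what makes this join possible—which is why the infrastructure point above pairs database linking with data verification and integrity.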

In data-rich environments, education decision makers have access to a wealth of information about students, staff, operational activities, and the communities that they serve. These data, however, have limited use—and could possibly be detrimental—if decision makers do not understand the benefits and limitations of data, the types of data relevant for the decisions they are confronted with, and how data can be appropriately used for decision making.

Learn more in our detailed report describing this conceptual framework for data-driven decision making.

Read more about our education work.


The opinions expressed are those of the author(s) and do not represent those of Mathematica Policy Research.
