Views from the White House Behavioral Science Summit

Dec 15, 2016
Jonathan McCay, Alex Resch, and Irma Perez-Johnson

Last week, researchers from Mathematica and several of our practitioner partners convened for an important gathering titled “Using Innovation, Evidence, and Behavioral Science to Build Economic Opportunity and Create Stronger Communities: A Summit for State and Local Governments.” The event was sponsored by the White House and ideas42 and took place at the stunning United States Institute of Peace in Washington, DC. Here, our staff offer their reflections on the event’s presentations and discussions.

The Importance of Getting Things Wrong

Jonathan McCay

“It would be an error to suppose that the great discoverer seizes at once upon the truth, or has any unerring method of divining it. … Fertility of imagination and abundance of guesses at truth are among the first requisites of discovery; but the erroneous guesses must be many times as numerous as those that prove well founded.” (The Principles of Science, 1874)

In the quote above, the 19th-century British economist William Stanley Jevons, reflecting on his experience as an inventor, aptly described how greatly errors and mistakes outnumber successes in the process of innovation.

I found myself recalling Jevons’ words at last week’s summit as panelists reiterated the importance of embedding and building innovation capacity in state and local governments. The culture of an organization is a critical—yet often overlooked—ingredient in the successful promotion of innovation. Insights from behavioral science offer us a profound set of tools to diagnose inefficiency and ineffectiveness in government programs; diagnosis, however, is only the first step. Behavioral insights also aid the development of strategies to address bottlenecks that impede productivity. Thoughtfully designing, testing, and refining changes to programs and policies based on these insights requires space for both idea percolation and experimentation. And in an environment of scarce resources and high stakes, this flexibility can be particularly challenging to come by. However, by not allowing for innovative experimentation in government programs, decision makers might be selling themselves and the people they serve short.

As many panelists pointed out, external partnerships are a great way to build capacity for innovation and leverage additional resources to support the iterative (error-prone) process of innovation. Bringing together researchers, practitioners, and even businesses and other private sector stakeholders encourages a new ecosystem of data-driven decision making grounded in the realities of government programs. David Yokum of LAB @ DC noted that relationships are fundamental to the success of any such partnership; it’s essential that we embed innovation and research efforts within the very service environment in which we’re attempting to problem-solve. Through these connections and relationships, ideas evolve into plans and, ultimately, actions.

If, as the summit’s title suggests, we are to build economic opportunity and create stronger communities by using innovation, evidence, and behavioral science, we as a research and practice community should embrace a “test and tweak” approach to program and policy development and continuous improvement. By applying methods like predictive analytics and rapid-cycle evaluation, we can test with greater rigor and timeliness, identify errors with greater precision, and tweak with higher confidence. In his book Where Good Ideas Come From, Steven Johnson notes that “error often creates a path that leads you out of your comfortable assumptions.” For social services, as for any other sector, the road to better outcomes is paved with errors; our organizational cultures should embrace them so we can learn from them.
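To make the “test and tweak” idea concrete, here is a minimal sketch of one rapid-cycle check: comparing an outcome rate under a revised process against the status quo with a simple two-proportion z-test. The scenario, counts, and function name are hypothetical illustrations of the approach, not data or tools from the summit.

```python
# A minimal rapid-cycle comparison of a revised process versus the status quo.
# All figures below are hypothetical placeholders for illustration only.
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Return the z statistic and two-sided p-value comparing two outcome rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled rate under the null
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided normal tail probability
    return z, p_value

# Hypothetical cycle: 120 of 400 clients completed a key step under a revised notice,
# versus 90 of 400 under the current notice.
z, p = two_proportion_ztest(120, 400, 90, 400)
print(f"z = {z:.2f}, p = {p:.3f}")  # evidence to keep the change, tweak it further, or drop it
```

The sketch deliberately uses only the standard library so it can run anywhere; in practice, a team would rerun a check like this each cycle and fold the result back into the next round of program tweaks.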

Putting Humans First

Alex Resch

I was really excited to hear how often human-centered design (HCD) came up at the summit. Tom Kalil of the Office of Science and Technology Policy mentioned it in his opening remarks as one of the major trends in innovation, and it came up in every panel. We've been thinking about HCD a lot at Mathematica. We used this approach in developing the Ed Tech Rapid Cycle Evaluation Coach and also incorporate HCD principles in all of our work to improve programs using behavioral insights and evidence-based technical assistance.

It seems obvious that humans should be at the center of our work—if our ultimate goal is to improve public well-being, we need to put the people using public programs and those implementing them at the center of program design and improvement efforts. And we do better research when we understand how people interact with the programs we're evaluating.

For instance, in the panel on “Building Outcomes Mindsets in Government,” Oliver Wise, the director of the City of New Orleans Office of Performance and Accountability, noted that HCD facilitates an important shift in how governments think about delivering services to their citizens. Instead of concentrating on how to improve programs within siloed agencies, a focus on the end user encourages public managers to think about how best to structure and coordinate services across agencies to meet citizens’ needs and make the best use of scarce public resources. As one example, Lynn Overmann of the Office of Science and Technology Policy described the value of HCD to the work of the Data-Driven Justice Initiative. The initiative brought together 26 agencies to talk about problems they face in their work and to develop potential solutions. One common challenge across jurisdictions is that jails have become the front line for dealing with individuals with mental health and substance abuse problems, yet they are not well equipped to meet those individuals’ needs. The initiative has supported the development of user-centered tools and strategies that criminal justice agencies are using to connect vulnerable citizens with appropriate services.

Learning about these and other examples during the summit, I was encouraged to see that HCD is being used by so many different organizations to improve government programs.

Key Steps in Applying Behavioral Insights

Irma Perez-Johnson

The summit provided a valuable opportunity to connect with others involved in efforts to apply behavioral insights and use evidence to improve social policies and programs, and to both reflect on and celebrate the fruits of this important work. The sessions that most resonated with me were those toward the end of the summit, when panelists discussed lessons learned and pondered where our efforts and attention should turn next. Several important themes stuck with me, and I share them here for those who were not able to attend:

  • The process matters. When practitioners, researchers, and others come together to problem-solve and innovate, it is crucially important to start with—and maintain throughout—a common definition and understanding of the targeted problem. A careful diagnosis, in which all involved share their experiences, perspectives, and assumptions about the problem—and test them together—is indispensable.
  • Data are an important resource in several ways. During diagnosis, they help confirm or call into question assumptions, and can also yield new, unexpected insights. Data are also essential in order to maintain an outcomes or results orientation in these collaborations.
  • It is important to recognize the value and contributions of all parties involved. Researchers, behavioral experts, and data scientists contribute specialized knowledge and expertise that are essential. However, they must also approach the work with humility and deference toward practitioner partners, who possess nuanced institutional knowledge and understand both the practical and political constraints.
  • For programs or agencies seeking to engage in these types of efforts for the first time, it can be most productive to focus on a problem or “pain point” on which there is a fair amount of consensus and over which they have substantial control (that is, discretion to modify key elements of the process). This will likely lead them to test program or policy modifications that might improve outcomes at the margins. After having learned the process, however, it is important to remain vigilant for opportunities to apply behavioral insights from the outset. Applied at that early design stage, these insights could lead program designers to test prescriptions that differ markedly from their traditional approaches. For example, if the goal is to get more people to sign up for a beneficial service, program designers might automatically enroll their clients (while still allowing them to opt out), because behavioral research shows that people are more likely to stay with a default than to take action to sign up when enrollment is presented as an option.
  • The context also matters. While behavioral solutions tend to be scalable because they tend to be low-cost and integrated into existing procedures and policies, they are also tailored for a specific target population and programmatic context. For this reason, when seeking to replicate effective strategies with a new target population or in a new context, it might be necessary to engage in the same process of diagnosing, adapting, and testing as when the effective intervention was first developed. This is the only way to ensure that the solution continues to yield the desired results.
  • Collectively, we should aim to move beyond nudges. This can be accomplished by identifying “moon shots”—that is, great social challenges on which we’ve failed to make significant progress. It can also be accomplished by establishing groups or initiatives that focus explicitly on these high-risk, high-return challenges, by coming together within improvement networks and a more formal community of practice, and by working to reduce barriers to the expansion and institutionalization of this evidence-based, process-focused, and results-oriented approach to developing more effective policies and social programs.

About the Authors

Jonathan McCay
Senior Managing Consultant

Irma Perez-Johnson