Head Start: The Family and Child Experiences Survey (FACES)
Since its founding five decades ago, Head Start has been the nation's premier federally funded early childhood intervention. Focusing on children in the years before formal schooling, often from families with multiple risks, it has served as a natural and national laboratory for a wide range of basic, prevention, early intervention, and program evaluation research. The Head Start Family and Child Experiences Survey (FACES) was launched in 1997 as a periodic, longitudinal study of Head Start's performance. Head Start is administered through 12 federal regions. FACES gathers data on Head Start programs, staff, children, and families from Regions 1 through 10, the 10 geographically based Head Start regions nationwide; in 2014, a study of Region XI (American Indian and Alaska Native; AI/AN) programs was added (see below). The FACES study is designed to be a reliable source of data for describing the skills of Head Start children; the experiences of the children and families served by Head Start; the quality of Head Start classrooms; and the qualifications, credentials, and opinions of Head Start staff. The U.S. Department of Health and Human Services, Administration for Children and Families, funds the study.
FACES 2006 and 2009
In 2006 and 2009, Mathematica conducted five-year studies, each comprising about 3,400 3- and 4-year-old children newly enrolled in 60 Head Start programs around the country. Our studies followed Head Start children and families from entrance into the Head Start program through one or two years of program participation, with follow-up in the spring of kindergarten. Researchers gathered comprehensive data through direct child assessments in multiple domains; observations of Head Start classrooms; and interviews with Head Start parents, teachers, and administrators.
For both studies, Mathematica analyzed the data and prepared a series of products. We also prepared data files and data file documentation to make the data available to the community of researchers for secondary analyses. Subcontractors for FACES 2006 and 2009 were Educational Testing Service and Juárez and Associates.
FACES 2014-2018
Mathematica redesigned FACES around a Core Plus design, which provides key data more rapidly and with greater frequency (Core studies) and helps researchers examine more complex issues and topics in greater detail and with increased efficiency (Plus studies). The Core Plus design also offers more flexibility for incorporating special topic or methodological studies. The two FACES 2014 Core studies are the Classroom + Child Outcomes Core and the Classroom Core. The Classroom + Child Outcomes Core took place in the 2014-2015 Head Start year. In fall 2014 and spring 2015, we assessed the school readiness skills of more than 2,000 3- and 4-year-old Head Start children in 60 programs, conducted surveys with their parents, and asked their Head Start teachers to describe their social-emotional skills and diagnosed disabilities. In spring 2015, we visited 176 programs (including the original 60) to conduct surveys with program directors, center directors, and teachers and to observe 667 classrooms. The Classroom + Child Outcomes Core thus collected child-level data along with program and classroom/teacher data from 60 programs. The program and classroom/teacher data collected across all 176 programs in spring 2015 constitute the first round of the Classroom Core. In spring 2017, FACES conducted a second round of the Classroom Core with an updated sample of programs to ensure that it was nationally representative of all Head Start programs at that time; 178 programs participated in that round.
FACES 2014-2018 included five Plus studies: (1) the Family Engagement Plus study; (2) the Five Essentials Measurement System for Early Education Educator Survey Pilot study (5E-Early Ed Educator Survey Pilot Study); (3) the first-ever study of the children and families served by Region XI AI/AN Head Start programs (AI/AN FACES 2015), conducted in 21 programs with over 1,000 children during the 2015-2016 program year; (4) a Plus topical module on programs' perspectives on the new Head Start Program Performance Standards (HSPPS Plus module); and (5) a Plus topical module on program functioning using the Early Education Essential Organizational Supports measurement system (Early Ed Essentials Plus module).
Mathematica prepared a series of analytic reports documenting study findings. We also prepared data files and data file documentation so that these data are available to the community of researchers for secondary analyses.
Educational Testing Service and Juárez and Associates served as subcontractors for Mathematica.
FACES 2019
FACES 2019 is the current phase of this important endeavor. It follows a structure similar to FACES 2014-2018: a Core Plus study design that provides data on a set of key indicators more rapidly and with greater frequency (Core studies) and serves as a vehicle for studying more complex issues and topics in greater detail and with increased efficiency (Plus studies). It also aligns the timing of the study of Region XI American Indian and Alaska Native Head Start (AI/AN FACES 2019) with the Classroom + Child Outcomes Core study in Regions 1-10.
Mathematica and its partners, Juárez and Associates, Educational Testing Service, and consultants Margaret Burchinal and Martha Zaslow, will develop the instruments and data collection procedures for two Core studies. We will select and recruit a nationally representative sample of 180 Head Start programs and 360 Head Start centers in Regions 1-10. For the Classroom + Child Outcomes Core, in a subsample of 60 of the 180 programs, we will assess the school readiness skills of 2,400 children and survey their parents and Head Start teachers in fall 2019 and spring 2020. For the Classroom Core, in all 180 programs, we will observe 720 Head Start classrooms and survey Head Start staff in spring 2020 and spring 2022.
Future planning will determine the nature of any Plus studies. Mathematica will analyze the data from each round of the study and prepare a series of products. We will also prepare data files and data file documentation so that these data are available to the community of researchers for secondary analyses.
Survey Methodology Highlights
Technological innovations in data collection increase the accuracy and timeliness of data management and analysis. These include web-based, computer-assisted direct assessments of the study children, as well as web instruments teachers use to report on children's social-emotional, cognitive, and physical outcomes. We also use dual-screen administration for child assessments, a web option for parent surveys in addition to the computer-assisted telephone survey, and web options for Head Start staff surveys.
Ashley Kopack Klein