Student absenteeism: Who misses school and how missing school matters for performance

A broader understanding of the importance of student behaviors and school climate as drivers of academic performance and the wider acceptance that schools have a role in nurturing the “whole child” have increased attention to indicators that go beyond traditional metrics focused on proficiency in math and reading. The 2015 passage of the Every Student Succeeds Act (ESSA), which requires states to report a nontraditional measure of student progress, has codified this understanding.

The vast majority of U.S. states have chosen to comply with ESSA by using measures associated with student absenteeism, particularly chronic absenteeism. This report uses data on student absenteeism to answer several questions: How much school are students missing? Which groups of students are most likely to miss school? Have these patterns changed over time? And how much does missing school affect performance?

Data from the National Assessment of Educational Progress (NAEP) in 2015 show that about one in five students missed three days of school or more in the month before they took the NAEP mathematics assessment. Students who were diagnosed with a disability, students who were eligible for free lunch, Hispanic English language learners, and Native American students were the most likely to have missed school, while Asian students were rarely absent. On average, children missed fewer days of school in 2015 than in 2003.

Our analysis also confirms prior research that missing school hurts academic performance: Among eighth-graders, those who missed three or more days of school in the month before being tested scored between 0.3 and 0.6 standard deviations lower (depending on the number of days missed) on the 2015 NAEP mathematics test than those who did not miss any school days.

Introduction and key findings

Education research has long suggested that broader indicators of student behavior, student engagement, school climate, and student well-being are associated with academic performance, educational attainment, and the risk of dropping out.1

One such indicator—which has recently received considerable attention in the wake of the passage of the Every Student Succeeds Act (ESSA) in 2015—is student absenteeism. Absenteeism—including chronic absenteeism—is emerging as states’ most popular metric to meet ESSA’s requirement to report a “nontraditional”2 measure of student progress (a metric of “school quality or student success”).3

Surprisingly, even though it is widely understood that absenteeism has a substantial impact on performance—and even though absenteeism has become a highly popular metric under ESSA—there is little guidance on how schools, districts, and states should use data about absenteeism. Few empirical sources allow researchers to describe the incidence, trends over time, and other characteristics of absenteeism that would be helpful to policymakers and educators. In particular, there is a lack of available evidence that allows researchers to examine absenteeism at an aggregate national level, or that offers a comparison across states and over time. And although most states were already gathering aggregate information on attendance (i.e., average attendance rate at the school or district level) prior to ESSA, few were looking closely into student-level attendance metrics, such as the number of days each student misses or whether a student is chronically absent, and how those metrics matter. These limitations reduce policymakers’ ability to design interventions that might improve students’ performance on nontraditional indicators, and in turn, boost the positive influence of those indicators (or reduce their negative influence) on educational progress.

In this report, we aim to fill some of the gaps in the analysis of data surrounding absenteeism. We first summarize existing evidence on who misses school and how absenteeism matters for performance. We then analyze the National Assessment of Educational Progress (NAEP) data from 2003 (the first assessment with information available for every state) and 2015 (the most recent available microdata). As part of the NAEP assessment, fourth- and eighth-graders were asked about their attendance during the month prior to taking the NAEP mathematics test. (The NAEP assessment may be administered anytime between the last week of January and the end of the first week of March, so “last month” could mean any one-month period between the first week of January and the first week of March.) Students could report that they missed no days, 1–2 days, 3–4 days, 5–10 days, or more than 10 days.

We use this information to describe how much school children are missing, on average; which groups of children miss school most often; and whether there have been any changes in these patterns between 2003 and 2015. We provide national-level estimates of the influence of missing school on performance for all students, as well as for specific groups of students (broken out by gender, race/ethnicity and language status, poverty/income status, and disability status), to detect whether absenteeism is more problematic for any of these groups. We also present evidence that higher levels of absenteeism are associated with lower levels of student performance. We focus on the characteristics and outcomes of students who missed three days of school or more in the previous month (the aggregate of those missing 3–4, 5–10, and more than 10 school days), which is our proxy for chronic absenteeism.4 We also discuss data associated with children who had perfect attendance the previous month and those who missed more than 10 days of school (our proxy for extreme chronic absenteeism).
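
To make the recoding concrete, the sketch below shows one way the five NAEP response categories could be collapsed into the proxies used in this report. The pandas workflow and column names are illustrative assumptions, not the report’s actual analysis code.

```python
import pandas as pd

# Illustrative student-level records; "days_absent_cat" mirrors the five
# NAEP response options for days missed in the previous month.
students = pd.DataFrame({
    "days_absent_cat": ["none", "1-2", "3-4", "5-10", "more than 10"],
})

# Proxy for chronic absenteeism: missed three or more days last month
# (the aggregate of the 3-4, 5-10, and more-than-10 categories).
chronic_cats = {"3-4", "5-10", "more than 10"}
students["chronic_absent"] = students["days_absent_cat"].isin(chronic_cats)

# Proxy for extreme chronic absenteeism: missed more than 10 days.
students["extreme_chronic_absent"] = students["days_absent_cat"] == "more than 10"

# Perfect attendance in the previous month.
students["perfect_attendance"] = students["days_absent_cat"] == "none"

print(students)
```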

Given that the majority of states (36 states and the District of Columbia) are using “chronic absenteeism” as a metric in their ESSA accountability plans, understanding the drivers and characteristics of absenteeism and, thus, the policy and practice implications, is more important than ever (Education Week 2017). Indeed, if absenteeism is to become a useful additional indicator of learning and help guide effective policy interventions, it is necessary to determine who experiences higher rates of absenteeism; why students miss school days; and how absenteeism affects student performance (after controlling for factors associated with absenteeism that also influence performance).
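
The kind of controlled comparison described above can be sketched as a regression of standardized test scores on absence categories plus student characteristics. The synthetic data, column names, and coefficients below are assumptions for illustration only; they are not drawn from the NAEP microdata or the report’s actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a student-level file (illustrative only).
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "days_absent_cat": rng.choice(
        ["none", "1-2", "3-4", "5-10", "more than 10"],
        size=n, p=[0.45, 0.35, 0.12, 0.06, 0.02]),
    "female": rng.integers(0, 2, size=n),
    "frpl_eligible": rng.integers(0, 2, size=n),
    "iep": rng.integers(0, 2, size=n),
})
df["math_score"] = 280 - 5 * (df["days_absent_cat"] != "none") + rng.normal(0, 30, n)

# Standardize the score so coefficients read in standard-deviation units.
df["z_score"] = (df["math_score"] - df["math_score"].mean()) / df["math_score"].std()

# Regress performance on absence categories (reference group: no days missed),
# controlling for characteristics that are also associated with absenteeism.
model = smf.ols(
    "z_score ~ C(days_absent_cat, Treatment(reference='none'))"
    " + female + frpl_eligible + iep",
    data=df,
).fit()
print(model.params)
```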

Major findings include:

One in five eighth-graders was chronically absent. In 2015, about one in five eighth-graders (19.2 percent) missed three or more days of school in the month before the NAEP assessment and would be at risk of being chronically absent if that pattern were sustained over the school year.

Absenteeism varied substantially among the groups we analyzed. In our analysis, we look at absenteeism by gender, race/ethnicity and language status, FRPL (free or reduced-price lunch) eligibility (our proxy for poverty status),5 and IEP (individualized education program) status (our proxy for disability status).6 Some groups had much higher shares of students missing school than others.