The Intersection

Missing Students & the Equity Implications for Student Growth Data

April 29, 2021

As policymakers across the country grapple with how to administer 2021 assessments, The Hunt Institute and Data Quality Campaign have hosted a series of webinars highlighting different aspects of measuring student growth this year. During these webinars, we’ve received many questions—and both organizations have teamed up with SAS to answer them in a series of blog posts. Read our fourth and final installment below.

Data Quality Campaign: States across the country are still reporting alarming numbers of “missing students,” who have been unaccounted for since last spring’s closures. Other families may choose to keep students at home out of concerns about in-person assessments. Leaders should take action to test as many students as possible this year—yet they will not be able to reach everyone.

The Hunt Institute: It seems inevitable that far more students will be missing from assessment data this year than ever before. It is also critical that policymakers and education leaders understand why those students are missing and what that missingness means for calculating student growth.

SAS: When opt-out numbers are significantly higher than normal, comparisons of results from just before and after the pandemic should be interpreted with caution, since the pool of test takers may not be comparable. A smaller, less representative pool of test takers could also skew a state’s understanding of the pandemic’s impact on student learning.

It is also important for states to consider which students are missing. Are opt-outs spread across a district or state, or are they concentrated among particular groups of students? If a certain group has a large proportion of opt-outs, the lack of data hinders our understanding of the pandemic’s impact on that group.
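One way to make that check concrete is to look at assessment participation rates by student group. The short Python sketch below is illustrative only: it assumes a hypothetical student-level file with district, student_group, and tested columns (none of which come from the original post) and flags groups whose participation falls well below the statewide rate, which would suggest opt-outs are concentrated rather than spread evenly.

```python
# Minimal sketch of a participation check by student group.
# Column names, file name, and the 10-point threshold are illustrative assumptions.
import pandas as pd

students = pd.read_csv("assessment_participation.csv")  # hypothetical student-level file

# Participation rate and count by district and student group
rates = (
    students
    .groupby(["district", "student_group"])["tested"]
    .agg(participation_rate="mean", n_students="size")
    .reset_index()
)

# Flag groups whose participation is more than 10 points below the statewide rate
statewide_rate = students["tested"].mean()
flagged = rates[rates["participation_rate"] < statewide_rate - 0.10]
print(flagged.sort_values("participation_rate"))
```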

Data Quality Campaign: It is critical that states and districts have a data point on every student: either a result from a statewide assessment, or details about why that student did not participate in the assessment. By looking at the full scope of performance and opportunity-to-learn (OTL) data available to them, educators and leaders can better understand how students have been served over the past year and where supports are most acutely needed.

SAS: If students test, we will have more information on their achievement and growth, and that is better than no information. Additionally, it may be possible to aggregate interim/benchmark assessments to estimate growth when the summative assessment is missing.
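As a very rough illustration of that idea, the sketch below standardizes interim scores within each testing window and treats the fall-to-spring difference as an approximate growth measure. The file layout, column names, and window labels are assumptions made for illustration; an operational growth model would be considerably more sophisticated.

```python
# Rough sketch: approximate growth from interim/benchmark scores when summative data are missing.
# Assumes a hypothetical file with student_id, window ("fall"/"spring"), and scale_score columns.
import pandas as pd

interim = pd.read_csv("interim_scores.csv")  # hypothetical interim/benchmark results

# Standardize within each testing window so fall and spring scores are on a comparable scale
interim["z_score"] = (
    interim.groupby("window")["scale_score"]
    .transform(lambda s: (s - s.mean()) / s.std())
)

# One row per student, one column per window; fall-to-spring change as an approximate growth value
wide = interim.pivot(index="student_id", columns="window", values="z_score")
wide["approx_growth"] = wide["spring"] - wide["fall"]
print(wide["approx_growth"].describe())
```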

The Hunt Institute: When a child is missing from the dataset because his family did not want him to take the test in person, but he consistently participates in online learning, we can draw on other measures to assess his learning. But when tens of thousands of students are unaccounted for both in our test data and in our physical and virtual classrooms, we can only assume that learning loss is even more severe than the data show.

Hence, if we believe in educational equity, we must both continue to identify those “missing” students and take steps to re-engage them and their families at every opportunity. We must also make a rigorous plan for how we will support them when they come back.

