April 22, 2021
As policymakers across the country grapple with how to administer 2021 assessments, The Hunt Institute and Data Quality Campaign have hosted a series of webinars highlighting different aspects of measuring student growth this year. During these webinars we've received many questions, and both organizations have teamed up with SAS to answer them in a series of blog posts. Read more on #EdData in the time of COVID-19 in the first and second posts in the series, and in our third conversation below.
The Hunt Institute: In spring 2020, state summative assessments were canceled across the country due to the coronavirus pandemic, leaving a year of missing data. That missing year should not stop states from calculating skip-year growth as they administer 2020-21 assessments. To ensure that skip-year growth measures are valid and reliable, state education agencies should work alongside testing vendors and state legislatures to determine how best to calculate and report them.
Data Quality Campaign: Research into the use of skip-year growth models finds that the resulting growth data closely match year-to-year growth data over the same period. Before deciding to measure skip-year growth in 2021, Florida tested this approach using actual data from its 2017, 2018, and 2019 statewide assessments. After measuring 2019 growth using both a skip-year and a year-to-year approach, the state found that the two sets of growth data were nearly identical.
Skip-year growth is not just a possibility; it is a tried and tested solution for measuring growth when year-to-year growth is not an option. In Massachusetts, for example, ninth graders do not participate in statewide testing. In the absence of 9th-grade assessment data, Massachusetts measures growth for high school students using their 8th- and 10th-grade assessments. By employing this skip-year strategy to measure high school growth, Massachusetts state leaders are better able to understand the impact of high school transitions on student learning.
SAS: There is a lot of speculation out there, but it is possible to measure growth in any state that proceeds with its assessments. Measuring student growth with a skip-year model will allow us to see how much progress students have made over the last two years, and it will also help us understand the different impacts of the pandemic on different groups of students. Quite simply, it is difficult to address recovery without a sense of where students currently stand.
From a modeling perspective, growth is the change in achievement from one point in time to the next, regardless of whether that span covers one year or two. In other words, many growth models can measure growth from the 2018-19 school year to the 2020-21 school year just as easily as they measured single-year growth in previous years. However, the interpretation of these results may change with a two-year measure, since students' experiences over that period can span multiple grades, schools, and even districts. This is important information for educators and policymakers to consider when reviewing results and forming school improvement plans.
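To make the modeling point concrete, here is a minimal sketch of a simple gain-score calculation on standardized scores, assuming a tidy table of student scale scores by year; the data, column names, and the gain-score form itself are illustrative assumptions, not any state's actual growth model.

```python
# Minimal, hypothetical sketch: growth as the change in standardized
# achievement between two assessment years, whether they are one or two
# years apart. Column names and scores are illustrative only.
import pandas as pd

def growth_measures(scores: pd.DataFrame, start_year: int, end_year: int) -> pd.Series:
    """Change in each student's standardized score between two years."""
    wide = scores.pivot(index="student_id", columns="year", values="scale_score")
    # Standardize within each year so gains reflect movement relative to
    # peers rather than changes in the score scale across years.
    z = (wide - wide.mean()) / wide.std()
    return z[end_year] - z[start_year]

scores = pd.DataFrame({
    "student_id":  [1, 1, 2, 2, 3, 3],
    "year":        [2019, 2021, 2019, 2021, 2019, 2021],
    "scale_score": [480, 510, 520, 515, 450, 500],
})

# Skip-year growth from 2018-19 to 2020-21, computed exactly as a
# one-year gain would be.
print(growth_measures(scores, 2019, 2021))
```

The same function would compute a one-year gain if consecutive years were present; what changes with a two-year span is the interpretation, since the gain now reflects everything that happened across two school years.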
Skip-year growth measures are possible with some statistical models (such as Value-Added Models and Student Growth Percentiles) and with reporting that supports educators in understanding their uses and limitations. This is not a new problem to solve. In 2016, Tennessee did not administer summative assessments in grades 3-8. The following year, the state measured growth over a two-year period, from 2015 to 2017. To support this decision, SAS used prior years' data to compare growth measures calculated with and without a missing year of data. Those simulations showed that the skip-year results were highly correlated with the actual results observed over the same two-year period. Skip-year growth measures, when considered with the appropriate context, can provide insights and support decisions for our students, teachers, and programs.
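For readers curious what such a back-test might look like, here is a hedged sketch on simulated data: it estimates a simple covariate-adjustment, value-added-style school growth measure twice, once using the immediately prior year's score as the predictor and once skipping that year, then compares the school-level estimates. The data, model form, and names are assumptions for illustration, not SAS's actual Value-Added or Student Growth Percentile methodology.

```python
# Hedged sketch of a skip-year back-test on simulated data. All data,
# names, and the simple regression-residual "growth" measure are
# illustrative assumptions, not any vendor's actual model.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_students, n_schools = 5000, 50

school = rng.integers(n_schools, size=n_students)
school_effect = rng.normal(scale=0.2, size=n_schools)

# Three consecutive years of standardized scores with persistent school effects.
score_2015 = rng.normal(size=n_students)
score_2016 = 0.8 * score_2015 + school_effect[school] + rng.normal(scale=0.5, size=n_students)
score_2017 = 0.8 * score_2016 + school_effect[school] + rng.normal(scale=0.5, size=n_students)

def school_growth(current, prior, school):
    """Mean residual, by school, from regressing current scores on prior scores."""
    slope, intercept = np.polyfit(prior, current, 1)
    residuals = current - (slope * prior + intercept)
    return pd.Series(residuals).groupby(school).mean()

year_to_year = school_growth(score_2017, score_2016, school)  # prior year available
skip_year = school_growth(score_2017, score_2015, school)     # prior year treated as missing

# If the skip-year measure recovers the same information, the two sets of
# school-level estimates should be highly correlated.
print(year_to_year.corr(skip_year))
```

A real analysis would use actual historical assessment data and the state's operational growth model; the point of the sketch is only the comparison logic of computing the measure with and without the intermediate year.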
The Hunt Institute: Data validity is incredibly important, but so, too, is how we use skip-year growth data. The 2020-21 school year has been like no other. Even if test data are statistically valid, it is not appropriate this year to use them to dole out consequences; doing so offers no benefit to our educators or to our students. At the same time, the data are invaluable as a tool to assess where we are and to inform where we are going, and we ought to use them as such. State policymakers should work to ensure that student growth data are used not for accountability purposes but to support schools and students.
Watch our recent webinars on the topic of student growth and register for the third installment of our series with DQC – Thinking Creatively to Evaluate Student Growth – today at 1pm ET!