Why Assessments Are Still Useful — and Accurate — in a Pandemic Year

By Alex Fernandez and Janice Pavelonis | Feb 12, 2021

As most school systems have learned by now, assessing learning in a pandemic comes with all sorts of challenges, including but not limited to how (and where) to assess students and whether the data being gathered is even accurate. As assessment and curriculum leaders for two different districts, we're still discovering the answers. But what we do know is that assessments can still reveal a lot of essential information about student learning, and despite the challenges, they're still worth conducting.

Our districts, Imagine Schools and Carbondale Elementary School District #95 (CES), might not seem to have much in common at first glance. Imagine is a public charter school system with more than 30,000 K–12 students spread across seven states and the District of Columbia, while CES is a more traditional district with four elementary schools in southern Illinois.

What we share is a commitment to data-driven instruction and, despite the challenges of the past year, a desire to accurately assess our students in a way that helps them recover learning they may have lost during the COVID-19 closures. Here’s how we made key decisions about assessments and what to do with the data we gathered.

Is Remote Assessment Accurate?

During last spring's closures due to COVID-19, one of Imagine's many concerns was the efficacy of remote assessment. What good is assessment, after all, if it doesn't accurately tell us what students know? For a definitive answer to this question, Bill Younkin of the Biscayne Research Group performed some research for us comparing our in-person and remote assessments from the 2019–2020 school year. Sixteen of our schools and approximately 5,000 students agreed to participate. Because we use Renaissance's Star Assessments for reading and math at all of our schools, whether students are on campus or remote, the assessment instrument was consistent throughout the study.

His findings indicated that remote assessment was indeed as effective as in-person assessment, with a couple of minor caveats. Exceptionally low scores were a bit less common with remote assessment, and exceptionally high scores were a bit more common. Both effects were observed more frequently at lower grades and almost disappeared at higher grades.

According to Younkin, the lower frequency of extremely low scores suggests that students take the assessments more seriously at home, with parents nearby. On the other end, the slightly higher prevalence of extremely high scores is probably a result of parents helping their children on the assessment. For this reason, we made a point to remind families that the purpose of assessments is to provide teachers and principals with the data they need to make important decisions about student learning. "Helping" students answer test questions does not really help them at all.

We were encouraged to find that this fall's initial assessment results, which were primarily gathered remotely, follow a pattern very similar to data from the previous four years, further substantiating Younkin's finding that "for the vast majority of Imagine students, remote testing yields similar results as traditional testing does."

Adjusting Assessment Strategies by Age Group

At CES, the key insights from this year’s assessments have been more about the importance of good assessment practice than measured student achievement. For our youngest learners, we had them come to school to be assessed in person, one at a time. For older students, whom we assessed remotely, we talked to our teachers about the importance of doing it in small groups; they, in turn, talked to their students about the importance of assessments.

Overall, testing went very well, but we hit a little trouble, especially with second grade remote testing. Some teachers reported seeing parents leaning over their students' shoulders, or a hand reaching across the screen at times to help their children. Our parents are committed to the positive shifts in our district, so moving forward we will partner with them more deeply to ensure the validity of assessments if our students remain remote.

Making the Most of Assessment Results

Another thing that will be slightly different this year in both of our districts is how we put assessment results to work. With the interruption of in-person learning over the past two school years, we have seen even more variation in student skills, abilities and knowledge than in most years.

To give teachers the data they need to help students progress as quickly as possible, our assessment provider, Renaissance, has released a set of free tools centered around “Focus Skills.” Every teacher knows that some skills are more essential than others. Learning to compare and contrast the written fairy tale of Cinderella and the Disney movie, for example, is important. But learning what sounds each letter makes is a fundamental building block that students must have in place before they can learn to read. In this extraordinary year, we are using assessment to discover gaps in those essential skills so we can get students back on track as soon as possible.

In the end, our fundamental response to school interruptions is to continue doing what we do every day: Find out where students are, and support them in learning what’s next.
