On pandemic ‘learning loss,’ schools look forward, not back


By Patrick Wall and Kantele Franko, Chalkbeat and Associated Press



Camden City School District Superintendent Katrina McCombs poses for a portrait at her desk at the school district office Oct. 21 in Camden, N.J. A complete picture has yet to emerge of how much learning was lost by students during the pandemic.

Michael Perez | AP

ONLY ON LIMAOHIO.COM

Read more about the pandemic at LimaOhio.com/tag/coronavirus.

NEWARK, N.J. — A complete picture has yet to emerge of how much learning was lost by students during the pandemic. That’s all right with educators like Superintendent Craig Broeren, whose top concern is figuring out where each student stands now.

Wisconsin Rapids, his small school district in central Wisconsin, isn’t administering any special test to measure how much districtwide progress stalled after classrooms closed in March. Such data wouldn’t capture a student’s unique circumstances or point a way forward, Broeren said.

Instead, the district is sticking with its usual fall assessments. Those tests can roughly estimate learning loss since the spring, but leaders say they are most useful for pinpointing what students know now and tracking how much they learn.

“Frankly, what we lost is less of an issue than where a kid is starting from,” Broeren said, “and using that to inform instruction.”

That approach is the norm nationwide. Most states aren’t requiring all districts to administer uniform tests to measure students’ slippage. Rather, districts generally are using the tests they give each fall to guide instruction for the school year and, in many cases, also assessing students’ mental health and well-being — an approach favored by many experts and educators who say a rush to quantify learning loss could demoralize students and teachers.

But as many schools continue distance learning or brace for more virus-related closures that could further slow progress, the patchwork approach to testing this fall worries some advocates and policymakers who say it’s difficult to plan academic recovery this year without consistent data across districts and states.

“We’re in this data black hole,” said Kyle Rosenkrans, executive director of the New Jersey Children’s Foundation, an advocacy group that plans to hire researchers to estimate how much students have fallen behind. “You can’t prescribe solutions unless you have a sound diagnosis of the scale of the problem.”

Using data from past school closures, researchers have estimated some students might have lost several months to a year’s worth of academic growth after school buildings around the country closed last March. Some policymakers say data is needed urgently to support districts with the largest gaps or plan more drastic statewide responses, such as extending the school year.

Among those calling for a more aggressive effort to measure that loss is New Jersey Sen. M. Teresa Ruiz, a Democrat who co-sponsored a bill to require testing to assess academic and social-emotional needs and to require the state to analyze the results.

New Jersey offered new diagnostic tests to help districts identify students needing extra support, but those aren’t designed to measure statewide trends, and most of its districts are already using other assessments.

“The fact that it’s optional and they’re not requiring the data to go back to them, it just misses the whole intent of what is critically needed,” Ruiz said. “We need to know what has happened during this pandemic.”

Some places are starting to get a partial glimpse of the pandemic’s academic toll. Idaho, which requires grades K-3 to take a fall reading test, found an overall decline in reading skills. And the Washington, D.C., school system likewise discovered a significant drop in the share of young pupils meeting reading targets.

Many educators have braced for such results. Yvette Gonzalez, a fashion design teacher at a high school in El Paso, Texas, said many of her students have had to take on jobs or care for siblings during the pandemic.

“When they’re constantly thinking about COVID and how to survive on a day-to-day basis, a lot of them are not worried about how they’re going to finish their work,” she said.

In Newark, New Jersey, teacher Wirmarie Morales said her own son, a high school senior, worries he’ll have to take remedial classes in college to make up for material he missed. And her fourth-grade bilingual students appeared to start this school year with less confidence and fluency in English than usual.

“They don’t know basic, simple things that they would normally know,” she said. “It is time they lost that maybe, at some point, we can’t get back.”

To measure learning loss on a large scale, researchers need recent test results from across districts, but consistent data will be hard to come by.

States canceled their annual exams last spring at the start of the pandemic, and most are letting districts decide whether and how to test this fall. Few states appear to be collecting the results of those tests.

“It’s information that everybody wants to have,” said Daniel Domenech, executive director of AASA, the School Superintendents Association. “But, right now, the priority is focusing on kids’ needs.”

Educators highlight other arguments for avoiding wide-scale testing this fall.

They worry it could lead to a focus on reviewing past topics, at the expense of teaching new material. Students who faced hardships in the spring or couldn’t access online classes might be misidentified as academically challenged. And testing could eat up instructional time and put more pressure on students already stressed by the pandemic.

Parents and educators in Portland, Oregon, cited such concerns in opposing planned diagnostic testing, which the district suspended to focus instead on “engagement and instruction.” Teachers also raised concerns that having students take tests from home could skew results.

“We know that students don’t have equal opportunities to have good Wi-Fi connections, high-quality technology, a quiet working environment,” said Elizabeth Thiel, president of the Portland Association of Teachers. “What is the outcome when we are using that data to make any assumptions about our students?”

Several states that recommend testing this fall, such as California and Ohio, provided lists of approved assessments, with a focus on diagnostic tests that deliver quick, student-specific results that teachers can use to tailor their lessons and target students who need extra help.

Wisconsin, too, favors teacher-driven assessments and urged districts to “not focus on large scale gap-finding assessments or diagnostics.”

Some school districts are tweaking their usual fall assessments.

Dayton Leadership Academies, a K-8 charter school in Dayton, Ohio, began classes remotely but staggered appointments for students to visit school for several hours of assessments. That allowed kids to be tested in a controlled environment and meet their teachers, Principal Tess Mitchner Asinjo said.

“It felt more like real school,” she said.

The results brought instructors some relief. More students were categorized as being behind their grade level, but the numbers weren’t nearly as bad as they’d anticipated.

Schools in Camden, New Jersey, planned to use teacher-created quizzes and external tests such as the i-Ready online assessment tool, and possibly the assessments the state created to help districts identify learning gaps, Superintendent Katrina McCombs said.

“My gut tells me we are going to take a hit in some of our skills,” she said.

The assessments, she said, will allow the district “to drill down in a laserlike way to see what those skill gaps are and how we can rapidly close those gaps.”

Camden and other districts plan to use the data to design districtwide interventions, such as after-school programs and academic “boot camps” for the students furthest behind.

Many districts are comparing students’ fall scores with midyear tests taken before schools closed, which will give a rough idea of how they’ve fared during the pandemic. But to precisely measure learning loss, districts would need to compare the change in each student’s scores this year to their estimated academic growth in a normal year, said John Gatta, CEO of ECRA Group, an education research and analytics firm.

Most school systems probably can’t do such an analysis themselves, Gatta said.

There is some hope that federally mandated testing in the spring could provide a more complete picture of the so-called COVID-19 slide. Those tests might reveal whether certain districts or racial or ethnic groups lost more ground than others, but the results won’t be available until fall 2021.

That leaves policymakers without data they can use this school year, Massachusetts board of education member Michael Moriarty said at a recent meeting. The state offered free diagnostic tests, he said, but its education department doesn’t have the authority to make districts give the tests or submit the results.

“We’re flying blind right now,” he said.
