ISO Better School Performance Reporting
From The Alexandria Times, October 3, 2024. Click below to watch Agenda: Alexandria's September 23 program, "How Are the Kids Doing and How Many Are Coming?", on Alexandria's public schools.
Alexandria City Public Schools recorded a 55% pass rate in math and 61% in reading on Virginia’s standardized tests, known as the Standards of Learning or SOLs – according to an Aug. 20 report by The Washington Post.
“The district has long trailed other Northern Virginia districts in performance metrics, in part because the division has a higher concentration at about 67% of economically disadvantaged students than its neighbors. Alexandria also has a large share – 43.6% – of English language learners, traditionally one of the lowest-performing student subgroups on assessments,” the Post article reads.
Officials were quoted as saying that ACPS “will continue to work on a tiered intervention approach to reach students who need the most support.”
The Post’s reporting is more evidence, if any is needed, that proficiency-percentage, or pass-rate, reports do little more than reflect student demographics. Pass rates are a very limited measurement.
Here’s an example: School A serves primarily affluent students, all of whom start the school year with state assessment scores well above the state-designated proficiency level, say 70%; assume School A sits at the 80th percentile on the state assessment. School B starts the year with most of its students below the proficiency threshold, perhaps at the 25th percentile. By year’s end, School A’s students may not have moved in the test score distribution at all, remaining at the 80th percentile. They still learned, but no more than other students who started in the same place. School B’s students, by contrast, may have learned a great deal relative to students at other schools, raising their performance to the 50th percentile by the end of the school year.
In this example, media reports based on pass rates would make School A appear to be a fine school and School B appear to be struggling, even though the data indicate that School B is contributing more to student achievement than School A.
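For readers who want the arithmetic laid out, here is a minimal sketch, in Python, using only the hypothetical percentiles from the example above; the school names and figures are illustrative assumptions, not actual assessment data.

```python
# Minimal sketch: a snapshot (pass-rate-style) view versus a growth,
# or "distance traveled," view. All numbers are the hypothetical
# percentiles from the example above, not real school data.

schools = {
    # school name: (percentile at start of year, percentile at end of year)
    "School A": (80, 80),
    "School B": (25, 50),
}

for name, (start, end) in schools.items():
    growth = end - start  # percentile points gained over the school year
    print(f"{name}: ends the year at the {end}th percentile; growth = {growth} points")

# A snapshot report ranks School A ahead because it looks only at where
# students finish; a growth report credits School B for moving its
# students much farther up the distribution.
```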
Because a school’s standardized test results reflect its enrollment demographics, they tell us nothing about individual student progress. The goal should be to measure, as accurately as possible, the progress of every student, with parallel measurements at the classroom and school levels. Reported on a timely basis, this information will assist parents, teachers and students irrespective of their position on the achievement scale.
The report card, for schools or students, is from the “Dark Ages,” especially compared with the information now generated by computer-assisted learning.
An Alexandria economist recently described “waiting more than 50 years” for school systems and the media to report on the distance traveled, that is, how effective schools are in moving students up in their learning, rather than on pass rates. The Alexandria Tutoring Consortium, in contrast with many school systems, is assiduous about describing reading growth in the students it serves.
A useful analogy showing the need for more meaningful metrics might be professional baseball. Prior to the revolution in baseball analytics that began about 30 years ago, players were evaluated on familiar statistics, for example, batting average, runs batted in and pitcher wins. New measures of player performance that more accurately indicate success were presented compellingly in Michael Lewis’ book, Moneyball, and the movie of the same name starring Brad Pitt. Those measurements – a new language – have thoroughly permeated baseball and the media. These metrics are gathered in every game and are used to teach players how to improve.
The media is complicit in the lack of reporting on how schools contribute to student academic growth over time. It is easier to report on what the schools and state authorities hand out than it is to analyze whether that data provides what the public really needs to know.
School systems, including ACPS, have data at the systemwide, school and teacher levels about student progress over specific periods, particularly in areas like vocabulary and moving from single-digit to double-digit addition. There is more variability as inquiries become more specific: the yearly progress students make in a fourth-grade class cannot be compared to a year’s progress in a high school chemistry course.
Education data nerds – you have the data on who you are – and reporters who cover schools should join forces to renounce old ways. The goal should be to develop, and carefully explain, new ways to measure how effective schools and teachers really are in contributing to student learning over time – the “distance traveled.” Call it the SCF – the School Contribution Factor.
If we can move from reporting and talking about standardized test pass rates to understanding how effective schools are at contributing to longitudinal academic growth, we will take a huge step toward helping our schools and students.
We need a revolution and we need it now.
The writer is a former lawyer, member of the Alexandria School Board from 1997 to 2006, and English teacher from 2007 to 2021 at T.C. Williams High School, now Alexandria City High School. He can be reached at aboutalexandria@gmail.com, and free subscriptions to his newsletter are available at aboutalexandria.substack.com.
Mark: As you know, I, along with two others, started the Tutoring Consortium years ago. It first helped young struggling students, but almost as important, it got community members into the schools. Thirty years later it is still going strong. I took the liberty of forwarding this to a dozen educators across the country. It is really good.
I think this is a needed addition (and a shout-out to the Alexandria Tutoring Consortium, which I volunteered with for years), but I think this is a “both/and” situation. We need to see the data on progress, how much schools have moved kids’ skills and knowledge, but also where students are with respect to achievement. If we think these benchmarks for achievement are necessary to see whether kids have learned what they need to, we still need to know how many kids are meeting them and how many are not.