Why Did So Many States Choose to Use These Two ESSA Indicators?
The Every Student Succeeds Act was meant to "unleash a flood of innovation" when it came to, among other things, measuring school quality and student success beyond test scores.
So why did so many states end up picking two ways to measure this?
According to my Politics K-12 colleagues' tally, at least 33 states have chosen to measure chronic absenteeism in their accountability systems. And roughly the same number have chosen college- and career-readiness.
One big reason is state departments' ability, or lack thereof, to collect and report reliable data on which they can hold school officials accountable.
It's an issue I wrote about a year ago as state departments started rejecting outright some pretty unusual and innovative ideas from parents and teachers about how best to measure their schools. This caused consternation and confusion amongst advocates who wanted to break away from heavy reliance on testing.
One big issue: whether states and districts are able to retrofit their data-collection systems to answer new and increasingly difficult questions, a potentially arduous and expensive task.
For many measures, state officials say they lack the infrastructure to collect enough reliable information to attach high stakes. Many districts' data-collection systems are scattershot and outdated. Scores of technicians responsible for processing data have been laid off in recent years amid budget cuts. And local superintendents have complained that they're already required by states to collect an inordinate amount of data.
In addition, states must navigate a myriad of data privacy laws passed in recent years.
A report from the Data Quality Campaign this week said that in 2017, 42 states passed 53 new laws that explicitly address how the state collects, manages, uses, reports, and protects data about students and schools.
With so many states picking the same indicators, some experts have told me that states will start working together to build robust data-collection tools and intervention strategies, much as they did after the adoption of the Common Core State Standards.
But a lot of legislators and some advocates (especially from the school climate community) are disappointed with the monolithic plans, and hope that as data become more readily available in future years, states will change or reconsider their indicators.
In the meantime, ESSA requires the collection and public reporting of several new data points, including student arrest rates, teacher experience and average pay, and school-by-school spending. While these data points will be collected and reported, schools will not be held accountable for disparities.
Expect a lot more talk in the coming months at state school board meetings about how best to collect and report this data. States will have to negotiate common definitions and the design of their new report cards with local officials.