School & District Management Opinion

Partnering to Improve: Insights from Baltimore

By Urban Education Contributor — August 10, 2017

This week we are hearing from the Baltimore Education Research Consortium (BERC, @BaltimoreBERC). Today’s post is the practitioner perspective on the approach introduced in Monday’s post: The Case for Improvement Science.

This post is by Christian Licier, Assistant Principal at Baltimore City Public Schools; Jarrod Bolte, Executive Director of Improving Education and formerly of Baltimore City Public Schools; and Marc L. Stein, Research Co-Director of BERC and Associate Professor at the Johns Hopkins University School of Education.

At the Baltimore Education Research Consortium, we have put Improvement Science methods to work to shift student outcomes. Improvement teams of researchers, teachers, and administrators have used these methods to improve literacy outcomes for elementary school students and to reduce chronic absence in high schools.

In this post, we detail two practices that we as practitioners have found helpful in supporting school teams implementing improvement science projects: process mapping and identifying data for improvement.

The Importance of Process

A key learning from our work is encapsulated by a quote from W. Edwards Deming: “If you can’t describe what you are doing as a process, you don’t know what you are doing.” Collective process mapping, and the shared understanding that comes from the exercise, has been invaluable in helping us identify opportunities to introduce systematic changes that can lead to improvement.

Our example comes from a team of kindergarten and first-grade teachers working on improving literacy outcomes who noted that the loss of instructional time during transitions from centers was a problem in their daily practice. There were many ideas as to what was causing the delays, and many proposals for how transitions could be improved. By mapping their ideal process, teachers were able to uncover misconceptions, clarify steps, and discuss concrete ways to improve. They uncovered variation in the existing transition processes in each classroom and envisioned what an improved, shared process might look like. Following this exercise, teachers predicted what improvements they would see and began collecting their own data on the process, aiming to identify the causes of slow transitions and to refine a process that could be shared across classrooms.

Over the next three weeks, teachers collected data on the time it took to transition and the number of student disruptions during instruction. This allowed them not only to test changes intended to decrease transition time but also to learn whether those changes affected student engagement or understanding. Teachers tested different strategies, making adjustments each day based on the data and what they were learning, and began to determine which strategies worked and which did not. By continually refining the process, they arrived at an efficient shared process for transitions used by all members of the team.
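To give a sense of how lightweight this kind of measurement system can be, here is a minimal sketch in Python, assuming each logged transition was recorded as a (strategy, seconds, disruptions) entry. The strategy names and values below are hypothetical, not the team's actual data.

```python
# Minimal sketch of a transition-time tally. ASSUMPTION: each observed
# transition is logged as (strategy, seconds, disruptions); the strategy
# names and numbers here are illustrative placeholders.
from collections import defaultdict
from statistics import mean

transitions = [
    ("call-and-response", 95, 3),
    ("call-and-response", 80, 2),
    ("timer-on-board", 62, 1),
    ("timer-on-board", 58, 0),
]

# Group observations by the strategy being tested.
by_strategy = defaultdict(list)
for strategy, seconds, disruptions in transitions:
    by_strategy[strategy].append((seconds, disruptions))

# Compare strategies on average transition time and average disruptions.
for strategy, obs in by_strategy.items():
    secs = [s for s, _ in obs]
    disr = [d for _, d in obs]
    print(f"{strategy}: avg {mean(secs):.0f}s per transition, "
          f"avg {mean(disr):.1f} disruptions")
```

Even a tally this simple supports the daily adjust-and-retest cycle the team used: each day's observations either confirm or contradict the prediction made before the change.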

The Importance of Identifying Data for Improvement

Our example here comes from a secondary school working on student attendance and chronic absenteeism. We were confronting the following problem:

Over the course of the first semester of the school year, a number of students had accrued enough absences to have almost a 100% probability of becoming chronically absent. What could we do to help these students miss fewer days over the second half of the school year?
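To make the arithmetic behind "almost a 100% probability" concrete: a minimal sketch, assuming the common definition of chronic absence as missing at least 10% of enrolled days. The 10% threshold and 180-day year below are assumptions for illustration, not figures from our work.

```python
# Sketch of a "locked in" check for chronic absence. ASSUMPTIONS: chronic
# absence means missing >= 10% of enrolled days, and the school year is
# 180 days; both figures are illustrative, not from the post.
THRESHOLD = 0.10
TOTAL_DAYS = 180

def chronically_absent_regardless(absences_so_far: int) -> bool:
    """True if the student exceeds the threshold even with perfect
    attendance for every remaining day of the year."""
    return absences_so_far / TOTAL_DAYS >= THRESHOLD

print(chronically_absent_regardless(17))  # False: 17/180 is about 9.4%
print(chronically_absent_regardless(18))  # True: 18/180 is exactly 10%
```

Past that point, no amount of second-semester attendance can dilute the first-semester absences below the threshold, which is why mid-year intervention was so urgent.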

We realized that while we knew that these students had missed a lot of days of school, we did not actually know how they were missing those days: Did absences happen all at once? Did they happen a day or two a week and add up over time? Were absences whole days or parts of days? Were students missing morning classes or afternoon classes?

We decided to look more deeply at student attendance by class, but quickly realized that the existing attendance reports and dashboard metrics available to us were insufficient for the task, as they required significant effort to extract, reconfigure, and analyze. We developed simple tools to transform what was available from the existing systems into usable data: improvement data. Once we were able to inspect attendance data at this level, in its most basic and raw form, we discovered that for most of these students, class cutting and tardiness were major contributing factors to their low attendance rates. In other words, most of the students at risk of becoming chronically absent were actually attending school most days; they just weren't attending all of their classes. While the school team was aware of issues with class skipping and tardiness, it had not previously had empirical data describing the extent of the problem or its patterns.

By being able to see an individual student's pattern of class cutting, we could develop appropriate interventions for each student. For example, we could identify that a student with a 78% daily attendance rate had a 35% attendance rate in his third-period class. We were able to look at the student's schedule, sit down with him, and figure out that the real issue was that he was struggling in his English class, so he skipped it. This level of understanding allowed us to develop and implement individual student and whole-school intervention strategies to address the root causes of tardiness and class cutting.
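Per-class rates like these fall out of a very small amount of code once period-level attendance can be exported. Here is a minimal sketch, assuming the student information system can produce one row per (student, date, period) with a present/absent flag; the export format, field names, and records below are hypothetical.

```python
# Sketch of the per-class attendance check. ASSUMPTION: the attendance
# system exports one row per (student_id, date, period, present); the
# records here are illustrative placeholders.
from collections import defaultdict

records = [
    ("S01", "2017-01-09", 1, True),
    ("S01", "2017-01-09", 3, False),  # cut third period
    ("S01", "2017-01-10", 1, True),
    ("S01", "2017-01-10", 3, False),  # cut third period again
]

def period_rates(rows, student_id):
    """Attendance rate per period for one student."""
    totals = defaultdict(lambda: [0, 0])  # period -> [days present, days scheduled]
    for sid, _, period, present in rows:
        if sid != student_id:
            continue
        totals[period][1] += 1
        if present:
            totals[period][0] += 1
    return {p: present / scheduled for p, (present, scheduled) in totals.items()}

for period, rate in sorted(period_rates(records, "S01").items()):
    print(f"Period {period}: {rate:.0%} attendance")
```

A period whose rate sits far below the student's overall daily rate, like the 35% third period against a 78% daily rate in the example above, is exactly the signal that points to class cutting rather than whole-day absence.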

Conclusions

While schools and districts grapple with improving outcomes through large-scale initiatives, we often forget about the underlying processes and data in our daily work. The key to Improvement Science is to correctly identify the underlying cause of a problem and to develop the simplest, most efficient system for solving it. Process mapping and identifying data for improvement have helped us shed light on hidden information and thus make progress on challenges we have struggled with for so long.

Improvement Science can help practitioners overcome apprehensions about data that have built up over years of data obsession and unfocused overuse, which has cost countless hours of work with little to no positive result to show for it. Educators are not analysts who can afford to spend the majority of their time examining metrics and raw data. Our work with researchers, who do have the skills and time to work with data, has allowed us to put data in its appropriate supporting role as an informational tool that lets educators do their actual jobs. At the same time, Improvement Science engages practitioners in the design, testing, and iterative prototyping that lead to accelerated learning about our systems. It shifts us toward becoming learning organizations, which only makes sense for a sector focused on helping others learn.


The opinions expressed in Urban Education Reform: Bridging Research and Practice are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.