Professional Judgment: Beyond Data Worship
We've been "data driven" for at least a decade in education, with many a fortune made on assessment training for educators. I have no problem with using data to inform instruction, but I am starting to think we've gone too far in demanding that instruction be driven by data.
Collecting data is not a neutral or free activity; you have to decide what's worth measuring and how you're going to measure it, and you have to take time away from something else in order to collect data.
Assessments can be, but often aren't, meaningful learning opportunities for students. They can be helpful, and they're necessary to some extent, but I think we overreach when we implore educators to let data drive their instruction.
At the same time, we've seen a sustained assault on professional judgment, which suggests that educators are somehow less reliable than other types of professionals in making decisions based on their expertise. We expect doctors to give us the necessary tests and scans, but we don't say this data drives their diagnoses or treatment plans; it merely informs.
In other words, professional judgment can be bolstered by good data, but data is useless without good professional judgment.
The reason is fairly straightforward: data can help paint the picture, but it doesn't tell the whole story. That's true for several reasons.
First, data only provides answers to the questions we've posed. It tells us nothing about the questions we aren't asking, the outcomes we aren't measuring, the results we may have neglected to track. The doctor may take your temperature, but if you need a CAT scan, the best thermometer in the world isn't going to do the job.
Second, while data can help us understand the current reality, it doesn't tell us what to do about it. Assessments may tell me which kids aren't grasping the concepts in a math lesson, but they don't give me any clue as to what I should do differently.
Non-practitioners will often interject, at this point in the discussion, that reteaching is the obvious answer. If a kid's not getting it, go back and spend more time explaining the concept. Of course. But teaching doesn't occur in a vacuum where mastery is all that matters; show me a class where everything is taught to mastery and I'll show you a class that's still on Chapter 3 in May.
Education abounds with dynamic tensions like this: whether to reteach or move on; whether to spend time with the individual student or address the whole class; whether to repeat the tried-and-true explanation or try something new; and a thousand more that teachers encounter every day.
School leaders face just as many tensions: whether to stay the course or try a different tack; whether to remain focused on a few programs and strategies or try a new initiative; whether to make another suggestion for improving the lesson or just let the first one sink in.
In an environment of complexity and dynamic tensions, professional judgment is enormously important. We have multiple courses of action to take at any given time, and while the data may be extremely helpful, it does not substitute for judgment.
Third, data doesn't tell us whether our chosen course of action was the best one. Collecting good data does not make education an experimental science; a school is not a laboratory but a culture living out a unique history.
Data can only tell us whether we got good results; it cannot identify the precise cause of those results. Correlation does not equal causation, even when a new strategy correlates with rising scores. Over time, drawing on our varied experiences, we can make good inferences about the reasons for success or failure, and can adjust our actions accordingly. This is professional judgment at work.
It's time to reassert the importance of professional judgment, informed but not driven by data. Data is merely one tool at the disposal of skilled, experienced, and knowledgeable professionals.