A DIY Approach to Educational Research-Literacy
Rounding out our month of guest bloggers this week is RHSU veteran Eric Kalenze. Eric is the Director of Education Solutions at the Search Institute in Minneapolis and the U.S. organizer of researchED. He has nearly two decades of experience as a teacher, coach, administrator, and author. Regularly found at his own blog, "A Total Ed Case," Eric has kindly agreed to pull double duty with us this week.
Welcome to the end of the week, RHSU reader, and thanks for joining me! Thanks also to Rick for letting me take the wheel this week and to Team RHSU's Amy and Grant for all the excellent coordination. It's been an honor and a pleasure to be here.
My Monday post noted that a research-and-evidence "pulse" is indeed beating in U.S. education reform, but not strongly enough to do our various systems much good. On Wednesday, I called out some bad, blockage-causing implementation habits we'd be wise to give up if we want to strengthen this emergent pulse. Today, I'll build further on the "health plan" with some "exercise" advice: things anyone in the field can do to build their research literacy.
Now, I've shared themes like these fairly often in my writing, talks, and independent work with schools, and right about here I'm always sure to emphasize something really important: Just as with any self-improvement endeavor, whether we're talking about weight loss or physical therapy or learning a new language, there's no avoiding that this part is on us. The same way a personal trainer can't lift the weights that strengthen our muscles, the "exercise" part of building our research-and-evidence health is (1) going to take some real work and (2) not something we should count on being provided for us.
Think of it as you might find it in Rick's Cage-Busting Teacher: You can stay in the cage, waiting for your district's PD department to give you the right kinds of time and structure (yeah, right), or you can outline a plan of study and get reading. You might then use what you learn to "manage up" and effect school-wide change, perhaps by pushing back on some evidence-weak fad initiative. Whether you do or not, you're still likely to come away with some ways to improve your personal craft, and maybe even rub off on a few teammates' practices. (And then, boom: you're not just a cage-busting teacher but also at the front of a truly classroom-up reform, a leader of evidence-informed practices. Congratulations.)
The good news is that, if you're willing to put in the work, this is a great time to become more research-literate on your own. A research-and-evidence pulse is now beating out there, after all, causing lots of good reading to hit shelves for your first-level study. Additionally, the web can quickly connect you to many great guides and partners. Open a Twitter account and poke around a bit, and you'll be conversing about sound evidence bases, reading practical blogs by other educators, and arguing about effective approaches within minutes.
With all that in mind, here are a few tips that might give some shape to your personal research-and-evidence "exercise plan." Whether you're a classroom teacher, a building leader, a journalist, a policymaker, or whoever, minding some of these guidelines should minimize risk of "injury" and, I hope, collectively push our new research-and-evidence pulse to maximum efficiency and effectiveness.
1. To begin, ask ONE question. Beginning with a single focus for inquiry ("What is the effect of balanced literacy programs on students' reading outcomes in my state?" or "Is there evidence that 1-to-1 iPad initiatives have improved students' learning in districts that have implemented them?") can cordon off more workable spaces and keep distracting or erroneous information under control.
2. Run, don't walk, to the cognitive science. Many of the least-effective yet furthest-reaching approaches in education are based on ideals about how people learn best, not on the science of how they actually learn. The good news is that the past few decades of advances in cognitive science can give us much more reliable, trial-tested information about human learning. Regardless of your role in education, I'd strongly suggest getting up to speed with these concepts as quickly as possible. Here are some starter resources:
- Deans For Impact: The Science of Learning document
- The work of Dr. Daniel Willingham. All of it. Start with Why Don't Students Like School?
- Make It Stick: The Science of Successful Learning
- The Learning Scientists website: cognitive scientists bridging to educational application (includes downloadable FREE resources)
3. Plug in to other practitioner-researchers. A worldwide community of professionals passionate about education research can be found on Twitter, and many of them maintain blogs that discuss practice, reform, resources, etc. If you've put off opening an account, at least do it for this. You won't regret it.
Also, seek out conferences that emphasize evidence-based educational improvement. I've mentioned researchED a few times throughout this series and will do so again here because its practitioner-level engine and its focus on evidence suit this "personal research-and-evidence exercise plan" vision so well. Our next U.S. event is not yet scheduled, but be in touch if you'd like us to bring one closer to you.
4. Know how we got here. Education's history is important to know, not least because we're so inclined to repeat our mistakes, a point Rick makes well in the "Why History Matters" chapter of Letters. Plus, histories are rich with gateways to further study via their deep references and indices. Try these for starters:
- Diane Ravitch's Left Back and The Death and Life of the Great American School System
- Herbert Kliebard's The Struggle for the American Curriculum, 1893-1958
- E.D. Hirsch's Why Knowledge Matters
- David Tyack and Larry Cuban's Tinkering Toward Utopia
5. Stay focused on what "good evidence" is. Pay special attention to matters of context, sample size, and actual student impacts. When you look under the hoods of research studies (and certainly under the hoods of media reports of research studies) for these pieces of information, you may be surprised by how they can color "promising findings."
For example: If a study's abstract reports that a particular math intervention showed "significant effects on student outcomes" but the methods show the sample to be 30 students in an affluent suburban school, you may not want to rely on that study alone as your evidence, especially if the context you work in is markedly different. Similarly, be wary of studies that measure effectiveness with things like teacher-satisfaction surveys. Teacher satisfaction matters, of course, but it is not (or shouldn't be, anyway) the first measure of whether or not something is "working" in a school.
And yes, studies and journalistic pieces like these are out there. Stay focused on what matters (kids' progress, that is, not just kids' or teachers' satisfaction or that the school was awarded a Blue Ribbon by Apple or whatever), and look for whether that progress happened at a sufficiently impressive scale. And if the data presented seem vague ("Some of our kids grew by 200%," the school founder said...), either move on or request more specificity.
With that, happy researching! Thanks in advance for doing the exercise that'll strengthen our collective research-and-evidence pulse—and reach me via the contact info on my blog or on Twitter (@erickalenze) if you have questions.