Opinion

Fractal Patterns, Playing Jazz, and Other Variations on a Metacognitive Theme

By Contributing Blogger — January 29, 2015 5 min read

This post is by Sarah Collins Lench, the Director of Policy and Innovation at the Educational Policy Improvement Center in Eugene, Oregon.

This is not another education think piece on “measuring what matters.” I’m not writing to echo the clarion call for more and better assessment of 21st Century skills, noncognitive factors, metacognitive learning strategies, or whatever the nom du jour may be.

That’s not to say these factors don’t matter. Quite the contrary: there is widespread agreement among researchers, business leaders, educators, and policymakers that success after high school requires more than content knowledge and academic skills. Dr. David Conley and my colleagues at the Educational Policy Improvement Center (EPIC) have synthesized two decades of research into a definitional model of college and career readiness that includes metacognitive components such as goal setting, self-direction, and self-efficacy. Surveys of major employers consistently identify “soft skills” such as creativity, collaboration, and communication as critical attributes for the 21st Century workforce. Drs. Carol Dweck and Angela Duckworth have added growth mindset and grit, respectively, to the aspirational vernacular of “what our kids need to succeed.”

The Innovation Lab Network (ILN), a group of states convened by the Council of Chief State School Officers, has worked over the past four years to develop a shared understanding of college and career readiness across dimensions of knowledge, skills, and dispositions. Individual states are increasingly codifying commitments to the deeper learning outcomes of their high school graduates; see Colorado's Postsecondary Workforce Readiness criteria, Maryland's Skills for Success, Maine's Guiding Principles, New Hampshire's Work Study Practices, and Hawaii's definition of readiness that incorporates the indigenous concept of wayfinding. Proclamations like these signal to the field that skills and dispositions are important, essential even, yet leave us wanting in terms of how to support, compel, and incentivize the education system to actually incorporate them into regular classroom practice.

So how do we operationalize essential skills and dispositions into policy and practice? EPIC is partnering with the University of Kentucky’s Center for Innovation in Education (CIE) and a group of school, district, and state leaders to address this very question.

There’s an instinct here to jump right to assessment as the policy solution: “Now that we’ve proclaimed, we need to assess. Assess to define, assess to measure to hold accountable.” But that poses a whole new set of problems: these things are notoriously hard to measure using large-scale assessment instruments (though people far smarter than I am are doing incredibly interesting work in this area); these things are often ill-defined for the field as fixed standards of practice (or worse, fixed character traits); and the move assumes the current configuration of measurement and accountability (and responsibility) can deliver the deeper learning outcomes we seek. “If only we could find the right test, we could drop that sucker into our school quality index, high five ourselves for using multiple measures, and call it a Next Generation day.”

My colleagues and I are proposing a different approach: to lead with the learning. We’re working with educators from around the country to unpack and explore how a set of essential skills and dispositions (collaboration, communication, self-direction, and creativity) develops across content areas and over time. Adapting a model of skill acquisition inspired by how people learn to play jazz, we’ve created developmental frameworks that describe the progression of these skills from beginner to emerging expert, from rules to analysis to intuition, from tinkering to practice to flow, from controlled context to far transfer across domains.

As a community of practice, teachers use the frameworks to inform how they design learning experiences, learning environments, and learning relationships. Take, for example, the classic and dreaded group project. You know, the kind where one person ends up doing all the work. Why? Because it’s just easier that way. Why? It might be because group members aren’t very good collaborators, but more often than not, it’s because the project isn’t truly a collaborative task. A deeper understanding of collaboration helps a teacher construct learning tasks where the division of labor produces efficiency and where combined contributions increase the quality of the work product. Frameworks can also help identify collections of evidence that make the collaborative process visible alongside the final work product (e.g., reflective journals, process portfolios, or think-alouds).

Understanding the development of essential skills and dispositions, we can better understand the contexts and conditions in which they thrive. Understanding contexts and conditions, we can make better policy choices. Our work is fairly nascent, but we’ve got a few hunches based on observations of early innovators. First and foremost, these deeper learning outcomes have a fractal effect on the whole system: the very skills and dispositions (and mindsets and behaviors) we hope to foster in students must also be present in the adults of the system, who need to be self-reflective, proactive, willing to try new things, and courageous enough to fail.

In the interest of being self-reflective myself, I’ll return to my early assertion that this post is not about measurement and assessment. That’s not entirely true. We can’t do this work, we can’t “lead with the learning,” without good assessment. Good assessment, however, honors the Latin root of the word, which means “to sit beside.” We need rich and authentic means of collecting evidence and providing feedback to guide the continued development of these skills and dispositions. Student self-reports, teacher observations, and curriculum-embedded performance tasks hold promise as evidence of learning, all the more so when combined with other information. And just as learning tasks need to make product and process visible, measures of school quality may need to be designed to make visible both student outcomes and school enabling conditions. Each of these evidentiary approaches bumps up against the validity, reliability, and feasibility constraints of accountability policy as it is currently conceived.

But a classroom designed to foster essential skills and dispositions will look very different from one focused on the compliant transfer of content knowledge. So, too, will a policy system. What that looks like, we’re still figuring out. Like learning to play jazz.

The opinions expressed in Learning Deeply are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.