
Why Isn't Research Used in Practice? Hint: It Actually Might Be


This week we are taking a break from our regularly scheduled programming to reflect on some important concepts relevant to research-practice partnerships. In today's post, NNERPP's Director (@RPP_Network), Paula Arce-Trigatti, shares her thoughts on the many ways research can get used in practice.

With NNERPP's latest convening at the Annual Forum in late July, I was reminded once again of a deeply held belief that I only recently realized I had to unlearn: when referring to a practitioner's "use of research," people often mean one very specific thing.

In this world, a researcher produces some (breathtakingly) important findings that those in education practice (obviously) flock to, read, and immediately use to correct course. The research-to-practice pipeline is simple and linear in this world: research is produced, it is then disseminated, practitioners incorporate it as great advice into their decision-making process, and voilà — research is used in practice. Given my academic background, it shouldn't be surprising that I found myself as part of this crew.

And as far as I can tell, this way of thinking is widespread. Traditional academic researchers across many disciplines seem to believe that this is the way research is used in practice; generally speaking, it is the only way we are trained to interact with practitioners.  

After becoming immersed in the world of research-practice partnerships, however, I learned that this viewpoint is quite flawed. Not only is this model antiquated; research shows that it is wrong. As Coburn, Honig, and Stein write in their paper, the use of research in practice is "complex and at times messy" (p. 1), which is pretty much the opposite of "simple and linear."

In today's blog post, I'd like to spread the word about the many faces of research use in practice. If we are truly concerned with increasing the use of research evidence in practice and having an impact on education practices, then the least we might do is be aware that research use can take many forms. Even better, we might think about strategies we could adopt that would complement these numerous opportunities, whether working in an RPP or not.  

The many faces of research use

For this post, I draw from a well-known book in circles examining the use of research in practice: "Using Evidence: How Research Can Inform Public Services" by Sandra M. Nutley, Isabel Walter, and Huw T.O. Davies (2007). The authors' goal (as well as mine) is not to suggest that there is a neat, simple typology of research use wherein we can place behaviors in buckets and easily identify when research use occurs. Indeed, Nutley and colleagues are careful to point out that research use "emerges as an iterative, fluid and non-linear process, which may progress through many different types of research use in sometimes unpredictable ways" (p. 46). In many cases, instances of research use are not readily observable and are empirically difficult to code.

Instead, they (and I) suggest a more attainable goal: to illuminate and start a conversation around the many ways research can be used in practice. Here I've included the most common examples of research use I've seen (note that several different frameworks for describing research use have been developed over time; if you're interested in learning more, I suggest reading the book).

Instrumental use: This is the most common understanding of research use and is the name for how I described research use in the simple linear model earlier: research is applied directly to decisions of policy and practice. In the instrumental use of research, for example, a practitioner might say something like: "I was debating whether or not to implement Policy A. After I read the research on Policy A, I decided to go with Policy A."

Conceptual use: The conceptual use of research also impacts policy and practice decisions, but in less obvious or easily measurable ways. This can include causing a shift in an individual's way of thinking about (or "conceptualizing") a problem, and can show up as a change in that individual's knowledge of, or attitudes toward, a particular problem. Caitlin Farrell and Cynthia Coburn recently highlighted the importance of the conceptual use of research in district decision making and provide several illustrative examples of how it might show up.

Strategic or tactical use: As the name implies, the strategic use of research describes a scenario where evidence is used to obtain a specific result. For example, a given research study might be cited to engender support for a particular policy or alternatively, the same study might also serve to discredit an opposing viewpoint. Interestingly, Nutley and colleagues suggest that this type of research use may be more common than instrumental uses.

Process use: Finally, in this use type, the process of conducting or designing research can affect how individuals think about a problem, much like in the conceptual use of research. An added bonus here, though, is that it gets policymakers or practitioners involved in the actual research, bringing important expertise to a typically one-sided endeavor. This use of research may also result in improving communication patterns among different stakeholders and could impact the research itself.

More than the other types of research use mentioned here, the process use of research helps illustrate the promise of RPPs to improve the connection between research and practice. By explicitly involving both researchers and practitioners in the research, RPPs make it highly likely that evidence is used in practice, especially through process use.

Putting these uses into practice

Whether or not you find yourself working in an RPP, you are now hopefully more aware that there are multiple ways to think about research use. This awareness can help us realize that evidence might get used in practice a lot more than we think, in different and perhaps unexpected ways outside of the widespread "instrumental use" definition. Armed with this knowledge, we might next brainstorm strategies to support these many uses; a whole new world connecting research and practice may have just opened up.



The opinions expressed in Urban Education Reform: Bridging Research and Practice are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
