
Reviewing External Quality Reviews, or: Consultant Whack-a-Mole!

I teach at a college that periodically commissions external reviews of the institution and its academic programs. Sometimes these external institutional reviews are "high stakes," such as regional accreditation reviews (e.g., North Central Association, Middle States, etc.) or professional accreditation reviews (such as the National Council for Accreditation of Teacher Education). Out of the corner of my eye, I've been seeing an increase in the reliance of large urban school districts, such as New York City and Washington, DC, on external reviews (sometimes labeled "quality reviews"). I'm intrigued by the similarities and differences I'm observing.

Most external reviews begin with a self-study, which typically has three major dimensions: (a) What are your unit's goals? (b) How well are you meeting these goals, and what's the evidence? (c) What are you going to do about it? This is then followed by the proverbial "site visit," in which an individual or team from outside of the institution reviews the self-study, comes to the campus for a day or two, pokes around and asks questions, and retreats to write a report which is shared with the institution and its leaders. Often, the institution then will write a response to the report. Then the report goes on the shelf.

The composition of the site visit team can arouse some passion. In postsecondary institutions, site visitors typically are conceived of as peers of the faculty; but who counts as a peer is a matter of debate. How can someone from Eastern Podunk College ever understand how we at Elite University do business? Is a site visitor who studies 18th-century English literature really a peer of the faculty in an English department that focuses on contemporary American fiction?

I'm intrigued by the fact that in New York City and Washington, DC, the site visitors are external management consultants who are not educators within the system, and in fact may not be teachers or administrators in other systems. Consultants such as these would be laughed out of the room in a review of a college department; but nobody's laughing in large urban districts. I think this is because college faculty are assumed to have stronger claims to disciplinary knowledge and expertise than K-12 teachers and administrators, and because the shared governance model in colleges and universities gives faculty more control over academic decision-making than K-12 educators are typically granted.

Scholars of organizations make sense of external reviews by drawing on institutional theory. Institutional theory focuses on the relationship between organizations and their external environments, including the ways in which organizations are perceived to be legitimate by their external environments. An organization (e.g., school, district, or college) that is perceived to be high-performing generally doesn't have to worry about its legitimacy. But many educational organizations are not seen as high performers. In this case, they have to rely on some other way to be seen as legitimate than a demonstration of good outcomes. A common strategy is to imitate the practices of other social institutions that are seen as legitimate, in the hopes that the legitimacy will "rub off."

Many cases of education imitating the business world can be explained in this way. (Not that the business world has such a great track record as to warrant serving as the ideal standard.) So, for example, because it's seen as rational for organizations to set goals and measure progress toward them, this is an integral part of most external review processes, much more so than direct inspection of what the organization is actually doing to meet those goals. This would account for the use of management consultants as external reviewers in New York City and Washington. In this sense, external reviews are mostly symbolic, rather than substantive.

This is, of course, a highly cynical view of external reviews, perhaps more cynical than is warranted. I'd like to pose a couple of questions to eduwonkette's readers: (1) What are some legitimate purposes of external reviews of K-12 schools? (2) Based on these purposes, what should the composition of an external review team look like? My purpose in asking these questions is not to play whack-a-mole with consultants (although that may be a consequence), but rather to introduce a topic that I hope to post a bit more about over the next couple of days. I'm also curious whether readers know of any evidence of external reviews actually improving teaching and learning in K-12 schools. Please feel free to e-mail me at skoolboy2 (at) gmail (dot) com to point me in a fruitful direction.

I served on the Chancellor's Quality Review Team for a high school in Washington, DC. While I think our team could have offered insights, all we did was visit classes and fill out rating forms. The whole process was without a context -- a plan for school improvement. So you make a very good point, Eduwonkette, in questioning the purpose of these reviews. Our team was made up primarily of central office personnel -- just three outsiders: one from the PTA, me, representing the community, and a consultant from a firm called "Insight" who seemed to be in charge because he was writing the report on his little laptop as we spoke. The consultant from the outside firm knew nothing about the district and, although he had been a teacher in a past life, he hardly came across as having much expertise in education reform or school improvement. We, the team, will never see the report being written in our name. We will never interact with the staff at the school over the written report.

The whole purpose seemed to be more about making judgments that justify intervention, or worse, contracting the management of schools out to third parties, than about improving the quality of teaching and learning. There was no context for our review team's visit to the school. There is no plan in DCPS under the new Chancellor for improving teaching and learning, or even a clear definition of what good teaching is. How could the team be clear about what they were seeing, and how could the school staff process the feedback they might get? These Quality Reviews only make sense in the context of a clear professional growth system -- a commonly shared definition of an effective school and of what good teaching looks like that the school system is implementing. Otherwise they are just random acts of judgment. Or worse, they are part of an abdication of the Chancellor's responsibility to nurture good schools and good teaching. In another school district, I have participated in and helped to design Learning Walks that have a lot of legitimacy and can be very useful. The devil's in the details -- the make-up of the teams, knowledge of the standards by the school staff, a language of improvement in the district, collaboration by the teachers' union -- and we need some ground rules, or at least criteria for what needs to be in place, to make these useful and not an insult.

I believe the quality reviews in NYC are conducted by Cambridge Education, an education consulting firm from the UK, not management consultants. Moreover, it is my understanding that these reviews are targeted to specific domains, namely the school's effective use of data to drive instruction.

You should also consider the review processes used by charter school authorizers. I'm familiar with charter school site visits in New York and Massachusetts and have found them to be conducted by knowledgeable educators using specific frameworks for evaluating schools. These school evaluations are part of a high-stakes process in which evidence is collected to determine whether schools will remain open or be closed. And their reports are posted on the Internet for all to see.

Hi Gideon,

That's a great idea to look at external evaluations of charter schools. Yes, the NYC quality reviews are contracted with Cambridge Education, which I would characterize as a management consulting firm. You'll be able to see my take on the content of the NYC quality reviews tomorrow.






