What Indicates Quality? What Demonstrates Accountability?
The New York Post's Melissa Klein and Angela Montefinise tell a fairly disappointing story about the city's Leadership Academy for aspiring public school principals.
The Academy was founded in 2003 "as a separate non-profit organization to recruit, train, and support a new generation of outstanding principals" - 1400 to be exact. It "was modeled after successful private sector initiatives such as General Electric’s John F. Welch Leadership Center and the Ameritech Institute."
In September 2006, "[F]or the Second Year in a Row, NYC Leadership Academy was named as one of the top ten non-profit Leadership Development Programs in North America" by Leadership Excellence - number 4 to be exact. Rating criteria include:
1. Vision/mission. Are these statements linked to strategy, meaningful to participants, and focused on target outcomes?
2. Involvement and participation. How broad is the involvement and how deep the participation?
3. Measurement and accountability. What ROI measures are made and reported and to what degree is accountability for performance and results part of the program?
4. Design, content, and curriculum. How well designed is the program? How credible is the content? How relevant is the curriculum? How customized is the program?
5. Presenters, presentations, and delivery. What are the qualifications of the presenters, how effective are their presentations, and how is the program delivered?
6. Take-home value. What do participants take away and apply to improve themselves, their families, their teams, and their volunteer work?
7. Outreach. What is the impact of the program on stakeholders?
So far, so good. But after reading the reporters' account of the Academy's cost, its output, and how well its alumni are performing, one has to wonder whether Leadership Excellence does too little due diligence, places little weight on the third factor, or has simply set a terribly low bar for nonprofit leadership training in particular or leadership training in general.
Let's start with costs, reported to average out to $146,000 per program participant, each of whom spends about two years in the program. We're talking about New York City, but even so, that's comparable to sending someone to one of the nation's best teachers' colleges on a full scholarship for about the same amount of time. Heck, Harvard Business School tuition is about $40,000 a year, leaving plenty of stipend for a student with a family to subsist on for two years. At that price we have reason to expect results that stand head and shoulders above almost any alternative method of principal training.
In his 1960 classic "Marketing Myopia," Harvard B School prof (and later Harvard Business Review editor) Theodore Levitt wrote that people don't buy drills, they buy holes. By the same token, the NYC Public Schools aren't buying trainees; they are buying school-level administrators. So far the Academy has turned out 277 graduates at a cost of over $40 million. Of these, 191 are principals and 86 are not heading schools today, but 34 are assistant principals, so 139 are doing what they were trained to do. By this measure, the Academy costs nearly $300,000 to fill the seat of a principal or assistant principal.
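The arithmetic behind these figures can be checked directly. The sketch below uses only the numbers reported in the story (Python serves as the calculator here; nothing beyond the cited figures is assumed):

```python
# Figures as reported in the NY Post story cited above.
graduates = 277
avg_cost_per_participant = 146_000            # reported average per participant

total_cost = graduates * avg_cost_per_participant
print(f"Total program cost: ${total_cost:,}")  # $40,442,000 - "over $40 million"

# Graduates actually filling the seats they were trained for:
# principals plus assistant principals, per the story.
serving = 139
cost_per_seat = total_cost / serving
print(f"Cost per seated principal/AP: ${cost_per_seat:,.0f}")  # $290,950 - "nearly $300,000"
```

The reported figures hang together on this point: $146,000 times 277 graduates is a bit over $40 million, and dividing that by the 139 graduates actually serving as principals or assistant principals lands just under $300,000 a seat.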
That's just what it costs to train working principals. Let's look at their results. Academy grads manage 118 schools. The reporters tell us that last year, 27 of those schools were graded D or F, 33 C, and 58 A or B. What we don't know is whether these principals brought their schools up, brought them down, or held them level. Nor do we know whether they did any better than a random sample of principals drawn from traditional career paths and placed in a similar set of schools - or even better than the average principal in the average school or relevant school subgroup.
In short, we don't really know what the city got for its $300,000 principals, whether it might have gotten the same thing by paying America's objectively "top" principals a $300,000 signing bonus, or whether it might have done just as well had it never spent the money in the first place.
The ultimate measure of effectiveness here is 1) total program cost divided by something like 2) how many schools it improved - and to what degree - above whatever results we might have expected before the program started sending principals to schools. When we have the second - and it's knowable, if not known now - we can say something about "value" (results at a price).
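The metric described above can be sketched as a simple ratio. This is an illustration only: the function name is invented for this post, and the "schools improved" figure fed to it is exactly the number the story says nobody has yet:

```python
def cost_per_improved_school(total_cost, schools_improved_above_baseline):
    """Cost per school improved beyond what pre-program results would predict.

    Lower is better. If no school beat its baseline, the program bought no
    measurable improvement, so the cost per unit of improvement is unbounded.
    """
    if schools_improved_above_baseline <= 0:
        return float("inf")
    return total_cost / schools_improved_above_baseline

# Hypothetical input: if, say, 20 of the 118 schools had genuinely improved
# above their expected trajectory, each improvement cost about $2 million.
print(cost_per_improved_school(40_442_000, 20))  # 2022100.0
```

The point of the sketch is the denominator: until someone measures improvement against a pre-program baseline, the ratio - and thus any claim of value - is simply undefined.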
1. This is another example of why there's nothing special about nonprofits serving public education.
What we have is another research-free school improvement program. If this story were about Edison or Kaplan or Platform Learning, the principal program would be in jeopardy. But it's a nonprofit, and so it's held to a lower standard of accountability and outcomes. Let's hope Klein and Montefinise pursue the story further.
2. This is a classic example of management by "it sounds plausible to me."
The idea of the Academy is credible, and if this were Newark instead of New York, that might be good enough. But New York is so big that it can afford a more systematic investment strategy - and so big that it has a harder time recovering if it makes a bad guess.
No one really knows or has proved the one best way to train principals. Consequently, a better human resources investment strategy is to set some measures of merit like the ones I've outlined, decide on several plausible strategies, measure annually, provide the results to the participants and public, and invest more in the ones that work and less or nothing in the ones that don't. Under conditions of great uncertainty, the best procurement strategies involve multiple contractors leading to a "fly off," not a roll of the dice.
New York has several plausible strategies by default: principals are born or made by more than one process. But the Academy is the "crown jewel," and the rest of what's required is lacking. A Chancellor who previously served as chairman and chief executive officer of Bertelsmann, and a Mayor who built his own media giant, in a city that rightly sees itself as the business capital of the world, could manage this.
It's the rare organization that voluntarily reports anything less than great news; that's something purchasers have to require by contract. Similarly, when district management puts too many eggs in one basket and becomes identified with a particular program rather than with decisions about strategy, it generally becomes too invested in what it has decided to see much beyond the good news. That tends to cloud judgment and limit its willingness to share unpleasant results. And that is why taxpayers need the press.
3. This is an example of the one-way accountability that prevails in public school systems.
There's a whole lot of talk in education about accountability - generally coming from the top down, and more broadly from the K-12 managerial/technocratic elite on down.
I'm all for accountability, but if and only if it runs both ways. Anything less is a moral failure. Moreover, fundamental fairness lies at the core of good management. The troops won't follow leaders who don't share their real burdens and responsibility for outcomes. Here, $40 million may be a rounding error in the New York City Schools' financial budget, but it is another draw on the leadership's "credibility budget" with rank-and-file employees.
We ask demanding, objective questions about the "value-add" of school staff to student performance every day, all the time. We need to ask the same questions of those who supply "guidance" and "support," and create at least as much pressure for them to produce. There are legitimate questions about the Academy's objective value - not whether people like the program or its graduates, but whether those graduates raise student performance at a reasonable cost. That two-way accountability puts money in the credibility bank. Frankly, it's the only hope for school improvement at scale.