
Students Show Progress Under Teacher-Bonus System

By Stephen Sawchuk — February 27, 2013

A performance-bonus system that made use of “student learning objectives”—academic growth goals set by teachers in consultation with their principals—helped improve student achievement in schools using the measure, concludes a new study issued today.

The study, by the Community Training and Assistance Center, a Boston-based nonprofit technical-assistance and policy-evaluation firm, found that students taught by participating teachers improved in math at a rate 12 percent higher, on average, than that of students in comparison schools. That growth was enough to narrow gaps with their peers in those comparison schools, who started out somewhat ahead. In reading, students in participating schools improved at a rate 13 percent greater than that of their peers in comparison schools.

Student learning objectives, or SLOs, are a particular kind of growth measure that doesn’t require the use of traditional standardized tests. Teachers typically engage in a goal-setting process, deciding, for example, to increase the number of students who can proficiently divide fractions. Then, in consultation with principals and colleagues, they select an appropriate way to measure progress toward the goal and devise a teaching plan to reach it. If they reach it, they get a bonus payout.

The study underscores, though, that the findings reflect “more than money”: Using the SLOs purposefully requires principals and teachers to engage in a deliberative process for planning and delivering instruction toward the selected goals.

“SLOs provide a vehicle to ensure that high quality and rigorous student assessments are matched by high quality and rigorous instructional practices,” the report says. “Measuring teacher performance rigorously through the use of student results is not, in and of itself, an improvement strategy.”

Many states are also working to introduce SLOs into their teacher-evaluation systems, as are thousands of districts. Districts that use SLOs for either pay or evaluation purposes include Austin, Texas, and New Haven, Conn. States that have adopted them include Colorado, New York, Ohio, Rhode Island, Pennsylvania, Louisiana, Hawaii, and Indiana.

Grant Funding

The SLOs were introduced to the Charlotte-Mecklenburg, N.C., district in 2007, as part of its winning bid for a Teacher Incentive Fund grant. TIF is a federal initiative that helps states and districts develop performance-based teacher- and principal-compensation plans.

Charlotte’s program began in 2007-08, and the SLOs were rolled out in 2008-09.

The Charlotte grant underwent some significant changes midway through, including the addition of a “value added” model to the bonus system during the 2009-10 school year, which rewarded teachers at the 70th percentile or higher. In addition, the amount of the bonus payouts changed each year.

For the study, CTAC analyzed student-achievement results from state tests, nearly 4,000 SLOs that had been submitted by teachers and approved by their principals, and surveys of educators in the participating schools.

They used a quasi-experimental methodology to look at achievement growth in the TIF schools, and to compare it with that in a host of comparison schools. As described above, the project had positive effects overall on student achievement.

The researchers also examined the specific relationship between the SLOs and student learning. They found links between the quality of the growth goal and student achievement, or between the attainment of the growth goal and student achievement, in some instances but not in others. During the “peak” year, 2009-10, the quality of the teachers’ growth goals was linked to student achievement in elementary reading and math, as well as in middle school math. Also during that year, growth in student achievement for elementary pupils was linked to teachers who attained their growth goals.

Generally, teachers who participated in the TIF program for three years developed higher-quality SLOs and attained them more frequently, the analysis finds. Those who wrote higher-quality SLOs also tended to attain them in the 2009-10 and 2010-11 years. And finally, teachers who received bonuses were more likely to have high-quality SLOs than those who did not earn the bonus.

Interest Growing

The findings are consistent with those from CTAC’s study of the impact of the Denver ProComp pilot program, which it helped to oversee.

The findings also have implications for teacher evaluation because more states and districts are adopting SLOs. They seem to be especially popular for evaluating teachers of grades and subjects for which standardized-test-score data are not available.

The SLO approach to teacher evaluation has some tradeoffs compared with value-added measures. It can be adapted for any subject, but unlike value-added, the individual growth goals typically aren’t comparable across classrooms (though teachers can band together to create common ones).

The paper highlights another potential advantage of the SLOs: The purposeful process of designing and setting them is supposed to double as a form of professional development. (One of teachers’ complaints about value-added is that they don’t know how to hit the targets.)

Although more than a dozen states are using or plan to use SLOs as part of their teacher-evaluation systems, CTAC Executive Director William J. Slotnik noted that not all of them require teachers and principals, as part of the process, to research and outline the instructional steps they will take to meet the goals. That omission potentially weakens the measure’s ability to help teachers focus on pedagogical changes.

“When [evaluation] is more about instruction, you have a greater chance of it being educationally valid for teachers,” he said.

The report also gives insights into teachers’ feelings about the program and its methods of incorporating student achievement. Based on the survey data, some Charlotte-Mecklenburg teachers participating in TIF reported distrusting or not understanding the value-added component added midway through the grant, or feeling that the decision wasn’t made with their input.

But at the same time, they were generally more open to using value-added measures as part of a rewards system than were teachers not participating in the program.


A version of this news article first appeared in the Teacher Beat blog.