
Professors Strike Back?

What's worse: evaluating college quality using standardized tests (Madame Secretary's pet project), or relying on Rate My Professors? At Rate My Professors, students rate their professors on "educational" qualities like their hotness, their easiness, their helpfulness, and their clarity. (Here's a nice Village Voice article about RMP; hat tip: Mike Arnzen). Now MTV has kicked off a spoof called "Professors Strike Back," in which profs respond to comments ranging from "I want to be her slave" to "Eats children for breakfast."

A mocking blog called Rate Your Students has emerged in response - you can read about some unbeloved students in this post (Head-Nodders, Laptop Kids, Winter Flip Floppers, and Some Nefarious Wannabe Gangsters. Where is that Walmart Application?).

Don't get me wrong - I'm all for the course evaluations that are typical at most campuses. Because everyone (who shows up) completes one, you have a full sample of students - not just the angry and elated - and narrative sections allow students to provide meaningful feedback on how to tweak the course in the future. Propositioning is generally not included, though students still throw in the occasional pediatric temper tantrum.

I'm undecided on whether and how colleges should make course evaluations public. On one hand, the public release of formal evaluations would help students decide among many courses. On the other hand, a student-driven evaluation system creates incentives to pander to Gen Facebook, and further encourages the "I'm paying, so I deserve an easy A" consumerism of many students.

So I'm on the fence about the role of course evaluations in assessing college teaching. Readers, what's your take? How should profs' teaching be evaluated?

Update: To clarify, there are at least 4 questions raised by this post:

1) How should learning be evaluated in college?

2) Are course evaluations a fair and comprehensive measure of college teaching? (Of course not, in my opinion.)

3) What should universities do with student course evaluations?

4) What are the potential risks/benefits to students and profs of making them public?

5 Comments

Sherman Dorn:

Wait a second... when did a survey become "evaluation"?

skoolboy:

This is a double-barreled post, eduwonkette. Are you interested in how colleges should evaluate faculty teaching, or in what should be done with students' end-of-term course evaluations? Let me be provocative (a skoolboy's prerogative) and say that I don't think end-of-term course evaluations should be treated as a measure of what students have learned, regardless of what the students have said. That's because the depth and scope of that learning may not be evident on the last day of the semester. It may not be until sometime later, when the course and its content are juxtaposed with different contexts and subsequent learning experiences, that what was learned in the course comes into focus. I'd rather rely on a student's retrospective account a year later. In my own experience as a student, I haven't always been able to predict what would stick with me (or "transfer," the more technical learning term) after a course is over.

(Of course, maybe I'm just rationalizing the fact that some of my students come away from my classes saying, "What the hell was that all about?")

eduwonkette:

Sherman and Skoolboy,

You're right that there are multiple questions posed in this post:

1) How should learning be evaluated in college?

2) Are course evaluations a fair and comprehensive measure of college teaching? (Of course not.)

3) What should universities do with student course evaluations?

4) What are the potential risks/benefits to students and profs of making them public?

The postponed-evaluation approach is interesting, skoolboy. I can recall two very negative course evaluations that I wrote in college. If I could write them again, one of the two profs would receive a positive evaluation.

everydayjae:

1) How should learning be evaluated in college?
I agree with skoolboy that evaluations are less about "what" a student has learned than about how they learned it. Was the course effective and provocative? Did students think, explore, and change viewpoints? Evaluations can assess this in a more abstract way.

2) Are course evaluations a fair and comprehensive measure of college teaching? (Of course not, in my opinion.)
While course evaluations may arguably be unfair, they present a broad enough sample of student opinion to offer the school and the professor a wide-angle view of a course's value. Most often (at least in my experience), evaluations were optional and anonymous, making it easier for students to evaluate the professor fairly, i.e., honestly, thoroughly, and without it affecting their grade.

3) What should universities do with student course evaluations?
In the case of public universities, the state-funded ones, evaluations should be made public. People paying for the education of their children, their neighbors, and others should have the right to know how professors are being evaluated, just like any other state-paid professional. However, I do think that releasing only the previous year's evaluations would help prevent "professor shopping" by students looking to skate through school, while still providing a way to publicly reveal trends in courses or professors that might need restructuring or reconsideration.

4) What are the potential risks/benefits to students and profs of making them public?
I think the potential benefit would be that professors would be required to remain up-to-date in their fields, students would be able to choose courses and professors that suited their learning styles and preferences, and higher education, like public education in the primary grades, would be forced to reform to meet the demands of a changing society and workforce. Unlike technical schools or research programs, much of college education still takes place in a traditional lecture format that is increasingly irrelevant to today's students.

Doctor T:

I had a colleague who received criticism about his scores on the end-of-term evaluations.

The next term, he told students that anyone who received an A on his exams would get free pizza and cola at a pizza party, his treat.

His course was just as hard: same requirements, same hard exams. Yet his evaluation scores went up significantly.

It's all about perception. Students now perceived him as "caring" about them. Before, all the hard work he put into the class wasn't enough evidence that he cared about what they learned. The power of free pizza cannot be ignored!

Comments are now closed for this post.

