In Defense of Bubble Tests
I'm sure that people far more knowledgeable about higher education innovation than I have plenty of smart and well-informed things to say about Nathan Heller's recent New Yorker piece on elite universities' entry into the MOOC market. So I'll offer only three observations. First, it's refreshing to read an article in an elite media publication that actually acknowledges that there's a lot more to higher ed than Ivy League schools and state flagship universities. Second, while a minor point in the overall piece, I found this discussion of assessment striking, given the widespread derision of "bubble tests" in K-12 education circles:
In Nagy's "brick-and-mortar" class, students write essays. But multiple-choice questions are almost as good as essays, Nagy said, because they spot-check participants' deeper comprehension of the text. The online testing mechanism explains the right response when students miss an answer. And it lets them see the reasoning behind the correct choice when they're right. "Even in a multiple-choice or a yes-and-no situation, you can actually induce learners to read out of the text, not into the text," Nagy explained. Thinking about that process helped him to redesign his classroom course. He added, "Our ambition is actually to make the Harvard experience now closer to the mooc experience."
You mean multiple-choice tests can measure deeper comprehension?!
Third, after reading this piece I really want to check out Harvard professor Gregory Nagy's CB22x course on heroism in Classical literature--and now I can.
The entire piece is available to non-subscribers on the New Yorker site here.