
Educational Reportage Needs Improvement

The cover story of the latest issue of TIME magazine serves as an example of how journalists often get in over their heads in reporting on educational issues ("Should Schools Bribe Kids?" Apr. 19). Amanda Ripley looked into the work of Harvard economist Roland Fryer Jr., who wanted to know if cash is the answer to motivating students to learn.

Ripley described how Fryer eventually convinced 143 schools in Chicago, Dallas, New York and Washington to participate. Half the students were randomly assigned to a control group, which received no cash, and the other half to an experimental group, which was paid based on performance on 10 routine tests throughout the year. In all four cities, a majority of students were African American or Hispanic and from low-income families. Ripley attempted to explain why the results varied so widely from city to city.

But nowhere did she mention what is known as the Hawthorne effect as a likely factor. People often behave differently at the beginning of an experiment or innovation than they do later on. This temporary change in performance - typically for the better - is a response to a change in environmental conditions. The mere fact that something is new often spurs change. The trouble is that as the novelty wears off, so does the motivation of those involved, even though all other conditions remain unchanged.

Although the term Hawthorne effect was coined in the 1950s by Henry A. Landsberger in his analysis of experiments conducted decades earlier at the Hawthorne Works outside of Chicago, the effect has been observed in educational settings as well. Because Fryer's study focused on children, whose attention spans are shorter than adults', it is an ideal candidate for the Hawthorne effect to make itself prominently felt. Yet Ripley overlooks its obvious relevance.

Ripley is not alone. Coverage of other educational issues by many of her colleagues frequently reflects a lack of familiarity with such pertinent concepts as Campbell's law, Simpson's paradox, the John Henry effect, the principle of the flat maximum, and the base-rate fallacy. So maybe it's time for those reporting or commenting on education to take a crash course. I suggest they begin with "Reading Educational Research: How to Avoid Getting Statistically Snookered" by Gerald W. Bracey (Heinemann, 2006). They may come away appreciating how Occam's razor applies to their work.

(Ripley e-mailed me after the above was posted to explain that she had spent two months on the assignment and took great pains to get the details right. She said that she was indeed aware of the Hawthorne effect, but did not mention it because "the study did not include data for more than a year in some of the cities." Moreover, "the effects of the experiment did not seem to entirely fade the year after the kids stopped getting paid." I appreciate her amplification.)
