Opinion Blog


Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

Assessment Opinion

How U.S. STEM Practices Compare Internationally

By Rick Hess — May 20, 2019

The OECD recently issued its new book-length report, “Measuring Innovation in Education 2019.” As I recently noted, the authors offer some fascinating peeks at how the OECD nations compare when it comes to K-12 policy and practice. Today, I’ll flag five big questions that they help to answer in the case of STEM. (Note: All of the following results were calculated using TIMSS data.)

How frequently do students use computers when learning math? The authors calculated the share of fourth graders who “frequently” practice skills and procedures “on computers during math lessons,” according to teachers. In the U.S., between 2007 and 2015, the figure rocketed from 18 to 79 percent (Figure 2.3). That 61-point jump outpaced the growth in the OECD average, which leapt 42 points, from nine percent to 51 percent, and was outpaced only by the increases in Australia and New Zealand.

Do American students do more—or less—math homework than their peers abroad? We hear a lot about American students doing too much homework . . . or too little. Well, the authors use teacher reports to calculate the share of eighth graders whose teachers gave them homework “at least twice a week.” The U.S. turns out to be on the high end, with 86 percent of eighth graders getting homework at least twice a week in 2007, compared to an OECD average of 56 percent (Figure 8.1). That gap has narrowed some, however, as the OECD figure has remained static while the U.S. figure fell by 11 points, to 75 percent, in 2015.

When teaching science, do teachers demonstrate experiments to their students? The OECD average share of fourth graders whose teachers reported doing “an experiment or investigation in at least half” of their lessons rose steadily, from 13 percent in 2007 to 35 percent in 2015 (Figure 3.9). The figures in the U.S. closely tracked the OECD average, rising from 16 percent in 2007 to 36 percent in 2015.

While in science class, do students actually perform experiments? Teacher reports were used to calculate the share of fourth graders who “conduct[ed] experiments or investigations in at least half” of their lessons. In 2007, 45 percent of U.S. fourth graders did so, a rate substantially higher than the OECD average of 33 percent (Figure 3.11). Between 2007 and 2015, however, the gap closed, as the OECD average climbed to 46 percent (fueled by huge increases in nations like Australia, Singapore, and Norway) while the U.S. figure rose at a slower rate, to 53 percent.

How do U.S. efforts to use incentives to recruit and retain STEM teachers compare? In 2007, according to principal reports, seven percent of U.S. eighth graders attended schools that “use[d] incentives to recruit or retain eighth grade math teachers,” while 10 percent of their peers across the OECD nations did so (Figure 12.5). By 2015, the figures had ticked down just a hair, to six percent in the U.S. and eight percent across the OECD. When it came to eighth-grade science teachers, the story was nearly identical (Figure 12.6). While a handful of nations, like Singapore, England, and the Russian Federation, have made incentives a substantial element of their recruitment and retention efforts, that is clearly not the case in the U.S.—and, for better or worse, it’s not the norm across the OECD nations.

What should we make of all this? At least three points seem noteworthy.

First, it’s not at all clear whether a given finding means the U.S. is doing something “right.” For instance, we just don’t know if more math homework is a good thing or not. Now, I tend to suspect that having students do experiments and using incentives to attract STEM teachers are probably good ideas—but I’m confident that how you do these things matters more than whether you do them, which means the simple numbers I’m reporting can’t tell us which nations are getting it right.

Second, for all the fuss and fury about how we’re doing on STEM, the fact is that our policies and practices don’t look all that different from those of the other OECD nations—not in 2015, nor even in 2007. In fact, outside of computer utilization in classrooms, it was tough to find STEM practices or policies where the U.S. was dramatically different from the norm.

Third, when one surveys a couple hundred pages of tables, it’s striking that the high-performing STEM countries aren’t doing anything that’s obviously distinctive. Some high-fliers have students self-correct their homework, employ a lot of classroom computing, and devote a lot of time to basic skills . . . and others don’t. Moreover, it’s not like high-performers (or low-performers) have been moving in any particular direction over time. Despite popular efforts to distill the secret sauce of high-achieving nations, there’s not a lot of evidence here that there is an obvious route to STEM excellence. So it goes.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.