Assessment

Girls Outperform Boys on First NAEP Technology, Engineering Test

By Jaclyn Zubrzycki — May 17, 2016

On the first national assessment of technology and engineering skills, 8th grade girls scored higher on average than 8th grade boys.

And suburban and rural students significantly outscored their peers in cities.

But those gaps were significantly smaller than the familiar ones that crop up on national assessments in other subjects: between wealthier students and their less-affluent peers, and among racial and ethnic groups.

The technology and engineering results come from the first new test in a decade from the National Assessment of Educational Progress, also known as NAEP or the “nation’s report card.” The Technology and Engineering Literacy exam, or TEL, was piloted in 2013 and administered to 21,500 students in approximately 840 public and private schools around the country in 2014.

The test was designed to give educators and policymakers a glimpse at students’ skills in what Bill Bushaw, the executive director of the National Assessment Governing Board, which sets policy for the NAEP, referred to as the “T and E in STEM.”

“This will establish an important baseline from which to grow our knowledge about what students know and can do in this area,” Bushaw said.

In addition to gauging students’ problem-solving skills, the new test also surveyed students about their in- and out-of-school experiences with technology, engineering, and problem-solving.

The test is a harbinger of elements to come in future NAEP exams. Rather than answering a series of multiple-choice questions or writing essays, the 8th grade test-takers worked through a series of virtual scenarios aimed at testing their problem-solving abilities and their ability to use information about technology and engineering to develop solutions. Officials said similar scenarios would likely make appearances on future NAEP tests, starting with history and social studies. The test was also the NAEP’s first entirely digital test; NAEP plans to administer all its tests digitally by 2017.

“It’s a very different kind of test,” Bushaw said.

Scores and Insights From Home and School

Overall, 43 percent of the tested 8th graders scored proficient or above on the test. Since this is the first year the test was administered, these results will serve as a benchmark for future administrations. (The next is slated for 2018 and will also be given only to 8th graders.)

Forty-five percent of female students and 42 percent of male students scored proficient or advanced. Female students scored particularly well on portions of the test focused on information and communication technology, but also outperformed or matched male peers on questions about design and systems and about technology and society.

White female students scored the highest of any group. White and black female students significantly outperformed male peers within their racial groups, while the gaps between genders were smaller for Asian and Hispanic students.

The breakdown by geographic area also yielded striking differences: Suburban students scored a full 10 scale-score points higher on average than students in cities (154 out of 300, compared to 144 out of 300). Rural students scored closer to their suburban peers.

On a call with members of the press on Monday, Peggy Carr, the acting commissioner of the National Center for Education Statistics, which administers NAEP, noted that many rural students said they had experience with problem-solving at home or outside of school.

But gaps between wealthier students and students eligible for free or reduced-price lunch outstripped those differences: Fifty-nine percent of more-affluent students scored proficient or above, compared to 25 percent of students who were eligible for free or reduced-price lunch. The more-affluent students scored 28 points higher on average than lower-income students on the test (163 points out of 300, compared to 135). And while Asian and white students scored 160 points on average, the average Latino student scored 138 points and the average black student, 128 points.

In a press release, Carr said the 3 percentage point gap between girls and boys made a strong case that “girls are better able to understand and evaluate technology and then use it to solve problems and achieve goals.”

Carr said that the NCES and NAGB noted the gender gap in particular because it ran counter to conventional wisdom about the technology skills of girls and women. She said that while the test required a significant amount of reading, there was no evidence that the gap in scores was due to girls’ reading ability. “It was slightly unexpected and noteworthy,” she said of the results for girls.

In and Out of School

The governing board’s Bushaw said that when considering gaps between other groups of students, it was worth considering “gaps of opportunity.”

According to the results released today, 48 percent of students were not taking any technology, computer, engineering, or industrial technology course. But about two-thirds said they had studied the topics in science class.

Many students reported out-of-school experiences in problem-solving: Some 87 percent of students reported that they had at some point figured out what was wrong with something in order to fix it.

And two-thirds of students said that what they knew about problem solving and fixing things had been taught by family members, not at school.

Students who expressed confidence in their problem-solving and engineering abilities scored better on the test.

The report from NCES noted that students who had regular out-of-school experiences with learning about technology had higher scores overall.

“There are opportunities to learn both in school and outside of school,” Bushaw said.

Scenario-Based Tasks

The National Assessment Governing Board spent more than two years developing a framework for what kinds of skills and abilities should be tested in a technology and engineering test. The framework focused on three main topics: technology and society; design and systems; and information and communication technology.

The idea was that the NAEP would gauge students’ problem-solving skills rather than their background knowledge.

As they take the test, students work through multistep scenarios that range from creating a historically accurate museum exhibit about a drought to developing safe bike lanes in a city. Students are provided with background knowledge about the topics before they are asked to answer questions about them: One of the scenarios included a background video about iguanas before students were asked to design an ideal iguana habitat.

Carr said that the organization avoided some of the technical difficulties that have plagued other online and digital tests by bringing its own Internet and devices into schools.

Students’ progress on each of the scenarios’ steps is scored. For instance, on a task related to designing a bike lane, 76 percent of students successfully identified components of a safe bike lane, the first step; 64 percent were able to identify potential adjustments to a sample set of bike lanes to make them safer by, for instance, expanding the lanes; and 45 percent were able to successfully redesign the route using an interactive tool. But a smaller portion, 11 percent, could explain the rationale behind the route that they chose.

Carr said that the scenarios would provide testmakers and educators with information about students’ approaches to problem-solving. “We have data we haven’t even analyzed yet” about students’ approach to the scenarios, Carr said.

Carr said that NAEP plans to introduce similar scenario-based tasks on its other exams in the near future. She said the first is likely to be in social studies or history.



A version of this news article first appeared in the Curriculum Matters blog.