
National Math Panel: Under A Microscope


Less than a year after a federal panel offered its blueprint for how to improve teaching and learning in math, a number of academic researchers have put some sharply worded critiques of that work in print.

Their reviews have been published in a special issue of the Educational Researcher, a journal of the American Educational Research Association. The AERA, a well-known, nonpartisan Washington organization, invited and published the essays, which examine the final report of the National Mathematics Advisory Panel, titled “Foundations for Success.”

The math panel was appointed in 2006 by President Bush to study effective strategies for improving student learning in math, particularly in preparing students for algebra. In sum, the panel, composed of academic scholars, cognitive psychologists, and others, called for a more streamlined pre-K through grade 8 math curriculum, with a strong emphasis on making sure that students master certain content in the early grades, particularly whole numbers, fractions, and aspects of geometry and measurement. The panel's 19 voting and 5 nonvoting members reviewed about 16,000 documents over an 18-month period. The final, 90-page report, released in March, struck a conciliatory tone with regard to the so-called "math wars," the ideological disputes about how to teach math, calling for a mix of curricular approaches and teaching styles.

Many of the essays in the AERA journal, not surprisingly, take issue with one of the more controversial aspects of the panel's work: the standards of evidence its members relied on to judge the effectiveness of math programs and curricula. The panel gave the strongest weight to scientific studies that "meet the highest methodological standards," and which have been replicated in different kinds of settings. To critics, those standards resulted in too much weight being given to a research method known as the randomized controlled trial. The panelists' reasoning (as explained in one of the AERA essays) was that holding math programs to high standards was necessary if the panel's recommendations were to have relevance in schools around the country.

One of the essays, written by Paul Cobb and Kara Jackson, criticizes the panel's "unflagging adherence" to experimental studies, which they say "adversely affects the quality and usefulness of [its] recommendations." Another essay, whose lead author is Jere Confrey, asserts that the panel applied its own standards inconsistently from math topic to topic, resulting in "serious breaches" of the panel's ability to produce a high-quality, objective report. (A few years ago, Confrey led a panel of the National Research Council, which produced a 2004 report on how to judge the effectiveness of math curricula. The NRC is an independent research entity chartered by Congress.)

Confrey and her co-authors also allege that the panel’s work is already “contributing to a marginalization of mathematics educators and to the neglect of decades of research on children’s learning of mathematics.”

Another essayist, Finbarr C. Sloane, of Arizona State University, offers a different take. He, too, questions the panel's reliance on randomized trials, but he also suggests a new "working model" for studying math education.

The panel's chair and vice chair, Larry Faulkner and Camilla Persson Benbow, respond to these critiques with their own essay defending their standards of evidence. They also seek to explain the panel's methods, and the constraints under which its members worked. They note that the panel needed to establish clear criteria for judging math research, even if definitions of what constitutes scientific evidence amount to a "moving target." Several panelists, during the group's open discussions, voiced surprise at the lack of research about what works in K-12 math education, despite the broad public worry about U.S. students' mixed performance in that subject. Faulkner and Benbow write that they hope the panel's work can direct academic research where it is most needed.

Readers of the report should see it not "as the end of an initiative," they write, but as "the first step of a more formalized process that moves from rhetorical handwringing to the framing of initiatives and the development of future research directions."

After you’ve sampled the AERA essays, I hereby solicit your own commentaries in this forum.


The tables and what readers infer from them in the National Math Advisory Panel Report are what truly matters.

A streamlined curriculum means that mandated topics need not be 100% of a math course. This is the great unwritten benefit of the report. Without stating it, NMAP agrees with the Core Knowledge (CK) method of using a core curriculum: the core hovers around 50% of a course in a CK classroom. The actual percentage of time spent on the streamlined curriculum will depend on students and their groups, which is a difficult issue to address in print. This allows enrichment and remediation to occur in real time, while staying on an acknowledged, understood national path.

The Benchmark Table is what regulates math instruction. The table implies that mastery learning be implemented within a streamlined curriculum and its assessment. It's about learning, not teaching!

The School Algebra Table, essentially California Algebra 2 without conics, adjusts the pathways of the high school math curriculum, striking a balance between rigor and realism. Algebra 2 is absolutely necessary, but high school geometry isn't! Knowing how to determine the measures of inscribed angles isn't really vital. Yes, A1-G-A2 may be the standard, and a wonderful one, but A1-A2 works.

All of the comments and responses on research methods, etc., are embarrassing. They really are beside the point. Words about "research-based" practice are merely buzz or rants of the chattering class, but a reasonable, streamlined curriculum that demands mastery learning - now that's something special. Are there any comments that the tables of the NMAP Report are unreasonable or a poor starting point? No. In short, opponents of the report need to ask themselves: are we bricks in the wall? (with thanks to Pink Floyd)

Should be "Paul Cobb," not "Paula."

Now changed to Paul Cobb, thank you.


“Without any doubt, the foundational skill of algebra is fluency in the use of symbols.”

This statement appears in the Task Force Report on Conceptual Knowledge and Skills of the National Math Panel. Notice the language used: "the" foundational skill, that is, the single most important one. Yet this statement, as strong as it is, never makes it into the actual report of the entire panel.

In the body of the report, instead, one will find this statement, "The most important foundational skill not presently developed appears to be proficiency with fractions."

If the foundational skill of algebra is the ability to work with symbols, as stated in the task force report, then success with fractions will have only a limited impact on students' ability to work with symbols.

I recall my days as a mathematics supervisor, when it was common knowledge among the math teachers that students who were good in arithmetic in the 8th grade, including fractions, would not necessarily do well in algebra. Algebra was more abstract, and it required the ability to think and work with abstract symbols.

The NMP report seems to rely upon its national survey of 743 randomly chosen high school algebra teachers, who indicated that their students had a special weakness in working with rational numbers. Apparently, the panel wanted to remedy this weakness by suggesting that more focus be given to the teaching and learning of fractions in the earlier grades.

It would have been of interest if the Panel had actually tested a group of students in Algebra 1 and correlated their grades or algebra achievement with their performance on basic skills. Is it true, as some seem to understand the Panel to be saying, that students with strong arithmetic and computational skills would also do well in algebra? What percentage of students with strong computational skills still do poorly in algebra? After all, the task force report noted that the foundational skill is the ability to work with symbols. Simply teaching fractions does not address this need.

In the same NMP survey of high school teachers noted above, the respondents cited a concern even greater than students' poor ability to work with fractions, namely "working with unmotivated students." The teachers noted that this was the single most challenging aspect of teaching Algebra 1 successfully.

Hence, one would think that the Panel would have looked into what approaches have been used, if any, that have motivated student interest in algebra. Since the goal of the Panel was to see what could be done to improve student success in algebra, one would think that the Panel would have given due attention to any programs in the elementary and middle schools that are already producing strong success in both achievement and student motivation in algebra.

At least one such program was presented to the Panel, namely, Hands-On Equations®. (Presentations to the Panel were extremely limited: a mere 5 minutes each, with no overhead projector or other technology available to display information or research results; a paper, however, could be submitted to the Panel.)

It seems to me that when the Panel was informed that more than 1,500 Making Algebra Child's Play workshops had been conducted for elementary and middle school teachers on the teaching of algebra using a visual and hands-on methodology, and that the teachers were quite enthusiastic about what they were learning, the Panel had an obligation to look into what was happening.

It seems to me that when the Panel was informed that 4th and 5th graders, including inner-city students, were successfully solving equations such as 4x + 3 = 3x + 9 using Hands-On Equations, the Panel had an obligation to look into the research, to ask for supporting data, and to interview the teachers in the study. Was it true that 85% of inner-city students were correctly solving 4x + 3 = 3x + 9 and similar equations? Shouldn't the Panel have vigorously pursued this question, either to confirm or to deny the results? And if the results were confirmed, was this not an important avenue of research for the Panel to pursue? What was it about this program that made it as successful as it was?

For the Panel to focus only on research of a particular type, such as randomized studies, if that is what was done, and to ignore work and results actually taking place in the classroom -- along with the accompanying research, consisting of pre- and post-test results showing dramatic gains in student learning -- was to miss out on a valuable line of inquiry.

In the Presidential Order establishing the Panel, the Panel was given the latitude to work or report on "other matters related to mathematics education as the Panel deems appropriate." Certainly the Panel had an obligation, it would seem to me, to look into the methods of instruction used in an Algebra 1 class and to inquire whether those methods contributed to the lack of student motivation that the Algebra 1 teachers themselves cited as the primary concern limiting their ability to teach algebra successfully.

The Panel was presented with a survey result of 751 teachers from grades 3 to 8. These teachers noted that their expectation of success in teaching algebraic concepts to their students would rise dramatically if they were to use the Hands-On Equations approach rather than the "traditional teaching methods."

Indeed, the percentage increased from 16% to 98%. That is, 16% of these teachers believed they could successfully teach 80% or more of the students in their lowest classes to solve equations such as 2x + x + x + 2 = 2x + 10 and 2(x + 4) + x = x + 16 using "traditional teaching methods," whereas 98% of these teachers believed that with Hands-On Equations they would be successful in teaching these same concepts to at least 80% of the students in their lowest classes.
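For concreteness, the sample equations quoted in these comments are all simple linear equations. A minimal sketch (in Python; this is an illustration of the arithmetic only, not part of Hands-On Equations or any program mentioned here) shows how each reduces to the form a·x + b = c·x + d and what its solution is:

```python
# Each sample equation, once like terms are collected, has the form
#   a*x + b = c*x + d
# whose solution (assuming a != c) is x = (d - b) / (a - c).

def solve_linear(a, b, c, d):
    """Solve a*x + b = c*x + d for x (requires a != c)."""
    return (d - b) / (a - c)

# 4x + 3 = 3x + 9
print(solve_linear(4, 3, 3, 9))    # x = 6.0

# 2x + x + x + 2 = 2x + 10  collects to  4x + 2 = 2x + 10
print(solve_linear(4, 2, 2, 10))   # x = 4.0

# 2(x + 4) + x = x + 16  expands to  3x + 8 = x + 16
print(solve_linear(3, 8, 1, 16))   # x = 4.0
```

The equations the 4th and 5th graders were reportedly solving thus have small whole-number solutions, which is consistent with an approach built on manipulating physical objects.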

The results of this survey, it seems to me, along with the subsequent student achievement results confirming the teachers' expectations of future success, should have alerted the Panel to the need to look into teaching methodology, and in particular the methodology of Hands-On Equations. The research results on the value of gestures in learning generally, and in learning mathematics in particular, would have further suggested this line of inquiry.

A major drawback to the work of the Panel, therefore, seems to be that it did not question, or did not question sufficiently, the methods of instruction used in an Algebra 1 class, whether better methods existed, and whether those methods could be presented to students many years earlier. It is indeed unfortunate that evidence suggesting this line of inquiry was presented to the Panel but, to all outward appearances, the inquiry was not pursued.

Borenson's response is so dead-on: my simpler comment was too simple. I teach remedial algebra to 17-year-olds, and to get past the extreme difficulty they have with fractions and mixed numbers, I rely extensively on two-line, natural-display calculators (the Casio fx-300 is easy to use). Students can solve algebra problems without really knowing fractions, but their number sense is so limited that success on tests that don't permit useful calculators, or later in college, seems problematic. Sorry, no research on that.

Borenson's point about research in teaching Algebra 1 is right, but are the two tables in the NMAP Report actually wrong? Mastery learning of some mechanical skills seems pretty important for useful success, not just for passing a course. Wouldn't it be good to adopt the streamlined curriculum suggested in the report? Is there some best that isn't the enemy of the better?

Having read the Panel's recommendations, I would agree that students need to learn basic foundational skills at an earlier age than is presently required in the mathematics curriculum. Most algebra teachers note that students need a sound understanding of fractions, but isn't it possible that even without that knowledge, students can and do understand algebraic concepts if those concepts are presented in a visual/kinesthetic way? Aren't a great many students in classrooms today visual/kinesthetic learners? Isn't the use of traditional methods and strategies in teaching algebra the reason that "Algebra for All" hasn't been as successful as anticipated (no research finding to quote)? It is unfortunate that the Panel did not, for want of time, consider actual successes in classrooms using a visual/kinesthetic approach to teaching algebraic concepts, analyze those methods, and then decide to recommend successful strategies.

The Panel is to be commended on its work. As with any panel, its recommendations can be criticized. A positive outcome of the Panel's recommendations would be that the mathematics curriculum is reviewed, analyzed, and compared to the top international mathematics curricula used in countries such as Singapore and elsewhere. Why is the United States not in the top 10 in mathematics? Is it the curriculum, the lack of mathematical training of elementary teachers, or outdated strategies for teaching algebraic concepts? Is it the belief that young children cannot learn the concepts of calculus or algebra? What about our brightest students, who are not being challenged by an outdated curriculum and by standards they have already met? Why is the US holding students back in mathematics due to outdated beliefs that they cannot learn algebra at earlier ages?

Using systems such as Hands-On Equations gives many elementary students access to successful understanding, including students who have not experienced success, bilingual students, and low-ability students (who are often the first to understand a concept presented in a visual manner). I would agree with Henry Borenson that all students can learn algebra if the strategy provides them with an opportunity to learn.

Ask teachers why students have found algebra to be a difficult subject to learn and they will tell you it is the abstractness of the subject: the use of symbols and the disconnect students feel when they use those symbols. In spite of what teachers know, we still rely on the "traditional" methods of the past, which cause students much difficulty and anxiety. What is missing is a clear understanding of the algebraic concepts that allow students to solve equations. Hands-On Equations uses a visual model, not abstract symbols, that allows students to set up equations using physical objects. The active participation and the kinesthetic moves students make to solve equations develop the core skills needed to solve Algebra 1 equations. I have used this program successfully in teaching 5th grade students, as well as in instructing elementary and secondary teachers in how to use visual/kinesthetic methods in the classroom. I strongly urge the panel to talk with teachers who have used the program, review the research data on its effectiveness with elementary and middle school students, and consider recommending it as a bridge program that helps students successfully acquire the skills to use the symbols and algorithms of traditional algebra.

There is much concern regarding the determination of "what works" in teaching practices. The problem with this question is that the answer is so complex. What works for some students is different from what works for others. Research with different groups of students can show us that some practices are more effective than others; the challenge, however, is to arm teachers with a toolbox of strategies so that they can select appropriate strategies for their particular group of students. The other aspect of this question that makes it so complex is the level of teacher understanding of the content. If we don't address the gaps in teacher understanding, even the best materials and strategies will be insufficient to fill the gaps in student understanding.


The National Research Council (NRC) and its parent organization, the National Academy of Sciences (NAS), are not part of the federal government.

Therefore, the panel chaired by Jere Confrey for the NRC was not a "federal panel" as stated.

The NRC is an independent, congressionally chartered entity that does not receive direct federal appropriations. While individual research projects are, by the NRC's own description, funded by federal agencies, they are also paid for by other government and private sources. A description can be found here:

So, for this particular item, I will change the original "federal" reference, thanks.

