National Center on Education and the Economy: Can you give us an overview of the history of Australia's National Assessment and curriculum efforts and what spurred on their development and the decision to create ACARA?
Barry McGaw: Australia has been involved in international studies of student performance since they began in the 1950s; however, the first national surveys of literacy and numeracy occurred in 1976. The evaluations were sample-based only, a bit like NAEP in the United States. And then various Australian states introduced sample-based surveys in other curriculum areas.
The first assessment of all students across the country in literacy and numeracy occurred in 1990 in New South Wales, which is the largest state in Australia. It was very controversial at the time, particularly with the teachers unions. So, New South Wales began testing all of their students in literacy and numeracy, and then Victoria, the second-largest Australian state, followed, and then other states gradually joined in. Western Australia participated with a really creative set of sample-based surveys that covered their entire curriculum, but only moved to test the entire student cohort when the then-federal Minister of Education made it a condition of federal funding.
By the mid-2000s, the state Ministers of Education decided it would be good to get results expressed in a way that made them comparable across the country. They instituted a process in which each state continued administering its own assessments, but the results were calibrated onto a common scale. After a couple of years, there was some concern about how good the calibration was, and they said, if we're all testing in order to get the results on a common scale, why don't we all use the same tests? And so from 2008, the six states and two territories in Australia have all used the same literacy and numeracy tests nationwide, known as the National Assessment Program - Literacy and Numeracy (NAPLAN).
The next major event occurred in 2007. During the 2007 federal election campaign, the opposition announced that it supported the establishment of a national curriculum in English, Mathematics, Science and History. They won the election at the end of 2007. At the beginning of 2008, they set up an interim national curriculum board. I was appointed chair, with the goal of developing national curriculum in English, Science, Mathematics and History. The government added geography and languages other than English to the curriculum development plans in mid-2008, and then the arts were added later.
At this point, the intention of the federal government was to discuss with the state governments what kind of governance arrangement would be instituted in the longer term for this body. There was some debate about whether it would be set up as a not-for-profit that the ministers collectively owned or a more formal statutory authority of the Australian parliament. The federal minister won out, suggesting that the Council of Ministers (the federal minister and the six state and two territory ministers) would be the policy board for this new body. In late 2008, Australia established the new Australian Curriculum, Assessment and Reporting Authority (ACARA) with responsibility not only for national curriculum development and the national literacy and numeracy assessments, but also for the sample-based assessments already underway on a three-yearly cycle in science, civics and citizenship, and ICT literacy. Finally, the task of public reporting on schools was added to the portfolio. That work has resulted in the creation of the My School website.
NCEE: You mentioned that these national assessments were controversial with teachers unions. Can you talk a little bit about what their concerns were, and how ACARA was able to address them?
BM: The teachers unions were unhappy about the public nature of the reporting and this kind of external assessment. They typically argued that these types of assessments don't tell us anything we don't already know about our students. The response was, "you know how your students are doing in relation to one another, but you don't know how your students stand compared to students across the country." What typically wins the day in this debate is parents. Parents value information that shows them the bigger picture. When the My School website came along, suddenly parents were not just getting reports on their own children; they could see how all the students in their school, collectively, were doing in comparison with other schools. This information can be controversial, but to help on this front, we include information about the circumstances of the school.
Some other countries do what they call value-added analysis. Here is our approach. We obtain measures of students' background, in particular on their parents' education and their parents' occupation. We're not trying to measure socioeconomic status, we're trying to measure socio-educational status - what are the benefits that kids get from the occupation and education of their parents as they come to school. We use this information to create an Index of Community Socio-educational Advantage. And then for every school, we look at just 60 schools - the 30 schools immediately above it and the 30 immediately below it on that index. These are schools that have essentially the same kinds of kids. These are the schools that you can learn from, that can challenge you. Now we can show a school another school with kids exactly like theirs that is doing much better. That school can then ask itself: what policies might we consider that differ from the ones currently in use? What practices could we adopt to reproduce the kind of superior performance seen in the higher-performing schools? So that's essentially the strategy of the My School website. Not only can it assist parents in their choice of schools, but it underpins attempts at school improvement.
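The comparison-group selection McGaw describes - ranking schools on the index and taking the 30 immediately above and 30 immediately below a given school - can be sketched in a few lines. This is only an illustration of the selection logic; the school names, index values, and the function name `similar_schools` are invented, and ACARA's actual methodology involves more than this simple ordering.

```python
# Hypothetical sketch of the "statistically similar schools" comparison:
# for a given school, select the 30 schools immediately above it and the
# 30 immediately below it when all schools are ordered by their index value.
# All data here is invented for illustration.

def similar_schools(schools, target, n=30):
    """Return up to n schools on either side of `target` in index order."""
    ordered = sorted(schools, key=lambda s: s["icsea"])
    i = next(idx for idx, s in enumerate(ordered) if s["name"] == target)
    below = ordered[max(0, i - n):i]      # the n schools just below on the index
    above = ordered[i + 1:i + 1 + n]      # the n schools just above on the index
    return below + above

# Invented example: 200 schools with evenly spaced index values.
schools = [{"name": f"School {k}", "icsea": 900 + k} for k in range(200)]
peers = similar_schools(schools, "School 100")
print(len(peers))  # 60 comparison schools
```

A school near either end of the ranking gets fewer than 60 peers, which is why the slice below is clamped at zero.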
NCEE: Last year, the OECD published a report on evaluation and assessment in Australia. What did you think about the recommendations that they made for Australia's system?
BM: I think it's a good report. The big thing that we are doing now, as the report pointed out, is developing a strategy for formative assessment. But let me explain where we are first.
The final version of the national curriculum in English, Math, Science and History for kindergarten to grade 10 was adopted in October 2011 and is up on the website. It was quite a historic moment, actually. Already the curriculum is being implemented in the Australian Capital Territory, which is like Washington, DC. Queensland, South Australia and the Northern Territory started implementation in January - our school year is the calendar year - and Victoria will have a major pilot in a couple of hundred schools; New South Wales and Western Australia will start in 2013.
What we now have to clarify is the achievement standards. For example, the curriculum states that, in grade five, in mathematics, these are the things students should have an opportunity to learn. We see our curriculum as a kind of statement of student entitlement. What they should have an opportunity to learn is knowledge, understanding and skills, not just factual stuff. Then the achievement standards declare what a student who has satisfactorily learned this will be able to do. Those statements can be difficult to interpret in any kind of precise way, so what we are doing now is putting on the website actual samples of students' work, produced in response to real classroom tasks, with annotations to say that this student work meets the standards and why. This year, while the curriculum is actually being implemented, we'll be obtaining a richer set of samples illustrating different levels of achievement at the A, B, C, D and E levels. The samples of student work will be annotated, for the first time, by teachers across the country, so that we'll have nationally annotated samples of student work that can move in the direction of getting consistent use of formative assessments across the country.
NCEE: Can you expand on why it's important to have examples of student work when presenting the new curriculum to educators?
BM: You will see on the website that there are statements of achievement standards to give teachers an idea of what students can do, given the opportunity to develop the knowledge, understanding and skills set out in a particular part of the curriculum. We think that it is difficult to write such statements in a way that is unambiguous for teachers, and that it is much more helpful to also provide samples of real student work in response to real tasks created by teachers, but then assessed by a group of teachers from across the country and annotated to provide an explanation for the judgments they make.
Under the previous federal government there was a requirement introduced that all schools report student performance to parents on an A-E (or equivalent) scale. Our annotated samples of students' work will illustrate performance for each score, A to E, for each subject, each year. We collected quite a few last year from schools involved in piloting the K-10 English, Mathematics, Science and History curricula, but will collect more during 2012 as some of the states will have already begun full implementation.
NCEE: While building NAPLAN and the national curriculum, what lessons did you draw from other countries? Are there any countries in particular that you used as a model, and in what ways? What do you see as distinctive about the Australian system?
BM: NAPLAN grew out of state-based assessments of literacy and numeracy that began in New South Wales in its then Basic Skills Testing Program in 1990. The other states followed over the years. While I was in Paris at the OECD, the Ministers for Education decided that the results should all be expressed on a common scale across the country. The separate tests were equated to achieve this, but then the Ministers decided that it would be better to use common tests. NAPLAN was the result and the first NAPLAN tests were introduced in 2008. Interestingly, there was no common curriculum behind NAPLAN. The new test reflected the separate tests that it replaced.
As part of the development of the Australian Curriculum, ACARA was directed also to develop literacy and numeracy continua and then to review and revise NAPLAN as necessary to reflect those continua. We will time this change to the implementation of the new curriculum, with a revised NAPLAN probably coming in 2014.
In our curriculum, we paid attention to practices elsewhere. Our mathematics curriculum, for example, has been increased in difficulty particularly at the elementary school level on the basis of our analysis of mathematics curricula in Singapore and Finland, two countries that outperform Australia in the international comparisons offered by programs such as OECD's PISA.
NCEE: Is there anything else you would like to talk about with regard to the report, or the direction the system is going in?
BM: I'd like to say something about the curriculum itself, rather than the assessment system. When we got started, we were calling what we did the development of content standards. I found out from talking to an American journalist that we borrowed that term from you. I also learned that in the United States you couldn't talk about national or state curriculum, so you used these words. What we are doing now is saying that we are developing curriculum or the learning entitlements. We say to schools that by whatever means you teach, this is the knowledge, understanding and skills that your kids are entitled to have the opportunity to acquire. You've got to get around the constitutional arrangements in order to do the right thing. Australia has strong constitutional arrangements that say that education is the responsibility of the states, not the commonwealth, not the federal government. So how did we get there? We got there by making it a collaborative arrangement. All of this is decided not by the federal minister; all of this is decided by the six states, two territories and the one federal minister sitting at the table together.
This interview has been edited for length and to reflect the correct dates. To read the full piece visit: http://www.ncee.org/2012/01/international-reads/