Reactions to the recently published PISA Results
The recent release of PISA data for New Zealand (top-line results here) has led to a wide range of reactions, from teacher organisations and principal groups to social commentators and government.
Some have responded by questioning the reliability of the PISA statistical methodology, while others point to the social and economic conditions that influence the results. Read in isolation, such responses may be regarded as excuse-making or political positioning. Viewed together, within the broader milieu in which our education system operates, they give us an insight into the complexity of the issues educational leaders must grapple with at every level: those leading learning in the classroom, those leading schools, and those leading the national support and policy frameworks that guide this effort.
The 2013 rankings reveal New Zealand slipping from seventh to 13th in reading, from seventh to 18th in science, and from 13th to 23rd in mathematics. No one likes to feel they're 'failing' at what they do, and the first response often appears defensive. It's ironic that when the previous results were announced, much was made of New Zealand's position in the top five countries in most areas. Now we read that 'while we have dropped in most areas, we're still in the top half of the OECD', with commentators pointing to similar drops by other countries, including Finland and Australia.
To react negatively or politically is to miss an opportunity for improvement
To respond only to the headlines, or to simply quote the 'rankings' to defend a particular ideology or political stance, is to miss the opportunity to learn from what the PISA data tells us. Over the next 12 months we'll see more in-depth reports released, based on further analysis of the data and providing more specific insights into the achievement areas of literacy, numeracy and science. The data will be able to tell us a great deal about, for example, the links between socio-economic status and achievement, or which specific areas within the mathematics curriculum (or science, or literacy) are strengths and which are weaknesses.
The other useful thing about such a rich data source is that we can begin to compare what it reveals about NZ students' achievement in these international results with what our own domestic results (NCEA, national standards, Bursary, etc.) are telling us. If the two are consistent, our response will be quite different than if we discover inconsistencies. This should also help us identify specific areas to target for interventions, such as targeted professional development or resource allocation.
The key thing will be to take sufficient time to make a considered and informed response, and to avoid any sort of 'knee-jerk' reaction (although that will be a challenge for New Zealand given that 2014 is an election year). Rather than using the results for political point-scoring, it would be great to see a genuinely cross-party and cross-sector approach to resolving what we need to do as a country, for the sake of all of our learners, now and into the future, regardless of which political party happens to be in government.
What can our schools do with the report findings as we plan for 2014?
As we plan for 2014 it's likely we'll see a number of responses at a national and policy level, including more targeted provision of professional learning and development (PLD), even before the current review of PLD is completed, with calls for a special focus on science.
Within our schools we need to encourage more in-depth engagement with the PISA findings among our staff and communities, in order to understand what an appropriate response should be. More important still, our response should be to think more critically about how we are using data more broadly, not only the PISA data. As educators, we collect data regularly from a wide variety of sources. Our challenge with all of it is to consider how we use it to inform the programmes we offer, to ensure that student needs are being met and that student achievement remains the focus. The PISA results challenge us to consider what we're doing in literacy, numeracy and science, but what about our programmes in the arts or the social sciences? Where is the data to show how we're supporting achievement in these areas, for instance?
Let’s not fall into the trap of laying blame or making excuses. Let’s use the challenge of the PISA results to become more intentionally engaged with data in our schools, in the way we collect it, interpret it, and make plans based on the evidence it provides — across all areas of the curriculum.