David Weston is Chief Executive of the Teacher Development Trust. This is one of the articles in the TDT December Newsletter (sign up here).
Brace yourselves for a week of finger-pointing as it’s now widely trailed (£) that England’s performance on PISA 2012 will be disappointing. I can confidently predict that popular solutions to this will be:
- Bring back grammar schools
- Abolish remaining grammar schools
- Tighten accountability mechanisms
- Loosen accountability mechanisms
- Change the national curriculum immediately
- Leave the national curriculum the same for a long time
- Bring in new GCSEs
- Abolish GCSEs completely
- Recruit more great teachers and fire more bad ones
- Make further changes to teacher training
While changes in system structures, accountability mechanisms, curricula, examinations and initial teacher education have their place, these seem to me to be very well-trodden policy paths. Yet more frantic pulling of these levers is, I would argue, unlikely to produce the sustained improvement that we are looking for on a national basis.
The bottom line, for me, is that we need to continue to work on our school system’s capacity to improve itself. I think that the introduction of Teaching Schools is a very positive step but doesn’t fully address three key questions for school improvement:
- What is our starting point?
- What does more effective practice look like (and where can I find it)?
- How can we learn from it and develop it here?
As it stands, schools’ self-assessment is often little more than an effort to second-guess Ofsted. We need more support for this through national networks of self-audit, peer-audit and support/challenge. Every school should, I believe, be a member of such a network, one that sits outside its governance structure to prevent inward-looking, closed groupings of schools.
Such networks will also help schools to identify more effective practice, but we also need a national system to allow the pooling of recommendations on where best practice lies, along with support to help improve the evaluation of these recommendations. At the moment we have many pockets of outstanding practice of which the rest of the system remains unaware. This doesn’t necessarily require a central ‘quality mark’ – we can harness the expertise and knowledge in existing bodies such as academy chains, teaching unions, school networks, subject associations and third sector organisations, with each developing its own recommendation system and allowing schools to make their own judgements as to which they trust. Recommendations should be evidence-based, with support to ensure that we constantly raise the bar in terms of the standard of evaluation of effectiveness.
Finally, we must improve the way we transfer knowledge and practice through the system. We must move away from isolated, one-off visits and training courses toward more sustained engagement and co-development of ideas. We need, as David Hargreaves would put it, more Joint Practice Development, where teachers work alongside others to co-plan, co-observe and discuss, for example through Lesson Study. We must help schools engage in significantly more robust and objective evaluation of impact to ensure they can prioritise the most effective over the merely helpful or interesting. We must not only give teachers and school leaders greater access to research but must ensure that they are able to report their findings back to the rest of the profession and pick up patterns in larger-scale studies.
I’m delighted to say that the Teacher Development Trust is working to contribute significantly to these areas through our national database of training and support, GoodCPDGuide.com and our National Teacher Enquiry Network – a partnership of schools and colleges developing world-class, evidence-informed professional learning. Please get in touch with us if we can help you and/or collaborate with you on these important goals.
What is rarely done is to see the PISA results in context. Singapore may excel, for example, but at the cost of evening cramming schools and the consequent damage to normal, healthy child development, which requires play. Taking PISA rankings in isolation is unwise.
Finnish miracle: fata morgana?
Finnish students’ achievement (age 15) has declined significantly, according to a study from the University of Helsinki.
University of Helsinki – Faculty of Behavioral Sciences, Department of Teacher Education, Research Report No 347. Authors: Jarkko Hautamäki et al. Learning to learn at the end of basic education: Results in 2012 and changes from 2001.
Since 1996, educational effectiveness has been understood in Finland to include not only subject specific knowledge and skills but also the more general competences which are not the exclusive domain of any single subject but develop through good teaching along a student’s educational career. Many of these, including the object of the present assessment, learning to learn, have been named in the education policy documents of the European Union as key competences which each member state should provide their citizens as part of general education (EU 2006).
In spring 2012, the Helsinki University Centre for Educational Assessment implemented a nationally representative assessment of ninth grade students’ learning to learn competence. The assessment was inspired by signs of declining results in the past few years’ assessments. This decline had been observed both in the subject specific assessments of the Finnish National Board of Education, in the OECD PISA 2009 study, and in the learning to learn assessment implemented by the Centre for Educational Assessment in all comprehensive schools in Vantaa in 2010.
The results of the Vantaa study could be compared against the results of a similar assessment implemented in 2004. As the decline in students’ cognitive competence and in their learning related attitudes was especially strong in the two Vantaa studies, with only 6 years apart, a decision was made to direct the national assessment of spring 2012 to the same schools which had participated in a respective study in 2001.
The goal of the assessment was to find out whether the decline in results observed in the Helsinki region was the same for the whole country. The assessment also offered a possibility to look at the readiness of schools to implement a computer-based assessment, and how this had changed during the 11 years between the two assessments. After all, the 2001 assessment was the first in Finland where large-scale student assessment data was collected in schools using the Internet.
The main focus of the assessment was on students’ competence and their learning-related attitudes at the end of the comprehensive school education, but the assessment also relates to educational equity: to regional, between-school, and between-class differences and to the relation of students’ gender and home background to their competence and attitudes.
The assessment reached about 7,800 ninth grade students in 82 schools in 65 municipalities. Of the students, 49% were girls and 51% boys. The share of students in Swedish-speaking schools was 3.4%. As in 2001, the assessment was implemented in about half of the schools using a printed test booklet and in the other half via the Internet. The results of the 2001 and 2012 assessments were equated through IRT modelling to secure their comparability. Hence, the results can be interpreted to represent the full Finnish ninth grade population.
Girls performed better than boys in all three fields of competence measured in the assessment: reasoning, mathematical thinking, and reading comprehension. The difference was especially noticeable in reading comprehension, even though girls’ attainment in this task had declined more than boys’. Differences between the AVI-districts were small. The impact of students’ home background was, instead, obvious: the higher the education of the parents, the better the student performed in the assessment tasks. There was no difference in the impact of mother’s education on boys’ and girls’ attainment. The between-school differences were very small (explaining under 2% of the variance) while the between-class differences were relatively large (9%–20%).
The change between the year 2001 and year 2012 is significant. The level of students’ attainment has declined considerably. The difference can be compared to a decline of Finnish students’ attainment in PISA reading literacy from the 539 points of PISA 2009 to 490 points, to below the OECD average. The mean level of students’ learning-supporting attitudes still falls above the mean of the scale used in the questions but also that mean has declined from 2001.
The mean level of attitudes detrimental to learning has risen but the rise is more modest. Girls’ attainment has declined more than boys’ in three of the five tasks. There was no gender difference in the change of students’ attitudes, however. Between-school differences were unchanged but differences between classes and between individual students had grown. The change in attitudes—unlike the change in attainment—was related to students’ home background: the decline in learning-supporting attitudes and the growth in attitudes detrimental to school work were weaker the better educated the mother. Home background was not related to the change in students’ attainment, however. A decline could be discerned both among the best and the weakest students.
The results of the assessment point to a deeper, ongoing cultural change which seems to affect the young generation especially hard. Formal education seems to be losing its former power, and acceptance of the societal expectations which the school represents seems to be related more strongly than before to students’ home background. The school has to compete with students’ self-elected pastime activities, the social media, and the boundless world of information and entertainment open to all through the Internet. The school is, to a growing number of young people, just one, often critically reviewed, developmental environment among many.
The change is not a surprise, however. A similar decline in student attainment had already been registered in the other Nordic countries. It is time to concede that the signals of change have been discernible for a while and to open a national discussion regarding the state and future of the Finnish comprehensive school, which rose to international acclaim thanks to our students’ success in the PISA studies.
Why even speculate about levers when your starting point should be questioning the legitimacy and validity of what we now know to be a deeply flawed international measurement, one which is abusively gamed and exploited? Otherwise you may find yourself prescribing solutions to problems that don’t exist and further damaging the education of children.