Wednesday 8 July 2015

Assessment without Levels - A Practical Approach


In the third post in my series about developing an approach to assessment without levels, I will explain in practical terms our assessment methodology and how it informs the monitoring of standards.

In developing our approach there were two elements to consider. First, how would we find out what the children had learned, how well they had learned it and what remained for them to learn from the current year's attainment targets? Second, having compiled what was essentially a list of skills, how could we use this information to monitor standards and identify systemic weaknesses that required strategic intervention?

The former of the two was by far the easier to solve. Prior to the new arrangements the school had invested in the 'Assertive Mentoring' suite of assessment tools. At that time one of the key challenges facing our school was that assessment was not being used to inform teaching. Pupils were taught something because it was the next item on the scheme of work, not because it was the next thing they were ready to learn. The zone of proximal development was rarely, if ever, found and standards were poor as a result. The school required a way of clearly identifying what had been learned and what came next, and the Assertive Mentoring system delivered this. Often, buying in an assessment system leaves one with a series of test papers, generally six per year. The children sit the papers, they receive a score and this is equated to a 'grade' or 'level'. Assertive Mentoring was a suite of assessment tools that included tests, but also a wide range of other assessment devices, with some very useful grids in which what pupils had learned was visually represented and unachieved attainment targets were highlighted very powerfully. Although the system could be used to generate a level, the focus was more on highlighting what was missing than on identifying the point at which the pupil was.

When the new assessment arrangements were launched we felt that the Assertive Mentoring system fitted very firmly into the 'qualitative over quantitative' principle I discussed in the previous article, and so we purchased the updated system for the 2014 curriculum and continued to use it as we had done before. Using the range of devices available we quickly had a large amount of reliable information about what individual children had learned.

In many ways the idealist in me wanted to leave things there. Teachers had rich information about what their pupils had learned and what they needed to learn next. All they needed to do was to feed that into their planning and deliver well-pitched lessons firmly within the zone of proximal development, and children would achieve well. The realist in me knew that it was my responsibility to monitor and improve standards, and I could not do that without some kind of summary data. I needed to be able to answer questions like: who is the teaching working for, and who is it not working for? What are we doing in maths, for example, that works so well that we aren't doing in reading? What expectations should we set ourselves, and how are we doing in achieving them?

In our early assessment model prototypes we were seduced into adopting the emerging language of existing assessment providers, setting achievement milestones and naming them things like 'beginning', 'working within' and 'secure'. Other words like 'mastery' floated about, but it all amounted to a rebranding of what we had before. Any summary data became about where the pupil is, at what point they are in their learning; essentially, what level they are.

Instead we returned to the beginning and asked ourselves: what are we trying to achieve here? The answer of course lies in our statutory responsibility to ensure pupils leave us in year 6 with a secure understanding of the content of the primary national curriculum. The question therefore became: given what this pupil has achieved, are they likely to leave us in year 6 with a secure knowledge of the primary curriculum?

This realisation led us to an important point about assessment. Pupils will only leave us in year 6 with a secure knowledge of the primary curriculum if they leave year 5 with a secure knowledge of the year 5 curriculum, and so on down the years, right back to the start of their education. All of the assessment materials that we examined at this early stage allowed pupils to be 'graded' as secure in a year group having achieved less than 100% of the learning. Our concern was that over time this would accumulate. Even if a child fell short by only a small amount each year, this would add up year on year to create a much larger gap in year 6. The pupil would not have a secure knowledge of the content of the curriculum. This has traditionally been characterised by the cramming that often happens in year 6 before the SATs, as teachers try to back-teach content that was missed in previous years, even with children who have apparently progressed as expected. We characterised this phenomenon as 'The Secure Gap Paradox'.
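The arithmetic behind the paradox is easy to sketch. Suppose, purely for illustration (both numbers are assumptions, not figures from our scheme), that each year's curriculum contains 40 objectives and a pupil graded 'secure' has in fact secured only 90% of them:

```python
# A minimal illustration of the 'Secure Gap Paradox': a pupil judged
# 'secure' on 90% of each year's objectives still accumulates a
# substantial body of unlearned content by the end of year 6.
OBJECTIVES_PER_YEAR = 40   # hypothetical number of objectives per year group
SECURED_FRACTION = 0.9     # pupil is graded 'secure' at 90%

# Objectives that slip through each year despite a 'secure' grading
missed_per_year = OBJECTIVES_PER_YEAR - round(OBJECTIVES_PER_YEAR * SECURED_FRACTION)

# Accumulated over years 1 to 6
total_missed = missed_per_year * 6

print(missed_per_year)  # 4 objectives missed each year
print(total_missed)     # 24 objectives never secured by the end of year 6
```

A small shortfall each year quietly compounds into a large one, which is exactly the gap year 6 teachers end up trying to close before the SATs.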

We identified that each year's expectations were critical in their own right and that pupils needed to learn all the content before they moved on. It was not useful to view a pupil's overall journey from Y1 to Y6. What was required was teaching that secured the content for all children every year. We decided that all our summary data would relate to a child's likelihood of achieving each year's attainment targets. Children who looked unlikely to do so could be targeted for additional support and intervention quickly and specifically. Although this has long happened for the most and least able, with this mindset we were able to reclaim what are sometimes referred to as the children lost in the middle. One of our leaders characterised this new mindset as all children stepping forward together in a line. I have heard others say 'stop trying to close the gaps and focus on preventing them from appearing in the first place'. It all amounts to the same thing. Pupils can and must achieve each year's objectives, and they will do so if the teacher teaches them in a way that allows them to do so. This has thrown up some real challenges for us, which I will examine in a future post, but we adopted the principle.

In order to make this analysis we required two pieces of information: how much of this year's learning has the pupil secured (attainment), and, at the current rate of learning, are they likely to achieve the expected standards by the end of the school year (progress)? We assigned descriptors to describe different amounts of learning secured: beginning, beginning plus, working within, working within plus, secure and secure plus. We used these principally because it allowed us to continue to use Target Tracker, but the point is subtle yet critical. It is not important THAT a pupil is a W, but WHEN a pupil is a W. We set expectations that pupils would be a B+ by the end of Autumn 2, a W+ by the end of Spring 2 and an S+ by the end of summer. To secure an S+, pupils would need to demonstrate secure understanding of 90% of that year's learning. Yes, 90%. We caved in to ourselves because 100% seemed simply too daunting. It's a decision I regret, because it opens us up to the Secure Gap Paradox and, actually, sold our teachers short. As we come to the end of this year and review the last data drop, it seems that a great number of teachers have delivered 100% of the learning for many of their pupils. We may well review this next year.
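The descriptor scheme can be sketched as a simple lookup. Only the 90% threshold for S+ reflects our actual scheme; the intermediate cut-offs below, and the function names, are illustrative assumptions:

```python
# Mapping the proportion of the year's objectives a pupil has secured
# to one of the six descriptors. The 0.90 cut-off for S+ is as described
# in the text; the other cut-offs are hypothetical placeholders.
DESCRIPTORS = [
    (0.90, "S+"), (0.75, "S"), (0.60, "W+"),
    (0.45, "W"), (0.30, "B+"), (0.0, "B"),
]

def descriptor(fraction_secured):
    """Return the descriptor for a given fraction of objectives secured."""
    for cutoff, label in DESCRIPTORS:
        if fraction_secured >= cutoff:
            return label
    return "B"

# Expected descriptor at each assessment point (as set out above)
EXPECTED = {"Autumn 2": "B+", "Spring 2": "W+", "Summer 2": "S+"}

def on_track(fraction_secured, term):
    """True if the pupil has reached at least the expected descriptor."""
    order = ["B", "B+", "W", "W+", "S", "S+"]
    return order.index(descriptor(fraction_secured)) >= order.index(EXPECTED[term])

print(descriptor(0.92))           # S+
print(on_track(0.50, "Spring 2")) # False: only a W, where a W+ is expected
```

The important design point, as above, is that the descriptor alone tells you little; it is the descriptor at a given point in the year that signals whether the teaching is on course.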

The next thing we adopted was the Target Tracker progress points scale, which gives one point of progress for each descriptor. Ostensibly this reeks of APS and could be used to drive a similar assessment model to the one we had before. The difference is again subtle but critical. Children should achieve 6 points per year: no fewer and no more. Fewer indicates a failing in the teaching, and more demonstrates moving into the next year's objectives at the expense of securing and deepening learning. The importance of this tool is that we can analyse the rate of learning. If a pupil makes only 1 point in a term, then the next term they must make not 2 but 3 points, to make up for what was lost previously and continue to secure the content required by the end of the year.
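The catch-up arithmetic can be sketched like this. The 6-points-a-year scale is as described above; the helper function itself is purely illustrative:

```python
# Catch-up arithmetic for a 6-points-per-year progress scale:
# every pupil should make exactly 6 points per year (2 per term), so the
# target for the coming term is whatever restores the running total.
POINTS_PER_YEAR = 6
TERMS = 3

def points_needed(points_so_far, terms_completed):
    """Points required next term to stay on course for 6 by year end."""
    on_track_total = POINTS_PER_YEAR * (terms_completed + 1) // TERMS
    return on_track_total - points_so_far

# A pupil who made only 1 point in the autumn term must make 3 in the
# spring, not 2, to recover the lost ground.
print(points_needed(1, 1))  # 3
print(points_needed(2, 1))  # 2 -- on track
```

The value of framing it this way is that the question is always about the rate of learning from here, not about where the pupil happens to sit on a scale.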

This approach has been very successful for us, and early indications from the end-of-year data look very promising. There remain significant challenges, which I will examine in the next post, but we are confident that the current year 5 children will be able to demonstrate a secure understanding of the primary curriculum at next year's external examinations, and for those we are concerned about, we have a detailed understanding of their learning that will allow us to put the work in to ensure that they do.

Tuesday 5 May 2015

Understanding Assessment without Levels


In the first of my series about developing a new assessment system in our two-form-entry primary school in suburban south London, I will try to shed light on how we arrived at the place in which we now find ourselves. I will share the information, discussions, interpretations and decisions that lay behind the assessment system that we now have in school. As I have already said in the introduction piece to this series, not for one moment do I claim that we have cracked this particular nut, that our system is correct or ideal, or that it should necessarily be adopted by any other school. I share it because I hope that it will be useful to others who share the task of responding to this policy. I also hope to attract the thoughts of others in order that we can continue to refine our ideas and practices, challenge our assumptions and drive ourselves to develop a world-class approach to assessment. I deeply believe in the power of collaboration and the potential that can be fulfilled when educators seize the initiative rather than waiting for the slow death of 'policy paralysis' imposed by external organisations with their own agendas.

My school's basic philosophy and approach to assessing without levels is rooted in the conversations I had in late 2013 with Tim Oates, the chair of the expert panel that reviewed the National Curriculum and by extension the UK's approach to assessment. In a quirk of fate, Tim and I had a mutual former colleague and he kindly offered to come and speak to our staff about the thinking behind the new National Curriculum and the logic of dropping levels as a model of assessment. When Tim came in January 2014, his presentation was essentially a version of the content of this video.



We drew six key lessons from Tim's presentation that informed our subsequent approach to assessment.

1) Learning in the new National Curriculum is arranged so that it allows pupils to work at an appropriate pace to secure the key concepts, skills and understanding of the subjects. This key learning is made up of clear and progressive statements that build year on year through the child's education.

2) Levels are dysfunctional. They mean different things at different times to different people. They create, in the mind of both pupil and teacher, the idea that success is based on the reaching of a particular point rather than on a strong, qualitative, evidence-based judgement of whether a pupil properly and securely understands the key concepts required by the National Curriculum.

3) The new policy approaches are now predicated on what is often described as a 'growth mindset' model of attainment. That is to say, the assumption is that all pupils are capable of all things when the material is presented to them in the right way and they apply an appropriate amount of effort.

4) Assessment should focus on whether a child has understood a particular thing, an idea, skill or body of knowledge, rather than on whether or not they have reached a particular point, i.e. a level.

5) In order to ascertain that a pupil has really understood a particular thing, teachers need to become experts in probing a pupil's understanding and providing opportunities for children to express their understanding.  Classroom activity needs to generate a volume of evidence to support judgements of understanding. This will come from better and better questioning and an increase in the amount of useful assessment that takes place in the classroom. 

6) Undue emphasis on progress promotes the practice of moving a child on with insecure understanding and this prejudices their future education.

Our interpretation of this was made up of two principles that would go on to inform the model of assessment that we adopted. The first was that any assessment system we developed should be first and foremost qualitative rather than quantitative. In recent years we have become used to all manner of statistical metrics for looking at school performance. Chief among them was 'average points score' (APS), which was used to look at both average attainment and progress for all manner of groups in the school. We see no place for these types of metrics in the kind of assessment environment that Tim describes. What becomes so important is the rich and voluminous evidence that informs the robust assertion that a pupil has or has not yet learned a particular thing to a secure enough standard for that point in their learning. We wanted a system that could answer questions that APS never could. What have the pupils learned? What have they not learned? Why haven't they learned it? How sure are we that they have learned it?

The second principle was that whatever system we developed would place emphasis on depth and security of understanding, and that progress needed to be thought about in a different way. The requirement in primary schools was to secure a 2b in year 2 and then two levels of progress before the end of KS2. Schools were judged on their ability to achieve the two levels of progress and praised if they could deliver three or even more. The environment that Tim describes is fundamentally one of attainment. The pupils need to acquire a certain set of skills, understandings and knowledge. The measure of progress should be thought of as a way of keeping an eye on the overall trajectory of the pupil or group. That is to say: at their current rate of learning, are they likely to achieve the required body of knowledge or not, and if not, why not and what can be done about it? This stands in contrast to an ever-increasing demand for more and more content to be taught in order to reach the higher levels required for three levels of progress. A cursory conversation with any secondary colleague soon revealed that the learning that had been drilled into pupils in advance of the SATs was fragile to say the least. Very few secondary schools have much regard for the levels that come up from KS2, because they rarely reflect what the pupils really can or cannot yet do. The learning was not secured.

Our aim therefore was to develop a qualitative assessment system that could provide us with reliable information about what a pupil could do and what remained to be addressed. It was to be predicated on the idea that depth and security, i.e. attainment, was the key measure of success, and that progress would be a tool for mapping trajectory over time. In my next post I will attempt to show what we adopted and developed, and discuss the challenges we faced in doing so.

Friday 1 May 2015

Developing A New Assessment System




It was no surprise to me, in October 2014, when my head teacher placed at the top of my new performance management form the requirement to develop and implement a way of assessing pupils' attainment and progress without levels. Since then, it feels like I have worked on little else. I have attended dozens of local authority meetings, meetings with cluster schools, conferences and briefings. I have read hundreds of pages and watched many YouTube clips. I was even persuaded, against my better judgement, to speak at a conference myself. Working closely with our amazing assistant head teacher, we have wrestled with the minimal guidance from the DfE, the titbits and rumours from inspectors speaking at events, hearsay and our own idea of what assessment should and could be.

By no means have we finished this work. There are still so many unknowns, so many problems to solve. I have decided to write a series of posts that will share what we are doing and why we are doing it. What this won't be is an analysis of the virtues, or otherwise, of this approach. It won't be a moan or a critique. That is not to say those discussions don't have a place but this will be for those of us charged with delivering what will soon be a statutory change. I would warmly welcome contributions from anyone involved in this process who wants to contribute to this discussion in this spirit.

I anticipate this series of posts being made up of four parts:

1) Our interpretation of assessment without levels
2) Our approach to assessment without levels
3) Sharing information with stakeholders
4) Ongoing challenges

I will aim to post once a week for the next three weeks, with the first post to come this bank holiday weekend!