The Commission on Assessment without Levels

I was a member of the Commission on Assessment without Levels, which met earlier this year to look at ways of supporting schools with the removal of national curriculum levels. The final report was published last week, and here are a few key points from it.

1. Assessment training is very weak

The Commission agreed with the Carter Review that teacher training and professional development in assessment were weak. It’s worth quoting the Carter Review at length on this.

“For example, our review of course materials highlighted that important concepts relating to evidence-based teaching (validity, reliability, qualitative and quantitative data, effect sizes, randomised controlled trials) appeared to be covered in only about a quarter of courses…there are significant gaps in both the capacity of schools and ITT providers in the theoretical and technical aspects of assessment. This is a great concern – particularly as reforms to assessment in schools mean that teachers have an increased role in assessment. There are also important links here with the notion of evidence-based teaching. The profession’s ability to become evidence-based is significantly limited by its knowledge and understanding of assessment – how can we effectively evaluate our own practice until we can securely assess pupil progress?”

It’s particularly frustrating that assessment training is so weak, because compared to many other aspects of teacher training this is not hard to deliver. It should be relatively straightforward to design a taught course covering the topics above, as the quick sketch below suggests.
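Take one item from the Carter Review’s list: effect sizes. Here is a minimal Python sketch of a Cohen’s d calculation – the scores are invented and nothing here comes from the report, but it shows how little machinery the concept actually requires.

```python
# A sketch only: Cohen's d for two invented sets of pupil scores.
# Nothing here comes from the report; the numbers are made up.
from statistics import mean, stdev

intervention = [62, 58, 71, 66, 74, 69, 60, 65]  # hypothetical class scores
control = [55, 61, 59, 52, 63, 57, 60, 54]

def cohens_d(group_a, group_b):
    """Standardised mean difference using a pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = ((n_a - 1) * stdev(group_a) ** 2 +
                  (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

print(f"Effect size (Cohen's d): {cohens_d(intervention, control):.2f}")
```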

2. Performance descriptors have big weaknesses. Judging pupils against ‘can-do’ statements is popular, but flawed.

“Some assessment tools rely very heavily on statements of achievement drawn from the curriculum. For example, teachers may be required to judge pupils against a series of ‘can-do’ statements. Whilst such statements appear precise and detailed, they are actually capable of being interpreted in many different ways. A statement like ‘Can compare two fractions to identify which is larger’ sounds precise, but whether pupils can do this or not depends on which fractions are selected. The Concepts in Secondary Mathematics and Science (CSMS) project investigated the achievement of a nationally representative group of secondary school pupils, and found out that when the fractions concerned were 3/7 and 5/7, around 90% of 14-year-olds answered correctly, but when more typical fractions, such as 3/4 and 4/5 were used, 75% answered correctly. However, where the fractions concerned were 5/7 and 5/9, only around 15% answered correctly.”

I’ve written about this at length here.
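To make the CSMS point concrete, here is a trivial Python sketch (my own, not the Commission’s): all three item pairs quoted above satisfy the same ‘can-do’ statement, yet the reported facility rates range from roughly 90% to roughly 15%.

```python
from fractions import Fraction

# Three items that all fit the statement 'Can compare two fractions to
# identify which is larger', with the approximate proportions of
# 14-year-olds answering correctly, as reported in the CSMS work quoted above.
items = [
    ((Fraction(3, 7), Fraction(5, 7)), 0.90),  # same denominator
    ((Fraction(3, 4), Fraction(4, 5)), 0.75),  # 'typical' fractions
    ((Fraction(5, 7), Fraction(5, 9)), 0.15),  # same numerator
]

for (a, b), facility in items:
    print(f"{a} vs {b}: larger is {max(a, b)}, ~{facility:.0%} answered correctly")
```

The statement is identical in every case; the difficulty is not, which is why the statement alone cannot define a standard.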

3. Teacher assessment is not always fairer than tests

“Standardised tests (such as those that produce a reading age) can offer very reliable and accurate information, whereas summative teacher assessment can be subject to bias.”

4. Ofsted does not expect to see any one particular assessment system.

Here’s a link to a video of Sean Harford, another member of the commission and the National Director for Schools at Ofsted, making exactly this point.

5. A national item bank could be an innovative way of providing a genuine replacement for levels.

“Some schools use online banks of questions to help with formative assessment. Such banks of questions give meaning to the statements contained in assessment criteria and allow pupils to take ownership of their learning by seeing their strengths and weaknesses and improvement over time. Some commercial packages exist with pre-set questions, particularly for maths and science. Other products allow teachers to create their own questions, thus ensuring they align perfectly with the school curriculum.

One of the flaws with national curriculum levels was the way a summative measure came to dominate formative assessment. One way the government could support formative assessment without recreating the problems of levels would be to establish a national item bank of questions based on national curriculum content. Such an item bank could be used for low-stakes assessments by teachers and would help to create a shared language around the curriculum and assessment. It could build on the best practice of schools that are already pioneering this approach. Over time, the bank could also be used to host exemplar work for different subjects and age groups.”

New Zealand appears to have something similar.
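For what it’s worth, here is a purely hypothetical Python sketch of what an entry in such an item bank might look like. The field names, example items and quiz function are my own invention, not a design from the report, but they show how questions tagged to national curriculum content could be pulled into low-stakes quizzes.

```python
# Hypothetical sketch of an item bank: questions tagged to curriculum content,
# from which a teacher could draw a short low-stakes quiz. Field names and
# example items are invented for illustration, not taken from the report.
from dataclasses import dataclass, field
from random import sample

@dataclass
class Item:
    item_id: str
    curriculum_ref: str  # the national curriculum content the item is tagged to
    question: str
    answer: str
    key_stage: int

@dataclass
class ItemBank:
    items: list[Item] = field(default_factory=list)

    def low_stakes_quiz(self, curriculum_ref: str, n: int = 5) -> list[Item]:
        """Pick up to n items tagged to one piece of curriculum content."""
        pool = [i for i in self.items if i.curriculum_ref == curriculum_ref]
        return sample(pool, min(n, len(pool)))

bank = ItemBank([
    Item("FR-001", "compare and order fractions", "Which is larger, 5/7 or 5/9?", "5/7", 3),
    Item("FR-002", "compare and order fractions", "Which is larger, 3/4 or 4/5?", "4/5", 3),
])
print([i.question for i in bank.low_stakes_quiz("compare and order fractions", n=2)])
```

A real bank would obviously need much more (difficulty data, mark schemes, exemplar work), but tagging items to curriculum content is the core of the idea quoted above.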

For more of my posts on assessment, see here.


4 thoughts on “The Commission on Assessment without Levels”

  1. Tara Houle

    As a parent, I am left rather confused by the term “assessment”. Back in the day, kids were given regular, sometimes daily quizzes in the classroom to ensure the work covered the day before was fully understood. These quizzes were given by the teacher to the class as a whole, and only took a few minutes, so the teacher could get a snapshot of each child’s understanding and learning. Regular tests were also part of the school year. In short, there seemed to be a lot less confusion about whether students actually knew/mastered/fully understood the lessons they were being taught.

    Fast forward to today’s classroom, and the subject of assessment has come up time and time again, with teacher unions bashing standardized testing, and even regular tests, and school districts promoting gibberish and very in-depth ways of providing assessment on an ongoing basis. It all seems to be a very complicated mess.

    How best to ensure our kids know what they are being taught? Why are the old ways being deemed irrelevant when regular practice, regular quizzes and regular tests are all part of the evidence behind effective instruction?

  2. dodiscimus

    I don’t have anything like the overview, or time invested, on assessment that you have, Daisy, so these are just some passing thoughts in response to your first point from the report, perhaps to chew over at some point. The rest I generally agree with, although I think your longer posts on exemplars are clearer about the need to use exemplars to define the meaning of statements, rather than getting rid of statements altogether – maybe that’s obvious.

    1. Assessment training is very weak
    The problem might be a bit more intractable than you suggest. Assessment in ITT is complicated because it encompasses techniques for checking on performance during lessons, and high-quality assessment of progress over time, and the kind of assessment involved in RCTs and other large-scale evaluations. We already cover everything in the Carter Review list during our PGCE but it’s hard to make it stick. Mostly I think this is because our trainee teachers tend to be very focused on the craft knowledge they need to improve their immediate classroom performance.

    We could ‘easily’ increase theoretical content on assessment (and by doing so, improve the quality of their M-Level assignments, probably) but what would we cut? Behaviour, planning, literacy, safe and effective practical work, trying to get people with rusty GCSE to understand enough physics to be able to teach it effectively? Most of our trainee teachers wouldn’t thank us for dumping any of this for more assessment theory. When they think of assessment they are usually thinking about how to make decisions during lessons – are they ready for independent practice or do I need to model that again? – and how to leverage their marking.

    Of course, we do want them to think about accurate assessment of progress over time, but the work they do on this is heavily constrained by the school setting. If the school still uses levelled assessment tasks (and a lot do) then a trainee teacher may not be able to set separate assessments, and probably isn’t going to want to. That sounds like a bit of a cop-out, I know, but very few of my trainee teachers would want to do anything other than adopt the assessment practices of the schools and departments they are working in, until they’ve developed further. When they can start to see the wood for the trees, the M-Level work they do includes assessment of the effect of changes to practice, so they have covered this stuff and can go back to it, but it doesn’t develop very far during the PGCE for most of them. Maybe our priorities are wrong but I’m not convinced (yet).

    Best wishes

  3. Pingback: Guide to my posts about assessment | The Wing to Heaven

  4. Pingback: Tests are inhuman – and that is what is so good about them | The Wing to Heaven
