Increasingly, MATs are choosing to align curricula and assessments across the whole trust. From the assessment perspective, there’s a lot to like about this approach:
- You get your own MAT-standardised assessments, with each MAT cohort acting as the standardisation sample.
- You can write assessments once, and share them with many schools. Writing reliable assessments is a hard job. You’re more likely to get it right if you have one or two people per subject focused on developing assessment expertise on behalf of the whole MAT.
- Everybody gets higher quality data. Teachers and students can analyse question, topic and overall test data, viewing their class in the context of the whole MAT. School and MAT leaders get summative data they can believe in, as well as granular information to feed into the curriculum design process.
However, even when a trust has an appetite for a MAT-wide assessment policy, it’s not always obvious how to implement it successfully. This blog is designed to help with that, and contains the key insights from our panel for any MAT planning their own common assessment strategy.
1. Curriculum comes first
While the conversation was focused on assessment strategy, all participants were keen to emphasise that curriculum should guide assessment, and not the other way round. In other words, you can only implement a common assessment strategy if you’ve achieved some form of curriculum alignment first.
2. Getting buy-in from schools is essential
One MAT made the point that policies put in place by a MAT can be viewed with suspicion by schools. That means you need to do the work of explaining to schools what your plan is, and what they’ll get from the process (e.g. more reliable and more insightful analysis). It’s also essential that they don’t view assessments as being imposed by the MAT for central purposes only.
Another made the point that it’s important for schools to know that common assessments are not an attempt to “stitch them up” with a difficult test. So it helps to explain the reasoning behind your strategy, such as the benefits of ensuring that schools aren’t removing certain questions or setting their own grade boundaries.
3. Creating common assessments is a journey, and it’s ok to get there in stages
Nobody involved in the discussion had jumped straight to common assessments in all subjects and year groups. One MAT is piloting common assessments in three subjects at year 9. Another started off by setting common assessments in Maths only, but across a range of year groups. So while the long-term strategy may be to get to common assessments in all subjects and year groups, it can be sensible to start small, and then learn from initial experience before broadening the strategy.
4. Life can get in the way of big plans, but there are always workarounds
Several panelists reported that they’d had to stay nimble and respond to events when implementing common assessments. One MAT had been hoping to quality assure assessments by trialling them with a small sample of students in the year above prior to the main assessment window, and using Smartgrade’s Quality Checker (designed in partnership with EBE) to check reliability and find questions that were not performing well. However, COVID got in the way of that plan, so this approach has been postponed to next year. Another MAT needed to move from paper-based assessments to online assessments at very short notice during the January lockdown. The broad point is that, when trying something as ambitious as common assessments, you need to be willing to adjust the approach in response to events.
5. There is an emerging consensus that deciles are a good approach to grading
All participants agreed that a common language was key, though a number of different approaches to the preferred gradeset were in use. One MAT is using deciles as the basis of grading. Others are using a 9–1 scale on an age-related basis, with grade boundaries reflecting the typical number of students getting each grade at GCSE. Another MAT is using percentile ranks with parents, but mapping those to GCSE grade boundaries for internal analysis.
Ultimately, there was broad enthusiasm for the decile rank approach, since it avoids the problem of a pre-existing scale (such as 9–1) having an established meaning that differs from the meaning you’re giving it in your own assessment. The importance of not confusing parents was also discussed: many grading systems are opaque to parents (and even some staff), so one benefit of a decile scale is that it is relatively easy to communicate.
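For readers curious about the mechanics, decile grading boils down to ranking every student in the trust-wide cohort and splitting the ranking into ten equal bands. The sketch below is a minimal, hypothetical illustration of that idea (the function name, scores, and tie-handling are all illustrative assumptions, not any particular MAT’s or Smartgrade’s implementation):

```python
# Hypothetical sketch: mapping raw scores to decile grades across a
# MAT-wide cohort. All names and numbers are illustrative only.

def decile_grades(scores):
    """Rank every student in the cohort and assign a decile:
    10 = top 10% of the cohort, 1 = bottom 10%."""
    n = len(scores)
    # Sort student indices by raw score, ascending. Ties simply keep
    # their sort order here; a real system would need an explicit
    # tie-breaking policy.
    order = sorted(range(n), key=lambda i: scores[i])
    grades = [0] * n
    for rank, i in enumerate(order):
        # rank runs 0..n-1; map it onto deciles 1..10.
        grades[i] = min(10, rank * 10 // n + 1)
    return grades

# Ten students' raw marks, in the order they appear on the register:
cohort = [34, 58, 71, 45, 90, 62, 50, 77, 41, 66]
print(decile_grades(cohort))  # → [1, 5, 8, 3, 10, 6, 4, 9, 2, 7]
```

Note that because each school’s students are ranked against the whole MAT cohort rather than just their own school, the deciles act as the MAT-standardised scale described above.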
6. One or two assessment windows in the year are plenty
None of the panel are doing more than two network-wide assessments a year. A number of panelists are now doing just one summative assessment each year, co-ordinated by the MAT in the summer term. One MAT also has network-wide assessments in autumn and spring, but they’re not traditional summative assessments. Instead, they’re called network diagnostic assessments, and they include things like multiple choice quizzes alongside other elements like essays or problem solving.
7. Progress analysis matters, but it’s best not to obsess about small changes
One MAT talked about how they apply a buffer of ±1 to all progress analysis. In other words, if a child goes up or down by one grade, that isn’t considered meaningful from a progress perspective; this accounts for the fact that all assessments have some margin of error when it comes to grading. Another MAT mentioned looking at the average grade across multiple assessments as an alternative to traditional progress analysis, which analyses the change in grade over time.
8. Get experts involved
Several of the panel have brought in external organisations such as Evidence Based Education and Cambridge Assessment to provide training on assessment design and theory. This can be invaluable in ensuring that those writing summative assessments understand how to create a reliable and valid assessment. One MAT also mentioned that this assessment training had paid dividends in their schools’ approach to formative assessment too.
9. Think carefully about your assessment writing process
Multiple panelists had put in place processes to write new assessments every year to ensure the assessments are blind. Typically this involves one person (e.g. a network-wide subject lead, or a chosen Head of Department) writing an assessment, then one or two other Heads of Department quality assuring the paper.
One MAT stressed the importance of trying to make sure the assessments aren’t guessable, so that they’re testing the underlying skills rather than simply covering a question in exactly the way it might have been taught.
Another stressed the importance of having all schools bought into the curriculum alignment process. Once that is sorted, it is naturally easier to get schools on board with assessments they haven’t seen in advance.
Ultimately, all panelists liked the idea of moving towards more of a “question bank” approach, where questions are reused, but in different configurations from year to year. This can reduce the workload involved in creating assessments, while ensuring that questions have been quality assured (and had a difficulty level established) through prior use.
For more information about how Smartgrade could help your MAT to move towards common assessments, please contact us at hello@smartgrade.co.uk.
Contributors: Neil Miley, Dixons Academies Trust; Amie Barr, Ark Schools; James Richardson, Castle School Education Trust; Nimish Lad, Creative Education Trust; Prof. Rob Coe, Evidence Based Education; Joshua Perry, Smartgrade