4 things we’ve learned from standardising White Rose Maths assessments

It’s a real joy to be the official standardisation partner of White Rose Education. Their resources are incredibly popular - around 85% of primary schools in England use their materials in some form - and we’re proud of the role we play in helping schools to access standardised grades and benchmarked analysis at question, topic and overall grade level for their primary maths assessments. As well as traditional standardised scores like percentile rank, we also offer performance indicators, which translate the results into a “WTS/EXS/HS” scale with reference to the most recent SATs results.

We’ve learned a lot about how their primary maths assessments work from this partnership, so we wanted to share a few of those insights.

1. Questions in the tests relate to things the students have actually been taught.

This might sound obvious, but you might be surprised how often it isn’t the case in primary! One of my big bugbears with traditional standardised assessments is the lack of curriculum alignment in maths. Many schools around the country still use non-curricular maths assessments containing questions on topics a teacher won’t have taught yet. That’s an odd thing to do with students of any age if you can avoid it. The role of a school’s end-of-term assessment is surely to evaluate how your curriculum is performing, and to do that you want to ask about the things you’d expect children to know based on the lessons you’ve actually taught.

The great thing about the White Rose Maths termly assessments we standardise is that they align perfectly with their curriculum, so you never experience that issue. Children just see the right questions at the right time in a way that neatly matches the curriculum sequence.

2. The year 1 papers are shorter and easier by design.

Another thing that people sometimes worry about with assessments for younger children is whether the experience might be intimidating. To address this, White Rose Maths make their year 1 papers easier than those for other year groups, giving children a gentle introduction to this kind of more summative assessment.

To illustrate the point, here’s the distribution of the autumn year 1 assessment:

As you can see, the peak of the distribution is to the right, with a long left tail. The assessment has only 23 questions, and the average score is 73%: 93% of children achieve 10+ marks, and the modal score is 22-23 out of 25.

Compare this with the year 5 autumn 2 assessment:

Now we’ve got double the number of marks and a much more normal distribution, with an average score of 52%. That means the assessment can sample the full ability range and give a more precise score, which feels appropriate in upper key stage 2, where we want the assessment to cover a broad range of curriculum content from both this term and the preceding years. (As an aside, you may have noticed a spike of students with a raw score of zero. That’s mainly because we assign a mark of 0 to students who are working below the level of the test, to ensure they’re included in the standardisation.)

Of course, this does mean that you need a little context to work out what “good” looks like, but happily, that’s where our standardisations come in. By giving you standardised grades on top of the raw scores, you can see percentile ranks and performance indicators that let you compare your school’s performance across year groups in a common language.
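To illustrate the percentile-rank idea, here’s a minimal sketch of a generic mid-rank calculation. This is for illustration only, not Smartgrade’s actual standardisation method, and the sample of raw marks is invented:

```python
from bisect import bisect_left, bisect_right

def percentile_rank(raw_scores, score):
    """Percentage of the sample scoring strictly below `score`,
    plus half of those scoring exactly `score` (mid-rank convention)."""
    ordered = sorted(raw_scores)
    below = bisect_left(ordered, score)
    equal = bisect_right(ordered, score) - below
    return 100 * (below + equal / 2) / len(ordered)

# Invented raw marks for ten children on a 25-mark paper
sample = [3, 8, 10, 14, 14, 17, 19, 21, 22, 23]
print(percentile_rank(sample, 14))  # → 40.0
```

The mid-rank convention (counting half of the tied scores) is one common choice; other standardisation schemes handle ties differently.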

3. Children find measurement hard!

One really neat thing about how Smartgrade works with White Rose Maths is that we can give you topic analysis based on the White Rose Maths scheme of learning, so you can go beyond the overall mark to find out where your gaps are. And looking back over the results from 2023-24, we’ve noticed that measurement consistently rates as one of the toughest topics. Across all assessments in years 2-5 in 2023-24, the average score on the measurement topic was 41% vs 56% across all topics. That’s a big gap!
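The kind of topic analysis described here boils down to grouping question-level marks by topic. A minimal sketch, with topics and marks invented for the example (not real White Rose Maths data):

```python
from collections import defaultdict

# Hypothetical per-question results: (topic, marks_awarded, marks_available)
results = [
    ("Measurement", 1, 2), ("Measurement", 0, 1),
    ("Number", 2, 2), ("Number", 1, 2),
    ("Geometry", 1, 1), ("Geometry", 1, 2),
]

awarded = defaultdict(int)
available = defaultdict(int)
for topic, got, out_of in results:
    awarded[topic] += got
    available[topic] += out_of

for topic in awarded:
    pct = 100 * awarded[topic] / available[topic]
    print(f"{topic}: {pct:.0f}%")
```

Aggregating marks (rather than averaging per-question percentages) weights each topic by the marks actually available, which is the more natural way to compare topics of different sizes.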

To be clear, this isn’t a “White Rose Maths only” thing - we see a very similar trend in our KS2 practice SATs assessments, which give similar benchmarking and standardised analysis in year 6 using SATs past papers. Looking at the last 4 practice SATs windows, measurement averaged 38% vs 48% across all topics. So we know that measurement is one of the tougher areas to master.

There’s also an interesting gender disparity that shows up in some areas of measurement. Let’s take this question from the Year 4 Spring assessment, for example:

51% of boys got this right, vs just 36% of girls. That’s a 15 percentage point difference, compared to an average difference of 4 percentage points across all questions in the same test. We had over 11,000 children in the sample for this assessment, so we can be confident that the result is statistically significant. We also saw similar gender disparities in other measurement questions. We’re not quite sure why this type of measurement question is more challenging for girls, but the trend is real!
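To see why a 15 percentage point gap in a sample this size is statistically significant, here’s a standard two-proportion z-test sketch. The even boys/girls split of the 11,000 children is an assumption made for illustration:

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test for the difference between two success rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 51% of ~5,500 boys vs 36% of ~5,500 girls (assumed even split)
z, p = two_proportion_z(0.51, 5500, 0.36, 5500)
print(f"z = {z:.1f}, p < 0.001: {p < 0.001}")
```

With samples this large, even a gap of a couple of percentage points would clear conventional significance thresholds; a 15-point gap produces a z-statistic in the double digits.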

4. Schools accessing standardised White Rose Maths assessments via Smartgrade are helping to make future iterations better!

After you’ve entered your White Rose Maths data into Smartgrade, we share it in aggregated form with the White Rose Education team. This is fully secure from a GDPR perspective, as we never share pupil-level data. It means the White Rose Maths assessment experts can see how well individual questions are performing and adjust future iterations of the assessments accordingly. And happily, we hear that a new set of assessments is being developed right now, so by signing up, you’re helping to improve the quality of their assessment offer for all schools in the future.

If you'd like to find out more about our assessments, book a 30-minute personalised demo with one of our team.
