Most state primary schools in England set practice SATs assessments for their students during year 6. Of course, part of the reason for this is to offer some test prep experience to students, but there’s also a tonne of useful insight that can be gleaned from these assessments if they’re analysed in the right way. So to help you make the most of your practice SATs, here are five things we’ve learnt since we started offering practice SATs analysis in Smartgrade.
1. Before you start looking at your results, think about what you want to do with the analysis
We’re all busy, which means that too often we block off time to analyse our data and dive in without a clear idea of what we’re looking for. This is risky: you can play around with the data for a happy hour and get to the end of it without having learned anything that will meaningfully inform your decisions about what to do next.
So we always advise people looking at SATs (or any) data to ask themselves: “what could I do differently based on what this data tells me?” Here are some good potential answers to that question:
- Reteach something. If I find out my class struggled with fractions, I can work out the best way of reteaching the specific concepts they’re struggling with. This is always a crucial element of an internal school assessment — you need to be willing to give over time to reteaching concepts that students have struggled to grasp, or your analytical gains will be limited.
- Reconsider your curriculum. A longer-term question you may want to ask with reference to SATs papers is: how well is your curriculum performing? This can be in specific areas (e.g. the fractions example given above), but it could also be a broader question such as “am I allocating enough time to reading fluency?”
- Allocate resources differently. If you’re a leader in a school or MAT you may be looking at your data to consider whether you need to juggle around which teachers are in which year groups, or whether you need to buy in additional resources (curriculum, software, tutoring and so on) to support your school in certain areas.
2. Scores well below the “May actuals” are normal in practice SATs
In the official May SATs, the average scaled score for Maths was 104 in both 2022 and 2023. Contrast this with our practice SATs cohort in 2022/23 (which includes a broadly representative sample of schools), where we saw an average scaled score of 101 in Spring 2 (based on the 2022 paper) and just 94 in Autumn 1 (based on the 2017 paper). A similar trend (albeit with a slightly higher starting point) is also evident in Reading and GPS.
You can see from these results why we think it’s so important to look at the percentile rank of your students, and the implied performance indicator arising from that percentile rank as provided by Smartgrade, alongside the scaled score conversion! The 2022/23 practice SATs Maths Autumn 1 average of 94 is equivalent to the bottom decile of performers in most years, so if you’re only looking at scaled score conversions you can give yourself an unwarranted fright.
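If you want a feel for how a percentile rank cuts through this, here’s a minimal sketch in Python. The cohort scores are invented for illustration; real percentile ranks in Smartgrade come from our national practice sample:

```python
# Hypothetical illustration of percentile rank versus raw scaled score.
# These cohort scores are made up; real norms come from the national sample.
cohort = [82, 88, 90, 92, 94, 94, 96, 99, 101, 104]  # autumn practice scaled scores

def percentile_rank(score, sample):
    """Percentage of the sample scoring at or below `score`."""
    return 100 * sum(s <= score for s in sample) / len(sample)

# A scaled score of 94 looks alarming against the May actual average of 104,
# but sits mid-pack in this (illustrative) autumn practice cohort.
print(percentile_rank(94, cohort))  # -> 60.0
```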
3. Aim to filter out analysis of content you haven’t taught yet
Of course, the trend outlined in point 2 above shouldn’t be surprising; after all, in the autumn term students haven’t yet been taught the whole curriculum. So where content hasn’t been taught yet (for example a specific area of maths), our steer is not to dwell on analysing it. Smartgrade makes this easy for you: in our maths topic analysis we tag questions by the year in which they’re taught in the national curriculum, so in Autumn 1 you can focus your analytical attention on identifying gaps in learning from content taught in years 3, 4 and 5, for example.
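If you ever work with a question-level export yourself, the same filtering idea is easy to sketch. Here’s a minimal, hypothetical example in Python/pandas; the column names and figures are illustrative, not Smartgrade’s actual export format:

```python
import pandas as pd

# Hypothetical question-level data: one row per question, tagged with the
# national curriculum year in which its content is taught.
questions = pd.DataFrame({
    "question":       ["Q1", "Q2", "Q3", "Q4", "Q5"],
    "topic":          ["NPV", "Fractions", "Geometry", "Algebra", "NPV"],
    "nc_year_taught": [3, 4, 5, 6, 6],
    "class_pct":      [72, 48, 55, 20, 31],
})

# In Autumn 1, year 6 content is largely untaught, so restrict the gap
# analysis to content from years 3-5.
taught_so_far = questions[questions["nc_year_taught"] <= 5]
print(taught_so_far.sort_values("class_pct"))
```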
4. Don’t assume you’re struggling in something until you know the national average
Let’s imagine you’re a school that took a practice Maths assessment in Spring 2 of 2023, and on average your students scored 40% on geometry and 40% on number & place value (NPV). Which is the bigger issue?
Well, look, 40% is probably never a number to make you whoop with pride in a national assessment, but a really important piece of context here is that in our national sample at that point in time, the geometry average was just 26%, whereas the NPV average was 62%. What’s more, geometry accounted for just 3 marks out of 110, whereas NPV contributed 14 marks. So your school’s performance in geometry was actually considerably above average, and anyway, changes here will not massively affect your overall SATs score (not that SATs performance is the only reason to teach something thoroughly, of course!). In contrast, your NPV score is significantly under the average, and given the much higher contribution of the topic to the final mark, you may well choose to make reteaching here a priority.
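To make the arithmetic behind that prioritisation concrete, here’s a back-of-envelope sketch using the figures quoted above. It’s a rough heuristic, not how Smartgrade itself weights topics:

```python
# Back-of-envelope reteaching prioritisation, using the figures quoted
# above (school %, national %, and marks available out of 110).
topics = {
    "geometry": (40, 26, 3),
    "npv":      (40, 62, 14),
}

for topic, (school, national, marks) in topics.items():
    gap = school - national  # positive = above the national average
    # Rough marks on the whole paper at stake if the gap were closed:
    marks_at_stake = max(national - school, 0) / 100 * marks
    print(f"{topic}: gap vs national = {gap:+d}pp, "
          f"marks potentially recoverable ≈ {marks_at_stake:.1f} of 110")
```

Run it and geometry comes out at 0 marks at stake (you’re already above average), while NPV offers roughly 3 marks, which is why it’s the better reteaching candidate.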
In Reading there’s a similar trend to be aware of: students consistently score lower on questions requiring them to make inferences than on questions to do with explaining meaning or retrieving information. So again, it’s important to look at the national picture before extrapolating your own performance. Happily, in Smartgrade we make it possible for you to see all these comparison numbers for yourself from the topic analysis screen, so you can avoid this pitfall.
5. Don’t overdo the group analysis
Smartgrade has revealed some really fascinating nuggets of information about national performance over the years. Take one question from our 22/23 Spring 2 practice SATs: in our sample of over 5,000 students, girls averaged 50% on the question, compared with 67% for boys, a whopping (and statistically significant) 17 percentage points’ difference!
And the question isn’t an anomaly — in our work with White Rose Education we’ve similarly found that measurement is an area where girls underperform boys in the primary phase. This is super-interesting stuff and I’d love to hear from any mathematicians reading this about why it might be so!
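If you’d like to sanity-check that “statistically significant” claim, here’s a rough two-proportion z-test. The even gender split and the treatment of the averages as single-mark proportions are illustrative assumptions, not details of the sample:

```python
from math import sqrt
from statistics import NormalDist

# Rough two-proportion z-test for the gap quoted above. Assumptions (ours,
# not details of the sample): a roughly even gender split of the 5,000+
# students, and a one-mark question so the averages behave as proportions.
n_girls, n_boys = 2500, 2500
p_girls, p_boys = 0.50, 0.67

p_pooled = (p_girls * n_girls + p_boys * n_boys) / (n_girls + n_boys)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / n_girls + 1 / n_boys))
z = (p_boys - p_girls) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"z ≈ {z:.1f}, two-sided p ≈ {p_value:.1g}")  # z ≈ 12: far beyond chance
```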
But our takeaway here is NOT that schools should do lots of gender gap analysis. There are some forms of group analysis that can be helpful — EAL analysis on a reading paper, for example — but gender group analysis probably won’t tell you a tonne if you’re just looking at one class. For a start, in a class of 30 each student is over 3% of your sample, so one or two students under- or overperforming will have a huge effect on the results. But also, will it really change the way you teach once you’ve completed your analysis? In most cases, even if you do find differing performance for a given gender, I doubt you’ll be searching for teaching strategies that play to a heteronormative view of that gender to rectify it! So to come back to where we started: focus on the analysis that can help you improve outcomes by informing your decision-making.
For more information about practice SATs at Smartgrade, visit our website or book in a demo using this link. You can also email us with any questions or to request a quote at sales@smartgrade.co.uk.