Thanks to Michael Tidd for this helpful article.

The excitement of opening up Raise Online when the data is first published is… well, perhaps excitement is not quite the right word. Nevertheless, when the data finally arrived this term, headteachers will have been poring over it, trying to extract every last detail about last year’s performance. Doubtless governors too will get their chance to share in the scatterplots and tables, wisely guided by their professional leads.

The problem is, it’s too late for all those children, and as the stock market adverts always tell us: historic performance is not necessarily a guide to future success. Leaders and governors need to consider what has gone before, but must keep an eye on the future all the while. So while Raise can tell us something of what we achieved last year, how else do we keep everyone informed, including our governors?

One big thing that is evident from this year’s Raise summary is the clear focus on disadvantaged pupils, i.e. those eligible for pupil premium funding. Barely a page goes by without the group being separated out from the rest of the cohort and their attainment and progress listed separately. In many cases, these figures are also compared to other pupils nationally, but it’s important to note that the comparison is not with other pupil premium children but with non-PP children. That’s worth bearing in mind when looking at other data in school.

If you’re using tests in school to support your assessments, then looking at the outcomes of particular groups will help you both to identify how pupils are doing now in each year group and to look ahead to future years’ key stage 1 and 2 results. The new (and free) Rising Stars Assessment Online reporting tool will allow you to compare groups in this way when using the Progress Tests, Optional Tests, or PiRA and PUMA termly assessments – and all far more clearly than the Raise data is presented each year!

Raise also provides us with some useful questions that, while we might not want to plough through them in every year group, do make for interesting points of discussion when looking at data. Where governors are looking at areas of the school other than years 2 and 6, they might be interested to look at the available data and consider questions such as how pupils with high prior attainment are progressing in different subjects, or what the data might mean for the future use of the pupil premium strategy in school.

One further source of questions might be the graphs of groups’ progress. Alongside the progress tables are simple whisker plots that show which groups in school did better and worse than the national average, including different ethnic groups. If there are patterns in the end-of-key-stage results, how do these compare to the patterns in the in-school data? One of the advantages of using tests in other year groups is the equal opportunity all pupils get to shine. If girls in particular seem to be doing poorly in the national tests, is this reflected in the test results of other year groups – and if so, what can be done about it?

The Raise online report might not be exciting, but it is a key tool in the school leaders’ – and governors’ – kit for understanding where the school was. If we can also help it to guide our discussions of where the school is, then maybe we’re more likely to end up where we want to be.

Did you find this article helpful? Tweet us @risingstarsedu or Michael @MichaelT1979. 
