First-year evaluation results show promise

Feedback is central to teaching and learning at Rising Academies. Students and teachers learn to give and receive feedback using techniques like Two Stars and a Wish or What Went Well...Even Better If. The Rising Academy Creed reminds us that "Our first draft is never our final draft." Given that, it would be pretty strange if Rising as an organisation didn't also embrace feedback on how well we are doing at enabling more children to access quality learning.

That's why, even as a new organisation, we've made rigorous, transparent monitoring and evaluation a priority from the outset. Internally, we've invested in our assessment systems and data. But my focus here is on external evaluation, because I'm excited to report that we have just received the first annual report from our external evaluators. If you want to understand the background to the study and our reactions to the first annual report, read on. If you're impatient and want to jump straight into the report itself, it's here.

Background

Last year, we commissioned a team led by Dr David Johnson from the Department of Education at Oxford University to conduct an independent impact evaluation of our schools in Sierra Leone.

The evaluation covers three academic years:

  • The abridged School Year 2016 (January-July)
  • School Year 2016-17 (September-July)
  • School Year 2017-18 (September-July)

The evaluation will track a sample of Rising students over those three years, comparing their progress to that of a comparison group of students drawn from both other private schools and government schools.

The overall evaluation will be based on a range of outcome measures, including standardised tests of reading and maths, a measure of writing skills, and a mixed-methods analysis of students' academic self-confidence and other learning dispositions.

The evaluation is based on what is known as a 'quasi-experimental' design rather than a randomised controlled trial (unlike our schools in Liberia, where we are part of a much larger RCT). But by matching the schools (on things like geography, fee level, and primary school exam scores), randomly selecting students within schools, and collecting appropriate student-level control variables (such as family background and socio-economic status), the idea is that it will ultimately be possible to develop an estimate of our impact over these three years that is relatively free of selection bias.
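For readers who want to see the mechanics, here is a minimal sketch of how a value-added estimate of this kind is typically produced: regress students' endline scores on an indicator for attending a Rising school, controlling for their baseline scores and student-level characteristics. To be clear, this is an illustration built on my own assumptions, not the evaluators' actual model; the file and column names are hypothetical.

```python
# Illustrative value-added model -- hypothetical column names and data,
# not the evaluators' actual code.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student: endline and baseline test scores, a 0/1 flag for
# attending a Rising school, and student-level controls.
df = pd.read_csv("students.csv")  # hypothetical file

# OLS value-added model: the coefficient on `rising` is the estimated
# impact, net of baseline ability, sex, and socio-economic status.
model = smf.ols(
    "endline_score ~ rising + baseline_score + C(sex) + ses_index",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(model.summary())
```

Clustering standard errors by school is a common default when students are sampled within schools, though the report may well handle this differently.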

Figure 1: How the evaluation defines impact


BASELINE

To make sure any estimate of learning gains captures the true impact of our schools, one of the most important control variables is students' ability at baseline (i.e. at the start of the three-year evaluation period). This allows for an estimate of the 'value-added' by the student's school, controlling for differences in cognitive ability among students when they enrolled, along the lines of the sketch above. Baselining for the evaluation took place in January and February 2016. The baseline report is available here. It showed:

  • That on average both Rising students (the treatment group) and students in the other schools (the comparison group) began their junior secondary school careers with similar ability levels in reading and maths. The two groups were, in other words, well matched;
  • That these averages were extremely low - for both reading and maths, approximately five grades below where they would be expected to be given their chronological age.

YEAR ONE PROGRESS REPORT: RESULTS

The Year One Progress Report covers Academic Year 2016. The Ebola Crisis of 2014-15 disrupted the academic calendar in Sierra Leone. Students missed two full terms of schooling. The Government of Sierra Leone therefore introduced a temporary academic calendar, with the school year cut from three terms to two in 2015 (April-December) and again in 2016 (January-July). The normal (September-July) school year will resume in September 2016.

The Progress Report therefore covers a relatively short period - essentially 4.5 months from late January, when baselining was undertaken, to late June, when the follow-up assessments took place. It would be unrealistic to see major impacts in such a short period, and any impacts that were identified would need to be followed up over the next two academic years to ensure they were actually sustained. As the authors note, "it is a good principle to see annual progress reports as just that – reports that monitor progress and that treat gains as initial rather than conclusive. A more complete understanding of the extent to which learning in the Rising Academy Network has improved is to be gained towards the end of the study."

Nevertheless, this report represents an important check-in point and an opportunity for us to see whether things look to be heading in the right direction.

Our reading of the Year One report is that, broadly speaking, they are. To summarise the key findings:

  • The report finds that Rising students made statistically significant gains in both reading and maths, even in this short period. Average scaled scores rose 35 points in reading (from 196 to 231) and 36 points in maths (from 480 to 516). To put these numbers in context, this change in reading scores corresponds to 4 months' worth of progress (based on the UK student population on which these tests are normed) in 4.5 months of instruction.
  • These gains were higher than for students in comparison schools. The differences were both statistically significant and practically important: in both reading and maths, Rising students gained more than twice as much as their peers in other private schools (35 points versus 13 points in reading, and 36 points versus 4 points in maths). Students in government schools made no discernible progress at all in either reading or maths. (For the more statistically inclined, this represents an effect size of 0.39 for reading and 0.38 for maths relative to government schools, or 0.23 for reading and 0.29 for maths relative to private schools, which is pretty good in such a short timespan; see the sketch after this list for how such an effect size is typically computed.)
  • The gains were also equitably distributed, in that the students who gained most were the students who started out lowest, and there were no significant differences between boys and girls.
  • Finally, there are early indications that students' experience of school is quite different at Rising compared to other schools. Rising students were more likely to report spending time working together and supporting each other's learning, and more likely to report getting praise, feedback and help from their teachers when they get stuck.
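For the effect sizes quoted above, here is a similar minimal sketch, assuming a standardised-mean-difference (Cohen's d) definition: the difference in average score gains between two groups, divided by their pooled standard deviation. The report does not spell out its exact formula, so treat this as illustrative; the numbers below are simulated, not real student data.

```python
# Illustrative standardised-mean-difference effect size -- the report's
# exact formula is not stated, so treat this as an assumption.
import numpy as np

def effect_size(gains_a: np.ndarray, gains_b: np.ndarray) -> float:
    """Cohen's d: difference in mean gains over the pooled standard deviation."""
    n_a, n_b = len(gains_a), len(gains_b)
    pooled_var = (
        (n_a - 1) * gains_a.var(ddof=1) + (n_b - 1) * gains_b.var(ddof=1)
    ) / (n_a + n_b - 2)
    return (gains_a.mean() - gains_b.mean()) / np.sqrt(pooled_var)

# Simulated gains roughly matching the headline reading numbers:
# Rising students gained ~35 points, private-school peers ~13.
rng = np.random.default_rng(seed=42)
rising_gains = rng.normal(loc=35, scale=90, size=400)
private_gains = rng.normal(loc=13, scale=90, size=400)
print(f"d = {effect_size(rising_gains, private_gains):.2f}")
```

With a gain gap of 22 points and a standard deviation of around 90 (my assumption, not a figure from the report), d comes out near the reported 0.23.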

That's the good news. What about the bad news? The most obvious point is that in absolute terms our students' reading and maths skills are still very low. They are starting from such a low base that one-off improvements in learning levels are not good enough. To catch up, we need to sustain and accelerate these gains over the next few years.

That's why, for example, we've recently been partnering with Results for Development to prototype and test new ways to improve the literacy skills of our most struggling readers, including a peer-to-peer reading club.

So what are my two stars and a wish?

  • My first star is that our students are making much more rapid progress in our schools than they did in their previous schools, or than their peers are making in other schools they might have chosen to attend;
  • My second star is that these gains are not concentrated in a single subset of higher ability students but widely and equitably shared across our intake;
  • My wish is that we find ways to sustain these gains next year (particularly as we grow, with five new schools joining our network in September 2016) and accelerate them through innovations like our reading club. If we can do that, and with the benefit of 50% more instructional time (as the school year returns to its normal length), we can start to be more confident we are truly having the impact we're aiming for.

Take a look at the report yourself, and let us know what you think. Tweet me @pjskids or send me an email.