This isn’t a ‘look how great we are’ post. It might read like a ‘here’s why we achieved great results’ post, but it’s really just the story of our journey. Whatever we did as staff, it was the pupils in #teamstats who earned the grades, so well done to them!
I’ll start with the figures and their context, then move on to what we changed.
2014: 42 pupils, 93% A*-C, 19% A/A*
2015: 58 pupils, 100% A*-C, 70% A/A*
Here are some things which mean the results should have improved anyway, with no changes to teaching:
- The top 25 pupils in 2014 did the FSMQ instead, whereas the 2015 stats group included them. (Adding those 25 pupils to the 2014 cohort and assuming they all achieved A/A* grades would have given a figure of 49% A/A*.)
- The 2015 group had significantly more curriculum time for mathematics and statistics combined, as our 58 pupils chose to do statistics as an additional option. This was a new offer for the 2015 leavers.
- The 2014 group had a change of teacher at Christmas, with an experienced member of staff replaced by a teacher who had never taught the course before (me). The 2015s had the same two teachers all year.
- The school increased its focus on retaining pupils until exam day, meaning more pupils were still turning up to every lesson once their other exams had started.
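The hypothetical 2014 adjustment in the first bullet above can be sanity-checked with a few lines of arithmetic. This is just a sketch of that calculation; rounding 19% of 42 pupils to a whole number of A/A* grades is my assumption, not something stated in the post.

```python
# Sanity check of the "49% A/A*" counterfactual for 2014.
# Assumption: 19% of 42 pupils rounds to 8 pupils at A/A*.
pupils_2014 = 42
a_star_a_2014 = round(0.19 * pupils_2014)   # 8 pupils

# Add the 25 FSMQ pupils, assuming every one would have earned A/A*.
adjusted_pupils = pupils_2014 + 25          # 67
adjusted_a_star_a = a_star_a_2014 + 25      # 33

print(f"{adjusted_a_star_a / adjusted_pupils:.0%}")  # 49%
```

So the 49% figure quoted above is consistent with the raw numbers, even under the most generous assumption about the FSMQ group.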
Here are some lesson-based things we changed, which probably helped:
- A new scheme of work was written, based on those provided by Kangaroo Maths and Edexcel, but edited to include key questions and misconceptions for each chapter of the course.
- The scheme of work was added to our school VLE for pupils to access, along with a lesson PowerPoint and a multiple-choice misconceptions quiz for the end of each chapter.
- Pupils were strongly encouraged to do the quizzes for homework on a fortnightly basis.
- All lessons were collaboratively planned by the two teachers delivering the course (me and #missblock), evaluated immediately after delivery, then amended and added to the VLE on the evening they were taught.
- We included time to recap everything, including the bits that pupils think are easy but can be easily forgotten (e.g. drawing graphs properly, remembering definitions of sample vs. sampling frame).
Here are some assessment-based things we changed, which may have been beneficial:
- Pupils marked each other’s assessment papers (full past papers) once per term, with the mark schemes printed for guidance. This was surprisingly accurate: the first markers tended to be too harsh, and pupils called out any harsh marking for me to check when they got their own work back. They were also much more engaged with the mark scheme when they actually had to use it rather than just look at it.
- Pupils could retake the homework quizzes as many times as they liked, in order to improve their marks. As the quizzes were self-marking, this generated no extra work once they were created.
- In the final lessons before the exam, pupils were provided with worksheets of past paper questions on the top six topics which the analysis from previous assessments showed to be the most difficult.
Here are some other things which happened:
- We minimised the time spent on the controlled assessment. Pupils had written a practice plan on one topic and completed a practice data-collection cycle on another before attempting the real thing; combined with the guidance given by the exam board, this meant they should have had enough knowledge to do very well, and those who followed all the guidance did. We decided that pupils’ first grades would count, even though we could have used further curriculum time for everyone to repeat the whole thing on a different topic. (I also don’t think it’s fair on the pupils who put in the effort first time round if everyone else ends up with the same grade as them after n retakes.)
- We used mixed- but reasonably-high-ability classes – all pupils were already in sets 1–3 (of six) for maths, but we decided that the top pupils would be useful participants in class discussions, and we knew the entire course should be accessible to everyone. I wouldn’t do this for my maths groups; I think the key difference is whether your subject is about knowing the topic (maths) or interpreting the topic you know (stats).
- We increased pupils’ knowledge of current affairs and world events – one pupil was adamant that I’d invented badger culling, as it fitted too well with my lesson on sampling methods to be a real news story. Another took to Twitter to tackle Fraser Nelson over his ‘ridiculous misleading graph’, which we’d discussed in a lesson. We searched Twitter for ‘GCSE Statistics’ on the morning of the exam, with interesting results. The team were all engaged in discussions about topics including the general election and the biblical census, and no one forgot about pilot studies or placebos after seeing these two groups:
- Their mock exam papers were marked both felinely and caninely for triangulation purposes.
So that was our year! Feel free to add any questions or comments about any of it. I don’t have any true control groups to compare these strategies against, and given the caveats at the top, the results might have turned out the same if I’d simply retaught last year’s lessons. There are still plenty of things to improve for next year, too. #missblock and I have had a great year with #teamstats anyway – thanks to everyone who played a part in it!