Going data naked

Numbers don’t actually exist. There is no actual number three somewhere. It is not a thing. There is just ‘threeness’, a relationship between things that we learn to recognise; that this small cluster of cubes is similar to that small cluster of counters in a way we learn to call ‘three’.  The cubes themselves are not three; we declare their threeness when they are associated together in a certain way.  We learn what three means through repeated exposure to clusters exemplifying this relationship and thus come to learn what three and not-three look like.  But there is no spatiotemporally locatable prototype ‘three’ against which all other instances of three can be verified.

Pupil progress is a bit like that. We tend to act as if ‘Progress’ is a real, tangible thing that exists out there somewhere. Worse than that, we even believe that we can measure it. This is an illusion.

It is, however, incredibly useful to have a word for ‘the process of gradually improving or getting nearer to achieving or completing something’, just as it is useful to have the concept ‘three’. So what’s my problem? Is this just an exercise in clever semantics? My point, which seems obvious once stated, is that progress isn’t a generalizable thing that exists independently of a highly specific context. Yet the assumption that ‘Progress’ can be reduced to a single measurable thing that can or cannot be found hidden inside pupils’ exercise books or test scores is the basis of the whole panoply of accountability: all those graphs and charts and spreadsheets purporting to ‘measure’ something. What then, we may ask, is the unit of measurement? The microGove perhaps[1]?

Of course we can look at pupils’ work over a period of time and see if they are getting better at the things we want them to get better at. Indeed, it is really important that we do, because if they are not getting better then there’s a problem of some sort that we need to get to the bottom of and then remediate. So we need to be clear about what we want them to improve. Generally, this is to do with either knowing more stuff or knowing how to do certain stuff or knowing how and when to do certain stuff rather than others. So we will listen to pupils’ answers and read their work and set them tests to find out if what we are teaching them is sticking. And if it is, we will be pleased that they are making progress, maybe even good progress. But the improvements they make in their times tables test scores and the improvements they make in knowing more about the water cycle or using fronted adverbials in their writing are just not commensurate. That would be like trying to compare mass with colour intensity, or length with electrical charge.

Even Ofsted High Command are trying to move away from the idea that you can ‘measure’ progress. The Ofsted Handbook, the report of the Commission on Assessment without Levels and the data management report from the Workload review group all say the same thing: you need to be able to show progress, but that does not mean you have to be able to quantify it.[2] Here’s a brief selection (courtesy of James Pembroke and Gaz Needle) from those documents, saying just this.


Inspectors will use lesson observations, pupils’ work, discussions with teachers and pupils and school records to judge the effectiveness of assessment and whether it is having an impact on pupils’ learning. They don’t need to see vast amounts of data, spreadsheets, charts or graphs. – Sean Harford, Ofsted National Director, Education, 2015.

From:  https://www.youtube.com/watch?v=H7whb8dOk5Q

Be ruthless: only collect what is needed to support outcomes for children. The amount of data collected should be proportionate to its usefulness. Always ask why the data is needed.

A purportedly robust and numerical measure of pupil progress that can be tracked and used to draw a wide range of conclusions about pupil and teacher performance, and school policy, when in fact information collected in such a way is flawed. This approach is unclear on purpose, and demands burdensome processes.

The recent removal of ‘levels’ should be a positive step in terms of data management; schools should not feel any pressure to create elaborate tracking systems

Focusing on key performance indicators reduces the burden of assessing every lesson objective. This also provides the basis of next steps: are pupils secure and can pupils move on, or do they need additional teaching?

From:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/511258/Eliminating-unnecessary-workload-associated-with-data-management.pdf

‘Progress became synonymous with moving on to the next level, but progress can involve developing deeper or wider understanding, not just moving on to work of greater difficulty. Sometimes progress is simply about consolidation.’

From:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/483058/Commission_on_Assessment_Without_Levels_-_report.pdf

“We want to see the assessment information you use as a school to identify how well your pupils are progressing in your curriculum and, crucially, how you use that information to improve pupils’ achievement.”  Sean Harford

And then today, Sean has both ‘liked’ and retweeted this Tweet of mine:

[Screenshot of the tweet]

However, some of Ofsted’s foot soldiers still appear not to have got this message. A report published on May 25th 2017 had as a key issue:

  • There is not enough emphasis on the measurement of pupil progress from individual pupil starting points.

But that was nearly a year ago. Maybe things have improved since then? To find out, I decided to read all the areas for improvement in Ofsted reports for primary schools published in March. However, that runs to over 70 pages, so I gave up after reading 7 pages’ worth of reports. With 10 schools per page – that’s 70 reports I read. To be fair, most of them seemed sensible enough, but I found a fair few recommendations that worried me. All of the following are recommendations from reports published in March 2018. I have highlighted in bold the problematic parts.

  • ensuring that success criteria regarding pupils’ progress and attainment in performance management documents and in the school’s development plan are measurable, to hold teachers more clearly to account for the achievement of pupils in their classes.

 

I’m not sure how this can mean anything other than reducing progress to a numerical score. As James Pembroke says, ‘numbers in a tracking system do not prove that pupils have made progress; they just prove that someone has entered some data into the system.’

 

  • assessment information is accurate and used alongside improvement plans that have precise objectives and clear measurable outcomes, in order for academy committee members to further hold leaders to account

 

  • leaders’ plans for school improvement and the use of the pupil premium have clear actions, timescales and measurable outcomes

Again, an emphasis on measuring the unmeasurable – a desire for the illusion of accuracy that measuring something purports to bring.

  • outcomes of groups of pupils, no matter how small, are reviewed more precisely, so that leaders know whether their actions to raise standards are effective and represent good value for money
  • action plans contain precise success criteria, with specific targets for groups and cohorts of pupils, so that leaders and governors are able to check the impact of their actions on improving outcomes for pupils

With both ASP and the Ofsted dashboard moving away from looking at smaller groups, it is alarming to see this in recent reports.

  • they strengthen their analysis and evaluation of the progress of different groups so that they know how well different groups of pupils are progressing

Indeed, even this one bothers me. Why can’t we just check and respond on a pupil-by-pupil basis? How does it actually help any child do better if leaders are spending precious time analysing groups? Even bigger groups? Especially in-year. At the end of the year, then yes, I’d have a look at how pupil premium children were doing compared with non-pupil premium. And obviously at the end of a key stage a whole raft of data is produced. But I’d rather spend my time improving the curriculum and teaching than making pretty charts in Excel.

Then there is the question of whether ‘tracking’ really means ‘have a spreadsheet with numbers’. See, for example, these recommendations.

  • systems for tracking the progress of pupils in subject-specific skills across the curriculum in subjects other than English and mathematics are embedded
  • track the progress of pupils so that governors, middle and senior leaders are fully informed about the progress of groups of pupils, particularly across the wider curriculum.

So they want information about how different groups are doing in geography then, do they?

These two might not mean ‘have a spreadsheet for the other subjects’, but that’s probably not how it is going to be interpreted.

So much for being ruthless and only collecting what is needed to support outcomes for children!

Be that as it may, we are doing our best to go ‘data naked’, by which I mean having the least data we possibly can, only resorting to numbers if they actually tell us something that will enable someone to do something that makes things better for the children. I’m not sure we’ve got it all right and it is still very much a work in progress, but this is what we currently do. I am not holding this up as a marvellous example for others to follow. We are currently due Ofsted so, not quite holding my nerve, in September our assessment plan included more data than I really thought necessary. While I believe that Sean Harford means what he says, I get nervous about individual inspectors – so the plan included data as a sort of security blanket or lucky amulet to bewitch any data-besotted inspector. However, the plan did not survive contact with reality. Either that, or I just got braver.

Maths

We started the year intending to carry on from the previous year, using the PUMA standardised tests at the end of each term. The standardised scores from these were then entered into, yes, you guessed, a home-made Excel spreadsheet, formatted to colour-code certain ranges of scores based on benchmarks suggested by the publishers of PUMA. The idea was that we could have a column with the previous scores from July alongside the December scores, thus being able to make useful comparisons over time. Is Abdul still ‘green’? Why has David gone from ‘orange’ to ‘red’? In other words, pseudo-levels.
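For the spreadsheet-curious: stripped of the Excel conditional formatting, the whole system amounted to something like the little sketch below. The thresholds, names and scores are invented for illustration and are not PUMA’s actual benchmarks.

```python
import pandas as pd

# Hypothetical score bands - illustration only, not PUMA's real benchmarks
def band(score):
    if score < 85:
        return "red"
    elif score < 95:
        return "orange"
    return "green"

# Standardised scores typed in by hand each term (invented pupils and scores)
scores = pd.DataFrame({
    "pupil": ["Abdul", "David", "Ella"],
    "july": [102, 96, 88],
    "december": [101, 89, 90],
})

# July and December bands side by side - the 'comparison over time'
scores["july_band"] = scores["july"].apply(band)
scores["december_band"] = scores["december"].apply(band)
print(scores)
```

Laid out like that, the pseudo-levels point rather makes itself: all the sheet really records is which colour band a number happens to fall into.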

However, come December, the year 2 and 6 teachers asked if they could do a previous SATs paper instead – which seemed like a sensible idea. That immediately meant that the December results could not be directly compared with the previous July ones, since the children were taking a test intended for several months later. These results were worthwhile though, and gave us a rough but useful indication of who was ‘on track’ or ‘behind’ or ‘ahead’ given their KS1 or EYFS scores. Everyone else did PUMA but came up against the obvious problem that when you take this kind of test in-year, it doesn’t necessarily test what you have taught. In other words, it was pretty meaningless except as a way for individual teachers to check whether the questions on things they had actually taught had been answered correctly. So any attempt to check progress from the previous July was futile. For year 1, the situation was even worse, as they were being compared to FSP outcomes. Nevertheless, we valiantly attempted to crunch data and report to our standards and curriculum committee. We even analysed groups – though only boys, girls and pupil premium vs non-pupil premium. However, by the time we’d explained for the umpteenth time that ‘you can’t really compare December results with July results’, the governors looked at us all funny and asked us why we were wasting time on in-depth analysis of something patently not suitable for such treatment. Then when we tried to talk about groups – and some of our classes are small, with only 18 pupils in – it got even more farcical. Governors and leaders together resolved not to waste any more time analysing stuff that was not properly analysable.

So this term, year 2 and year 6 are doing another SATs paper, and everyone else is doing either PUMA or White Rose – whatever best fits what they have actually taught so far – but they are doing these assessments not so the SLT can analyse them and draw (highly dubious) conclusions. Instead, they are doing them to inform their own teaching, so they know what needs more revising and who might need more supporting. At our next pupil progress meeting we will have a conversation about each pupil, and how they did on whatever tool the teacher used will be discussed as a possibly useful starting point. Where pupils do not appear to be doing so well, we will have a look at their maths book to see if that sheds any light on the situation. I will also look at the tracker that tells me whether each child knows their number bonds and times tables. I will ask the teacher if there were any particular areas of maths where many children did badly, and, if so, what they are going to do about it.

Then in July, everyone (except Early Years and years 2 and 6) will take PUMA (because by then everyone should have taught the year’s curriculum, so the test/curriculum misalignment problem should not arise) and I will enter those scores against last July’s scores. I can see the point of tracking data year on year. I can see how that can flag up potential problems, either for a child or for a teacher. But within the year, talking to the teacher about their class, looking at books, watching lessons and tracking the acquisition of key number facts is much more useful than wasting hours with a spreadsheet.

I should add that, as an experiment, this year we bought into Star Maths (part of the Accelerated Reader package from Renaissance Learning) for years 5 and 6. This enables pupils to do a maths test in a matter of minutes, with no marking for the teacher, and with results (plus an analysis of what each pupil can and can’t do) instantly available. Apparently, according to @MrLearnwell, these results correlate very well with actual SATs performance. Renaissance Learning bought the anonymised SATs data from the government, matched (via UPN) actual SATs results with the performance of the thousands of children who use their product, and got a very high level of correlation. I will wait and see how this bears out for us when this year’s SATs results are out, but it may be that from next September we use Star Maths across the school. I don’t yet understand the product well enough to know how it gets round the test/curriculum misalignment problem that happens in-year. That’s something I need to find out more about.
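For what it’s worth, the matching exercise they describe is conceptually simple: join the two datasets on UPN, then correlate the two sets of scores. A minimal sketch, with invented column names and data, since I have no idea what their datasets actually look like:

```python
import pandas as pd

# Invented example data; the real datasets hold thousands of pupils
star = pd.DataFrame({"upn": ["A1", "A2", "A3", "A4"],
                     "star_score": [612, 540, 705, 660]})
sats = pd.DataFrame({"upn": ["A1", "A2", "A3", "A4"],
                     "sats_scaled": [104, 98, 112, 107]})

# Match pupils via UPN, then correlate the two measures
matched = star.merge(sats, on="upn")
print(matched["star_score"].corr(matched["sats_scaled"]))  # Pearson correlation
```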

Reading

We abandoned PIRA (twin sister of PUMA) this year as we didn’t find it helpful at all. It’s nothing like actual SATs papers, some questions are really odd and, all in all, it’s not a good assessment. Several other people have contacted me via Twitter to express the same opinion. Instead, we use Accelerated Reader to find out all sorts of useful things. As well as a standardised score from Star Reader, it gives us a fluency measure, a reading age and, best of all, how many minutes of independent reading each child is doing. This kind of granular information is so much more useful than a test score and really helps us pinpoint what needs more attention. For children in Reception and KS1 (or for older children where appropriate) we also track their progress in phonics. As with maths, all of this information is discussed for each child in our pupil progress meetings and, where there are problems, strategies are decided. Years 2 and 6 do previous SATs papers in December and March, in part to give children practice of the format.

Writing

Last year we bought into a tracker system that had every objective for the year. It took a lot of teacher effort for practically no impact on children. Indeed, by focusing on the objectives for that year, it drew teachers’ attention away from objectives in lower year groups that might urgently need attention. Yes, full stops, I’m looking at you. So this year we’ve invented our own really minimal writing objectives tracker for KS2. This starts with the year 2 interim framework objectives, then builds from there, with each year group having 4 or 5 further key objectives drawn from the national curriculum. Each KS2 teacher checks off the previous year groups’ objectives first, starting with the year 2 ones. It’s quick and makes sure teachers address learning gaps. On top of that, we are involved in the Sharing Standards comparative judgement project from No More Marking. This gives us a good measure of how well we are doing as a school in relation to other schools, as well as giving each child a scaled score. This scaled score is based on only one piece of work, but it is a useful starting point for discussion and enables us to target book looks on those children who seem to be doing worse than we would have expected, given their prior attainment. Added to that, it means every teacher has seen a piece of work from every child in the school from year 1 upwards, and I have instant access to that work from my computer.

History, geography, science, RE

Children do a multiple choice quiz at the end of each unit. The score out of 10 (or 5 in KS1) gets recorded on a spreadsheet. Then a couple of months later (when that unit has long finished) they do another quiz on that subject. That score is also recorded. Then at the end of the year they do a quiz of quizzes, with questions from all the units that year. And guess what – that score gets recorded too, and goes on the end-of-year report. I was really worried about assessing the foundation subjects when this first became a thing, but actually this system works really well, is quick and easy and has impact. It allows us to identify which questions children are finding harder and which children are not doing as well as they should. In order to assess children’s ability to apply knowledge, we have just started using stem sentences and ‘but, because, so’ to see if children can put their knowledge to work. For example, given the stem sentence ‘the River Nile used to flood each year…’, can the children carry on this sentence in 3 different ways, using but, because and so? For example, the River Nile used to flood each year but does not any more since the Aswan dam has been built. Or, the River Nile used to flood each year so the land became very fertile from all the minerals in the floodwater. At the moment this is mainly formative, but we may also weave it into the end of unit assessment once children are more familiar with the process.
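As a rough illustration, what we get out of the quiz spreadsheet boils down to the sketch below: which questions the class found hard, and which children scored lower than we would expect. The names, scores and the ‘doing well enough’ threshold are all invented.

```python
import pandas as pd

# One row per pupil, one column per question (1 = correct); all invented
quiz = pd.DataFrame({
    "pupil": ["Aisha", "Ben", "Chloe", "Dan"],
    "q1": [1, 1, 1, 1],
    "q2": [1, 0, 1, 0],
    "q3": [0, 0, 1, 0],   # a question most of the class got wrong
})
questions = ["q1", "q2", "q3"]

# Which questions are children finding harder? (proportion answering correctly)
print(quiz[questions].mean().sort_values())

# Which children are not doing as well as they should? (threshold is illustrative)
quiz["total"] = quiz[questions].sum(axis=1)
print(quiz.loc[quiz["total"] < 2, "pupil"].tolist())
```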

MFL is similar, with end-of-unit quizzes, but I haven’t got round to having them on a spreadsheet yet.

Computing

This is still under development and not yet available for every class. Children start each coding unit with screenshots of various bits of code (usually from Scratch). They write what they think this code might do. Then at the end of the unit, they get the same screenshots and again write descriptions – which are of course then much more accurate and detailed. Nothing gets put on a spreadsheet. Ironically the computing assessment is the most low-tech! The assessment helps the teacher see how effective they have been and which aspects were the least successful. Children like seeing how much they have learnt. So I am quite happy with this system. In addition, we have a multiple choice quiz on online safety, which the children do every term. Yes, the questions are the same, because it’s not about progress, it is about keeping the children safe.

PE, art, DT and music

We have a PE coach who takes all PE lessons. He has this massive spreadsheet with 3 or 4 objectives from each sport, plus one for being a good team player. Hand on heart, I have no idea if it actually has any impact on children’s progress in the subject, but he said he had all that information anyway and was happy to do it.

We have a similar system for art and DT (though much shorter). I’m not wedded to the idea. We have also started doing simple assessments of children’s ability to copy patterns of increasing complexity – starting with just a line and getting progressively harder, the child stopping at the pattern they find difficult to copy. I think this is much more likely to be useful.

And as for music…er…I confess we don’t have a system yet for music.

Reporting to governors and parents

The great thing about graphs and charts is they make complex information understandable. The downside is they give the illusion of making flawed information meaningful. They enable comparisons, but at a cost; everything has to be reducible to a number. This is a cost I am no longer prepared to pay. But while I think our present way of checking for progress is far superior to previous systems, it is without a doubt harder to report to others in terms of accountability. As you can see, we have different systems for different subjects; some information tracks discrete objectives or behaviours, some is comparative with other schools, some is strictly formative. I can’t reduce this complexity to a numerical value. Governors have to bear with narrative descriptions of how we know about the progress our children are making. Some subjects have some numbers, but the score out of 10 in a history quiz is in no way directly comparable with, say, average reading age or the number of number bonds a child in year 1 knows. And as for tracking groups – well – except for at the end of each key stage, we don’t. It doesn’t add any value at all to the achievement of any child, so I simply refuse to indulge in such a meaningless ritual.

Reporting to parents, on the other hand, is much easier. Parents understand things like a reading age or a score out of 10 or a chart that shows how many times tables or spellings a child knows. That’s far more understandable than being told your child is 3b or ‘emerging plus’ or even ‘working at the expected level’.

And Ofsted?

Maybe I’ll just give them this to read?


[1] Yes, I know I’ve made that joke before. It’s good though, isn’t it, even though I say so myself.

[2] Read this excellent blog, which says everything I am saying, only better, and from which I have drawn extensively.


Oven-ready, Hello Fresh or Just Eat? What’s the beef about pre-planned lessons?

Another weekend, another Twitterstorm. Policy Exchange has just released a paper arguing for greater availability of ‘coherent curriculum programmes’, which include, among other things, lesson plans, textbooks and lesson resources such as worksheets. Unfortunately the TES reported this as ‘The solution to the workload crisis? Stop teachers designing their own lessons’, which, understandably, has gone down like a bucket of cold sick on Twitter. The fear is that this augurs the triumph of the neo-liberal takeover of education, with lesson plans direct from Pearson delivered straight to the classroom by Amazon drone. Or, to refer to my possibly obscure title, delivered by motorbike by Just Eat, with the teacher’s role limited to opening the plastic cartons and serving them out; lamb bhuna tonight, whether you want it or not.

Having read the entire report, I can say that what it is actually proposing is something much more reasonable: debatable, but reasonable. The argument goes that the 2014 National Curriculum is not being implemented as well as it could be because the appropriate resources and training to implement it well either don’t exist or, if they do, are hard to locate among the myriad of online resources. It bemoans the current situation in which many teachers trawl through online resources of possibly dubious quality late into the night, as they attempt to plan each and every lesson ‘from scratch’, although in reality probably ‘from Twinkl’.[1] This is wrong, the report argues, because the ‘lesson by lesson’ approach is highly unlikely to result in a coherent curriculum that hangs together across the year groups, or that makes sufficient provision for revisiting previous learning. The workload argument is more of a side issue in the report, not its main thrust. Its main thrust is about having a coherent curriculum.

I’m all in favour of a coherent curriculum. Indeed, in this blog I argue for curriculum design that has coherence not only within each specific subject, but across subjects. Yet the type of ‘3D’ curriculum I’m advocating is extremely time-consuming to write. We’ve been at it for almost 2 years and it’s not where I want it to be yet. The same situation is being replicated across the country. In my ideal world, the DfE would pay me and my selected Twitter mates to devote ourselves to this task, but since (doubtless due to unintended oversight) the report fails to mention me explicitly by name, it comes up instead with the suggestion that the Government should have a curriculum fund that brings ‘teachers with curriculum planning ideas together with institutions who can provide quality assurance and wider scale distribution’ (p. 36). The kind of institutions it posits as being in a position to do this are multi-academy trusts, learned societies, subject associations and museums. What about schools not in MATs, I’d argue? Otherwise the vast majority of primary schools would be overlooked, and surely some of us have something to offer? And what about the BBC?

So, while I might argue with the detail about who might and might not secure funding to write detailed, coherent curriculum programmes, I think this is an excellent idea. I’d much rather use a curriculum resource written by a bunch of teachers in partnership with, say, a museum than by most educational publishers. Especially if there existed a range of quality-assured, kite-marked resources that schools could choose to use if they wanted to. Many primary schools already use ‘off the peg’ curriculum packages, usually for discrete subjects but occasionally across the curriculum.[2] What is lacking is the all-important quality assurance. At the moment, schools buy in all sorts of ready-made packages for aspects of their curriculum. With Ofsted signalling its intention to scrutinise the quality of the curriculum (which in a primary school context is shorthand for ‘everything other than English and maths’), primary headteachers are tearing their hair out trying to rustle up a coherent curriculum offer for the foundation subjects while secondary heads fret about KS3. Just off the top of my head, I can think of the following resources that primary schools of my acquaintance use.[3] Jolly Phonics, Third Space Learning, Cornerstones Curriculum, White Rose, Literacy Shed Plus, International Primary Curriculum, Developing Experts, Jigsaw PSHE, Discovery RE, Val Sabin PE, Rigolo, Discovery Education Coding, ReadWriteInc, Maths Mastery, Charanga.

The thing that strikes me going through this list is that there are lots of different resources out there for maths and phonics, and plenty for those really specialist areas of the primary curriculum where many primary teachers are more than willing to ’fess up to having little to no subject knowledge and welcome explicit handholding: PE, music, computing. But for geography and history, I know of nothing except for Cornerstones and the IPC, which offer many subjects. I think it is fair to both parties to say that the IPC is not quite what the authors of the 2014 National Curriculum had in mind. And neither of these curriculum packages has the sort of horizontal, vertical and diagonal links that, I would argue, an excellent curriculum should be striving to build within and across subjects.

However, I really do understand the horror some teachers are expressing on Twitter today about having the planning of lessons taken away from them. The two main objections are that no ‘off the peg’ lesson can ever hope to meet the specific learning needs of the diverse classes we all teach, and that planning lessons specifically for one’s own children is one of the best bits of teaching, part of what makes the job rewarding.

So, finally, let’s get back to the title. In the report, the author John Blake suggests that coherent curricular programmes could be thought of as ‘oven-ready’ – presumably a sort of educational ready meal that just needs a bit of warming up. He argues that these would be especially useful for teachers new to the profession or new to a particular subject. And to be honest, even those of us who love lesson planning probably don’t mind using ‘ready meals’ for subjects where we lack subject knowledge. If you told most primary school teachers that they were not allowed to use externally produced resources for computing, MFL or music, for example, and had to plan every lesson entirely from scratch, there would be tears. (Except for the highly knowledgeable minority, of course, who might not understand what all the fuss was about.)

Blake then goes on to talk about ‘the final foot’. What he means here is how teachers could take an ‘oven-ready’ resource and then use their professional expertise to adapt it as necessary for the realities of their class. Much of the groundwork having already been done, the teacher is freed up to tweak the lesson to fit their children. This is what I meant by the ‘Hello Fresh’ approach. Hello Fresh is one of those companies that delivers boxes of food with all the ingredients you need to make the particular recipes it also provides. Everything is already in exactly the right quantity; all the cook needs to do is chop, peel and actually cook the ingredients. Unlike a ready meal, this gives you scope either to follow the recipe slavishly or, if you feel confident, to add or omit ingredients according to your family’s preferences, play about with cooking times (because you know your cooker best, right?) or even go completely rogue and use the ingredients in a completely different recipe, maybe adding in other ingredients bought elsewhere and chucking out others.

Yet I understand that some teachers will still object and see this as an assault on their professional autonomy and creativity. When I was a class teacher, I loved lesson planning. So it was with some trepidation that, 4 years ago, we tried out a particular maths scheme that has very detailed, partially scripted lesson plans. I’m not going to say which one, because I’m not arguing for or against the merits of that particular programme, but for the idea of using very detailed plans written by others. (Besides which, many of you will either already know or be able to guess.) Anyway, we got funding and started with one class. The class teacher was happy to give it a go, though she was already an experienced, skilled teacher. The reason she soon loved it was that it wasn’t a ready meal; it was more of a ‘Hello Fresh’ kind of thing. In fact, you had to tweak the lessons because, as the programme makes quite clear, they are aimed at the average child and your actual children aren’t average. Some will need more challenge, more depth; others will need more support. So the teacher needed to think about how to adapt every lesson for the particulars of their class. The teacher also needed to decide whether or not to spend more time on a particular lesson, skip over lessons the class didn’t need, or swap suggested manipulatives for something else, and had the freedom to design their own worksheets or not to use any worksheets at all.

What made this possible was that the programme wasn’t really a set of resources; it was a training programme, of which resources were a part. Each unit of work included a video explaining key concepts, an overview, links to articles and research, as well as the lesson plans and flipchart slides to go with it. These resources are excellent and go far beyond what any of us in school would have been able to offer. And ours is a school unusually blessed with knowledgeable maths teachers. There was also some central training and the expectation that the maths leader would regularly coach teachers new to the programme. Indeed, during the first year, our maths leader, a year 6 teacher, had to teach year 1 maths once a week using the programme, so that she became familiar with it. Without this training, the resource would not have had half the impact it did.

Now you may argue: if you had to do all that tweaking, what on earth is the point? You might as well have designed the whole thing yourself. Well, no: even with the tweaking, lesson planning was much quicker. But the reason our first teacher really loved it, and the reason the subsequent teachers to use it also love it, is that it is so clever. The progression, the way it comes back to topics again and again, the way it builds in reasoning at every step, the way it moves children away from reliance on counting and towards reasoning based on known facts – all of this is excellent. We might rate ourselves as excellent maths teachers who can plan fantastic lessons, but we simply do not have the expertise or time to develop a scheme of such quality. What really struck me doing lesson observations one week was how brilliantly progression is planned into the scheme. I saw addition lessons in years 1, 3 and 4, and in each lesson exactly the same structure was used, but with increasing complexity. Given its obvious superiority to anything we could produce, it would be foolish and arrogant to insist on the ‘freedom’ to plan our own lessons just because we enjoy doing so. Nor do teachers feel reduced to mere delivery bots. I really feared they might, but that just didn’t happen. Because the lessons made sense. And where, very occasionally, a lesson didn’t seem to work, they had the freedom to teach it again, their way.

That’s not to say we don’t occasionally do things differently. For example, I think this way of teaching telling the time is better, so we don’t use all of their resources for that – just some. And we are encouraged to comment on lessons and suggest improvements which are listened to. Because the resource is online, rather than a textbook, when they adapt the programme, we don’t have to throw out costly resources. Were there to be similar quality programmes in other areas of the curriculum, I would buy them like a shot.

However, I also really understand that many teachers love the creativity that planning affords and would be loath to relinquish it. On the other hand, just because you love doing something doesn’t mean everybody does. As Michael Fordham says:

[Screenshot of Michael Fordham’s tweet]

Instead, as an alternative to moving into leadership, more experienced teachers should have the option to move into curriculum design themselves. This is what happens in Singapore, where experienced teachers have options to move into senior specialist roles that work on areas such as curriculum design, testing, educational research or educational psychology.

With talk here of sabbaticals for teachers, maybe one sabbatical opportunity could be to work within a curriculum development team, producing resources for others to use?


[1] The report doesn’t mention Twinkl; that’s me being facetious.

[2] I can only comment in detail on primary schools. Maybe it’s different in secondary schools where teachers are subject specialists?  But from talking to many secondary teachers, I don’t think it is as different as all that.

[3] Inclusion in this list does not mean I think the resource is either good or bad. We use some of these; some I wouldn’t touch with a bargepole.
