What’s all the fuss about a knowledge-rich curriculum? part one

Here’s part of my talk from yesterday’s ResearchED. A word of warning though: if you are a regular reader of my blogs, it is very similar indeed to my previous post ‘Mutual Misunderstandings’, although now with pretty pictures added. This blog talks about what knowledge is and features the first part of my intended talk. I massively ran out of time during my talk, so rushed through the bit on understanding somewhat breathlessly. I will put that section into another blog in due course.

I was having dinner with a headteacher friend who told me how much my name was reviled by some of her colleagues because apparently, as an advocate of a knowledge-rich curriculum, I was in favour of bombarding children with facts in boring lectures and was only interested in teaching to the test.  This is frustrating, because it’s so wrong, and might lead to people missing out on the wonders of knowledge-rich goodness! So to put the record straight…

The first misunderstanding is to think that when we speak of knowledge, we only mean acquiring facts.  That’s not the case at all. Knowledge can be divided into declarative knowledge and procedural knowledge.

[Slide: knowledge isn’t just facts]

Declarative knowledge includes concepts and rules as well as facts and will allow us ‘to recognize things, make judgments, classify things, discriminate differences and identify similarities.’[1]

So for example, it is declarative knowledge that lets us recognise a tree as a tree, judge that it is a mighty fine tree, or an old tree, classify it as a Horse Chestnut, discriminate how it is different from a Sweet Chestnut and identify how both are similar, being examples of deciduous trees.


Procedural knowledge is knowledge that produces action; it enables us to do stuff. It is goal directed, whereas declarative knowledge (the kind of knowledge that includes, but is not limited to, facts) just sits there waiting to be of service. It doesn’t of itself result in any action. Procedural knowledge actually enables us to do things. Most obviously, it involves motor behaviour; learning to play the guitar or catch a ball are both forms of procedural knowledge. To throw a ball, you don’t also need to know the physics behind it; you don’t need factual knowledge of how it all works. You need to build motor memory, and that is a form of knowledge learnt through paying attention and repeated practice. But procedural knowledge isn’t only about muscle movement; it also enables us to put declarative knowledge to use. Solving an equation or balancing a chemical equation both involve turning declarative understanding into procedural knowledge.

[Slide: declarative vs procedural knowledge]

To use an analogy: in football, it’s no good just having strikers who score goals; you also need players who can assist the striker, players who set up the opportunities for the striker to do their thing.

[Slide: assisting the striker]


Of course we want knowledge to be put to work, to use declarative knowledge to do actual stuff; it’s just that you don’t get to do the stuff unless you have the requisite declarative knowledge in the first place. Yet the fear is that the knowledge-rich brigade are only interested in the declarative side of things. This fear is misplaced. The knowledge-rich crew simply value declarative knowledge as a crucial part of the learning equation – alongside procedural knowledge – and want to liberate it from its imprisonment in the dungeon at the base of Bloom’s taxonomy, derided as somehow objectionably ‘lower order’, looked down upon in condescending fashion by the haute bourgeoisie of the ‘higher order thinking skills’ clique.


We want instead to champion the absolutely vital place it holds in enabling all further thought. Instead of Bloom’s triangle, how about we have Sealy’s Funnel?

[Slide: Sealy’s Funnel]

Synthesis and evaluation – what we might call creativity and critical thinking – are only possible once vast amounts of knowledge have been understood. Far from being ‘lower order’, knowledge is our precious cognitive capital, with critical thinking its dividend.

Christine Counsell talks about the curriculum needing a hinterland which nurtures and sustains the city. In the same way that a city needs a vast area of surrounding land to provide the water, food, fuel and people it needs to thrive, so creativity and critical thinking rely on extensive and wide-ranging knowledge.


People sometimes refer to the ability to do such things as acquiring skills rather than procedural knowledge. The word ‘skills’, however, is particularly problematic, as it is used to mean several different things. For example, it is used to describe dispositions such as resilience and behaviours such as collaboration; it is also used to describe things such as inference and problem solving, which traditionalists are more likely to see as different kinds of disciplinary knowledge; and then again it is used for procedural knowledge. Knowledge-rich advocates positively embrace procedural knowledge and are also happy to endorse dispositions and behaviours such as collaboration and resilience, though they might argue that there are plenty of opportunities for encouraging both within an ordinary maths or PE lesson. The fault line lies with the so-called generic skills, such as problem solving or explanation.

[Slide: three meanings of ‘skills’]

[Slide: the trouble with generic skills]

Knowledge-rich advocates argue that while things such as problem solving or explaining or observing are important, they are not generic skills. These terms imply different things in different subjects. ‘Explanation’ in history has a different meaning from ‘explanation’ in science, for example. Even something apparently straightforward like ‘observation’ means something different depending on whether you are in an art lesson or a science lesson. What is worth observing will differ.

For example, if I ask a bank robber ‘why did you rob the bank?’ I am expecting a moral or psychological explanation, not a logical one. I would not expect the answer ‘because that’s where the most money is.’


Explanations in history and science are very different.

[Slide: explanation in science vs history]

Similarly, close observation drawing differs depending on whether it’s a science or an art lesson.

[Slide: close observation in science vs art]

The knowledge vs skills debate is an argument about the extent to which skills are transferable from one context to another, not about whether or not these ‘skills’ are important.

[Slide: knowledge vs skills]




[1] From The Unified Learning Model, Shell, D. et al., chapter 4


Mutual Misunderstandings- the hidden lives of tweeters.

I love Twitter, and I’ve learnt so much since I joined it a mere four years ago. Yet recently, I’m getting tired of having the same old arguments going round and round the same territory, with people seeming to talk past each other, using the same word to mean different things.  For example, take the word ‘engagement.’ To one person, this is obviously a great thing – who wouldn’t want their pupils absorbed in their learning? For this person, the opposite of engagement is disengagement, and surely nobody wants their pupils disengaged.  Yet for another person ‘engagement’ means something completely different. For them, it is shorthand for an approach to education that they reject; one that thinks the content of what we teach is inherently boring to many pupils, and therefore needs to be dressed up in ‘fun’, thereby tricking pupils into learning. For this person, anyone advocating ‘engagement’ has woefully low expectations of children and is short-changing them with fun ‘edutainment’ instead of substance. For such people, the opposite of ‘engagement’ is learning. So if these two people, with their different definitions of engagement, have an exchange of views on Twitter, it will appear to each that the person they are arguing with is advocating either disengagement or baby-sitting.

I had my own experience of this recently. I, in all innocence, used the word ‘delivery’ in the context of teaching. To my mind this is a perfectly harmless word. If pushed, I’d maybe link it to how Santa delivers eagerly awaited presents to excited children. (Yeah, all my lessons were that great…) But apparently ‘delivery’ is some sort of evil trigger word, with connotations of ramming pipes into the throats of geese and pumping them full of grain in order to make foie gras.  It signalled my moral depravity. Yet I was just using it as a synonym for ‘teach.’

Each brings their own prior learning to the table. One person remembers their own terrible school experience of dull, dreary, soul crushing lessons that all but extinguished any desire to learn and is motivated by their resultant burning conviction to ensure their lessons breathe light, life and passionate interest into those who experience them.  The other cringes as they remember their earlier exhausting attempts to dupe pupils by disguising the learning with some complicated, ‘fun’ activity and subsequent relief when later on, they just started teaching directly and found that a much better way of arousing a passionate interest. No wonder they don’t agree. Both are fighting an enemy the other cannot see.[1]

This morning, I read this very interesting interview with Dylan Wiliam.  One of the bits that really got me thinking was this.

‘any teaching should start from what the learner already knows…teachers should ascertain this, and teach accordingly. The problem is that even with a new and unfamiliar topic, after 20 minutes teaching, students will have different understandings of the material, which the teacher needs to know about. What you call the curse of knowledge is part of that—we assume something is easier if we know it.’

Which resonated with something I’d read earlier this week by Harry Fletcher-Wood about how, when we teach adults, we often forget that they too, just like children, build on what they already know and sometimes form misunderstandings because they connect the new stuff we are teaching to their prior knowledge in a faulty way. [2]

It seems rather pompous to say that when we tweet, we are seeking to ‘teach.’ Yet I certainly engage with Twitter in order to learn, and indeed I have learnt an awful lot. And yes, sometimes I hope what I say will help other people learn. I am sure there are many reasons to go on Twitter: to be entertained, for banter, for company, to show off, for that self-righteous thrill of point scoring, but also to be inspired, to learn things and to share what you have learnt with others, hoping that some may find it useful; it’s that last reason that motivates me to write blogs.

As a teaching medium, Twitter has its drawbacks! A strict character limit, the torrent of ‘interruptions’ to one’s line of argument when all and sundry can comment, the conversation lurching off in bizarre directions, massive disputes about who actually, if anyone, is the teacher, who is worth listening to, the distraction of other threads, for you as well as anyone else… actually, when I think about it, it’s amazing I’ve learnt anything from Twitter, yet it has been incredibly powerful in my own learning.  Mainly because Twitter points me to blogs, which have fewer of the shortcomings listed above and where I can learn from people I have chosen, by my act of clicking, to learn from.[3] Yet many of the people I engage with don’t have blogs, and obviously not everybody I interact with reads my blogs (the fools!)

All of which is a long prelude to what comes next. The more I argue over the same ground again and again, the more I am aware that I am being misunderstood. I am saying words which to me have a clear and obvious meaning, yet they’re being taken to mean something quite different. I also know this works the other way around too, and I completely misunderstand what other people are saying. For example, someone says ‘child-centred’ and, building on my prior experience, I imagine the crazy excesses of having to teach via an integrated day and how much more everybody learnt when I stopped doing that and actually taught children stuff, when they might mean something quite different. (Or they might not; that’s the problem, it’s hard to tell). It’s so easy when arguing with someone to imagine the worst possible version of what they are espousing, and the best possible version of what you are arguing for. If they do the same, that’s a recipe for more heat than light. The hidden lives of our prior experiences make mutual misunderstandings inevitable.  Human nature exacerbates the problem, with people falling into ‘in-crowds’ and ‘out-crowds’, retweeting and subtweeting and eye rolling and argument by GIF. I believe some people even DM catty messages to each other about third parties.

However, today the better side of my nature is on the bridge, and what follows is simply an attempt to genuinely help people understand what those of us who advocate ‘traditional’ teaching mean. Or possibly clear up what we don’t mean. Probably, the social media bubble being what it is, I am preaching to the choir, but hey ho.

First of all, that word ‘traditional.’ The scope for misunderstanding this word (and its rival ‘progressive’) is immense. There was a great blog this week about just this problem. If people had a bad experience of very formal education, the ‘traditional’ tag is like a red rag to a bull. But take heed: the word does not imply that everything back in the olden days was rosy and we should bring back the cane, sit four year olds in rows and bore children into a stupor. It’s used in contrast with the term ‘progressive.’  Progressive sounds so lovely – who doesn’t want to be progressive? But here, ‘progressive’ does not mean ‘the opposite of regressive’ but is rather a description of a philosophical tradition in which the writings of such people as Rousseau, John Dewey and Jean Piaget are foundational. For a fuller description, see Greg Ashman here.  Summarising self-proclaimed progressive Alfie Kohn, Greg Ashman describes progressive education as one where students help to direct the curriculum and seek and find their own answers, where there is a focus on intrinsic motivation that eschews coercion, and where a distinction is drawn between knowledge and understanding in order to focus on the latter. Traditional education, by way of contrast, believes that teachers are experts in their subject and therefore should design the curriculum and teach it explicitly. Traditional teaching believes that there is a ‘tradition’ of knowledge that students are entitled to.[4]

The second misunderstanding is that when we speak of knowledge, we only mean acquiring facts.  That’s not the case at all. Knowledge can be divided into declarative knowledge and procedural knowledge. Declarative knowledge includes concepts and rules as well as facts and will allow us ‘to recognize things, make judgments, classify things, discriminate differences and identify similarities.’[5] Procedural knowledge is knowledge that produces action; it enables us to do stuff. It is goal directed, whereas declarative knowledge (the kind of knowledge that includes, but is not limited to, facts) just sits there waiting to be of service. It doesn’t of itself result in any action. Procedural knowledge actually enables us to do things. Most obviously, it involves motor behaviour; learning to play the guitar or catch a ball are both forms of procedural knowledge. To throw a ball, you don’t also need to know the physics behind it; you don’t need factual knowledge of how it all works. You need to build motor memory, and that is a form of knowledge learnt through paying attention and repeated practice. But procedural knowledge isn’t only about muscle movement; it also enables us to put declarative knowledge to use. Solving an equation or balancing a chemical equation both involve turning declarative understanding into procedural knowledge. There is so much more to knowledge than just facts (as useful as they are).

People sometimes refer to the ability to do such things as acquiring skills rather than procedural knowledge. The word ‘skills’, however, is particularly problematic, as it is used to mean several different things. For example, it is used to describe dispositions such as resilience and behaviours such as collaboration; it is also used to describe things such as inference and problem solving, which traditionalists are more likely to see as different kinds of disciplinary knowledge[6]; and then again it is used for procedural knowledge.

The third misunderstanding is to think that traditionalist teachers are only interested in knowledge rather than understanding. Again, this is quite wrong. What traditionalist teachers assert is that it is impossible to understand something unless you know something about it. This is because understanding, properly understood, is simply having lots of well organised knowledge that is connected together. I wrote about this recently in my previous blog, so I won’t repeat it here.  Understanding is literally made out of knowledge. So it is possible to know something without understanding it, but it is not possible to understand something without knowing it. This diagram by Efrat Furst explains it well.

[Diagram: Efrat Furst’s model of understanding]

The fourth common misunderstanding is to think that, therefore, traditionalists believe lessons should be formal lectures where the lecturer does 99% of the talking, the learner’s role in the process inherently passive. This is not the case. The kind of explicit instruction most traditionalists favour – and like anything, it is a broad church, so there will be differences in emphasis – is highly interactive.  It will involve questioning not just the few eager clever clogs but everybody present, through strategies such as ‘cold calling’, using mini whiteboards, and individual exercises interspersed with the teacher’s explanation. It might involve discussing things in pairs or even doing a short bit of drama. In a maths lesson, it might well involve children using manipulatives. What is more, it does not preclude ways of sharing information other than a teacher talking (though this will be the most common way). A video clip might be used if it is a more effective way of explaining something – for example, an animation of the heart beating might well be a better way of explaining the role of the heart and lungs in the circulatory system than a verbal description alone. Who wouldn’t show the clip of Commander David Scott dropping a feather and a hammer at the same time on the moon to show that without air resistance, objects fall at the same rate?

Traditionalist teachers also follow up the explicit teaching phase with ‘shed loads of practice’ – gloriously shortened to SLOP. If learning is to stick, long term, it needs to be practised over and over, to the point where it becomes automatic and can be recalled without conscious effort. Some of this practice might even involve tightly planned opportunities to ‘discover’ aspects of what is being taught; for example, variation theory can be used to devise the kind of deliberate practice that helps learners notice patterns, similarities and differences. However, there would always be some sort of teacher commentary at some point to draw attention to things that might not have been noticed.

The final misunderstanding I am going to cover is the erroneous belief that traditionalist teachers adhere to a crude ‘transmission’ model and don’t realise that learners build – or construct – their learning on what they already know. Of course traditionalist teachers believe they have a tradition worth sharing (or transmitting, though the word lacks nuance).  But while traditionalist teachers eschew constructivist teaching, they are well aware of constructivist learning theories. Learners build on what they already know and construct meaning out of the connections formed between their established and new learning. See the point about understanding above. Efrat Furst explains this in more detail here.

In fact, I’d go so far as to say that traditionalist teachers are more concerned about prior learning than progressive teachers, because of the pivotal role prior learning plays in whether we understand or misunderstand something. Misunderstanding occurs when we connect bits of knowledge together in the wrong way. For example, if we know that addition can be done in any order, so 2+3=5 and 3+2=5, then when we learn about subtraction we may think that this too can be done in any order, and so think that 5-2=3 and 2-5=3. We’ve made a false connection between our new knowledge and our existing knowledge.
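That false connection can be made concrete with a quick check. This little Python sketch (purely an illustration, not anything from the original post) shows why the ‘any order’ rule transfers from addition but breaks for subtraction:

```python
# Addition is commutative: swapping the operands never changes the result.
a, b = 2, 3
print(a + b == b + a)  # True: 2 + 3 and 3 + 2 both equal 5

# Subtraction is not commutative: the 'any order' rule fails to transfer.
a, b = 5, 2
print(a - b)           # 3
print(b - a)           # -3, not 3: the misconception assumes these match
print(a - b == b - a)  # False
```

The child’s error is precisely assuming the last line would print `True`.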

This is why questioning and other forms of formative assessment are so important: to try and ascertain that the right connections are being made, and to address misconceptions as soon as possible, before they become too established.  It is also why traditionalist teachers use explicit teaching that breaks knowledge down into very small sub-steps, to minimise the risk that wrong connections will be made. Traditionalist teachers are acutely aware of the ‘curse of knowledge’: the difficulty ‘experts’ have in realising how complicated something really is, and therefore overwhelming students’ working memories by trying to teach too much at once. Indeed, in what is probably the ‘purest’ form of traditionalist teaching, the Direct Instruction method designed by Engelmann, concepts are meticulously broken down into minute sub-steps, carefully explained and regularly practised, so that the new learning has the best possible opportunity to connect to prior knowledge in the right way.

Aren’t labels funny? My husband swears he is a social constructivist, yet he does all this stuff. Maybe when we label ourselves or others, it is more about the group of people we want to belong to (or not belong to), more about the kind of people we believe ourselves to be.

[1] I’m not saying that there are not also honest to goodness, downright disagreements where people understand perfectly well what the other is saying. Such disagreements are (I think) often disagreements about what each other values. Possibly.

[2] Well that’s not quite what Harry said, but it’s how I connected it to what I already knew!

[3] Though I also click on people I am pretty sure I will disagree with too. Sometimes even with an open(ish) mind!

[4] Thanks to Andrew Old for this way of explaining it

[5] From The Unified Learning Model, Shell, D. et al., chapter 4

[6] With each discipline determining what is meant by the term, so terms mean different things in different subjects. ‘Explanation’ in history has a different meaning from ‘explanation’ in science, for example. Even something apparently straightforward like ‘observation’ means something different depending on whether you are in an art lesson or a science lesson. What is worth observing will differ.


In praise of a prosaic curriculum

In case you’ve been living in an underground bunker for the last year or two, the curriculum is now a thing. More than that, it’s the next big thing in education. The new framework for inspection arriving in our schools from September 2019 will have the quality of the curriculum firmly in its sights. Intent, implementation and impact are set to become our new lodestar, possibly (fingers crossed) eclipsing SATs and GCSE results at the centre of the school solar system.  Amanda Spielman has made it quite clear that mistaking ‘badges and stickers’ [the performance-table success that comes with having great data] for learning and the substance of education is a mistake, and one, presumably, that the new framework will seek to overcome. I do not underestimate the massive culture shift involved, for both inspectors and schools. It’s nothing short of a revolution. (I wrote a blog about this a year ago; in terms of what is actually happening in schools, I don’t think much headway has been made in rearranging our priorities.)

Inevitably, in the build-up to the new framework, various people, myself included, are offering training to schools about how to develop your curriculum. I’m seriously worried about some of the examples on offer.  For example, I heard about one school advertising that its amazing curriculum now had 60% of learning outside (and they weren’t talking about Early Years). Why is doing 60% of learning outside of itself a good thing, any more than doing 60% of learning indoors is a priori a good thing? Surely you decide this on a case-by-case basis? Pond dipping? Orienteering? That’ll be outside. How to use semicolons? Writing a paragraph about the Roman invasion of Britain? Probably inside. Learning does not become more durable or transferable according to its location, though I would have thought for most learning having a regulated temperature and protection from the elements and flying insects, not to mention good acoustics, the facility to model ideas on a board of some sort and a surface upon which to write, are all quite useful features.

I’ve heard of schools where ‘hooking’ the children into learning involved spending a whole day dressing up as Romans, eating Roman food and so on. A whole day! Or another school where, to build empathy with the homeless, they spend the day (and possibly the night) camping on a field, when reading and discussing Way Home would have achieved the same objective in 30 minutes. Personally I think the best hook is a teacher saying animatedly ‘You are going to love learning about the Romans!’ and then teaching it with passion, but if a school wants to spend 10 minutes on some sort of hook, so be it. But burning through a whole day of precious curriculum time is just profligate! As if fitting everything into the limited time we have available were easy!

The mot du jour seems to be ‘exciting.’  All around me, schools are proclaiming how exciting their revised curriculum now is, as if ‘excitement’ were the substance of education. Alongside ‘exciting’, I also often hear that curriculums are ‘innovative’ and ‘engaging.’  Superficially, these sound like persuasive descriptions of great learning, especially if you contrast excitement with boredom, innovation with stagnation and engagement with distraction.[1] The problem with ‘excitement’ though, is that it’s not a great way to ensure the kind of learning that is durable in the longer term, or that transfers from one context to another. In other words, it might be fun at the time, but it is less likely to result in long term learning. In exciting lessons, you run the risk of remembering the excitement, rather than the learning. I’ve written before about the difference between episodic and semantic memory.  Let me recap the essential differences between the two.

Episodic memory is where we store the ‘episodes’ of our life, the narrative of our days. This is the autobiographical part of our memory that remembers the times, places and emotions that occur during events and experiences.  We don’t have to work hard or particularly concentrate to acquire episodic memories, they just happen whether we like it or not. When we talk about having fond memories or an event being memorable or exciting, we are talking about episodic memory. We are talking about something that happened, something where details of time, place and how we felt at the time are central.

Semantic memory, by contrast, is where we store information, facts, concepts.  These are stored ‘context-free’, that is, without the emotional and spatial/temporal context in which they were first acquired.  These types of memories take effort; we have to work to make them happen.  That might sound a bit boring, compared with episodic memory.  Yet it is our amazing ability to store culturally acquired learning in our semantic memory that makes us so successful as a species. Semantic memory is how we know stuff. Without it, human culture would not exist.

The problem with episodic memories is that while they may be acquired effortlessly, they come with several drawbacks in terms of acquiring skills and knowledge. Episodic memories come tagged with context. In the episodic memory, the sensory data – what a child saw and heard during a lesson – alongside their emotions, become part of the learning. These emotional and sensory cues are triggered when they try and retrieve an episodic memory. The problem being that sometimes they remember the contextual tags but not the actual learning. Episodic memory is so tied up with context it is no good for remembering things once that context is no longer present. Because it is context-bound, it does not transfer well to different contexts. Luckily our brains also have semantic memory. Semantic memories have been liberated from the emotional and spatial/temporal context in which they were first acquired. And once a concept has been stored in the semantic memory, then it is more flexible and transferable between different contexts.

At this point, it is usual for people to say ‘but what about understanding? There’s no point having a load of facts if you don’t understand them.’  Which is of course true.  However, understanding happens in the semantic memory! Understanding is the word we use for when we have a well-developed schema for something – in other words, understanding is what happens when we have lots of well organised, connected knowledge, as opposed to a handful of unconnected facts (or no facts). It’s the connections between facts that constitute understanding. When we misunderstand something, that is because we have made the wrong connections. For example, we might have connected how the concept of value works in natural (counting) numbers with how value works in rational numbers such as fractions, and therefore think the bigger the denominator, the bigger the value of the fraction.  When we don’t understand something (as opposed to misunderstanding it), that is because we have not made enough connections yet. If we only know one or two facts about something, understanding is hard because the potential to make connections is so limited. Our two lonely facts may seem a bit meaningless. If however we know hundreds of different facts about a topic, that changes the nature of our thinking; we can now weave a rich web of understanding because there are so many connections that can be made.  Because of the wealth of connections, we can think deeply and creatively. Jo Facer has written an excellent blog expanding on this here.
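The denominator misconception can likewise be made concrete. This small Python sketch, using the standard library’s fractions module purely as an illustration (it is not from the original post), shows that the ‘bigger number means bigger value’ rule from counting numbers fails for fractions:

```python
from fractions import Fraction

# The misconception: over-generalising from counting numbers,
# a child assumes a bigger denominator means a bigger fraction.
third = Fraction(1, 3)
fifth = Fraction(1, 5)

print(fifth > third)  # False: 1/5 is smaller than 1/3
print(third > fifth)  # True: the larger denominator gives the SMALLER value
```

With counting numbers, 5 > 3; with unit fractions, the relationship inverts, which is exactly the wrong connection being described.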

This is why progression in a subject necessarily involves acquiring more knowledge. As more knowledge is acquired, more links are made; thinking is structured differently so more nuanced application is possible. Schools should be wary of curriculum packages that describe progression in terms of levels of learning (e.g. basic, advancing and deep) and spout nonsense such as progression involving ‘changing the nature of thinking rather than just acquiring new knowledge.’

Because understanding is literally made out of knowledge, it is possible to know something without understanding it, but you can’t understand it without knowing it. The onus on the teacher, then, is to carefully share their own schema step by step, explicitly describing/explaining/modelling the links and being alert for misconceptions. Indeed, assessment for learning – or responsive teaching, as it is more appropriately called – involves checking for missing knowledge and misconceptions (wrongly connected knowledge) and remedying them when found.

This being the case, I would argue that the main substance of education – the backbone of it, so to speak – is building strong semantic memory: the passing on and further development of the knowledge built up over centuries to the next generation; how to read and write, how stories work, how to use mathematical reasoning to solve problems, science with its amazing power to predict the future, how people in different times and places are so different and yet so similar, and the myriad other concepts, ideas and practices.  We want children to understand concepts and facts rather than just remember events and experiences. Alongside this, we should also be building procedural memory (the memory of how to perform physical tasks and skills such as handwriting, riding a bike, or playing the piano).  Honing these skills – or procedural knowledge – comes down to regular practice, not to exciting, innovative experiences.

This building of semantic and procedural memory sounds terribly prosaic. What about critical thinking, problem solving and creativity, I hear you ask? Why aren’t we teaching them? Surely this is what education is for – not just knowing stuff?

Again, I’d agree. Helping children grow into people who can think critically for themselves, who can solve problems and be creative is the ultimate goal of education.  However, we should not confuse the ends with the means. If we want children to be able to think critically and solve problems, then they need something to think critically with. For this they need knowledge, and the kind of flexible knowledge that is durable and transfers between contexts. This necessarily involves using semantic memories stored in the long-term memory. If we want children to be creative and innovative, they need knowledge of the tradition upon which they are going to innovate.  You can’t really teach critical thinking as a detached skill; what you can do is teach various metacognitive strategies such as ‘consider both sides of an issue.’  Of course, this only helps if your students know what both sides are, so these metacognitive strategies need to be taught and applied within a specific context. In other words, teach someone about something and then give them opportunities to think critically about it. Don’t start off a programme of study with critical thinking or problem solving. Lay the groundwork first, carefully and systematically building the requisite knowledge so that students can then apply their knowledge, using it to solve problems, possibly generating creative, novel solutions.

This building of semantic and procedural memory is not the only purpose of education, of course.  If it’s the backbone, then it will need further fleshing out. It’s just as important that we educate children to be emotionally literate and morally responsible, and that will involve thinking about the kind of episodic memories we try to build for our children. We want some memories to come tagged with emotion. If we treat our children with kindness and respect, they will have episodic memories of what it was like to be treated kindly and respectfully, which makes it more likely that they will treat others with kindness and respect in turn.  If we want them to feel compassion for others, we will treat them with compassion.

Building of episodic memory is important for other reasons too.  Episodic memory encodes memories of our experiences, whatever they are. Another key purpose of education is to broaden the range of these everyday experiences so that we are lifted up out of our familiar and parochial context and gain the kind of perspective that comes from encountering new, different situations. Some children have very narrow life experiences, as this thoughtful blog by Debra Kidd testifies. Here’s an extract from it:

Hywel Roberts tells a story in his wonderful keynotes about teaching in a school in Sheffield. The class are looking at town planning and urban developments, so as a way in, he asks them what they might find in a great city – if the city of Sheffield were to be redeveloped, what would they put there? One by one, the children list things the city should have – a Greggs, a BP Garage, a hairdressers called Streakers…they are describing their walk to school. For many of the children, their only experience of the city they live in is the walk to and from school. For those children and others like them, getting on a coach and going to a museum is about far, far more than remembering aspects of the curriculum. It can be literally life changing.

It is not only the most disadvantaged children who could benefit from experiencing wider horizons.  Plenty of children, particularly if they live in a city, might never have climbed a mountain or seen the sea, or even been to the local park. Rural children might never have been to a big city, let alone looked down on a cityscape from the top of a skyscraper or cathedral or castle turret.  Swathes of children might never have visited an art gallery or heard classical music or music from a different culture or been to the theatre or a nature park or on a train or in a boat or even gone away on holiday, beyond visiting relatives.  This is why schools such as Hartford Manor have a curriculum pledge that builds a range of experiences – dare I say, exciting experiences – into their curriculum as an entitlement. See this blog by Loic Menzies which articulates how life enhancing he found the rich opportunities he had for outdoor adventure as a child and why he believes all children should have such opportunities.

Other, superficially more advantaged children may also have limited life experiences. They might never have properly encountered people who live in different socio-economic circumstances, for example, and so have no idea how challenging it is to live a life of grinding poverty. Or they might live within a monoculture, never meeting people from different cultures or traditions. With adults increasingly living in narrow social media bubbles, rarely encountering people who think differently from them, it is all the more important that education broadens all children’s horizons and enriches all communities. In other words, curriculums need to be planned to foster spiritual, moral, social and cultural development as well as knowledge acquisition.  This will require attending to both episodic and semantic memory formation.  A carefully planned programme of experiences that complements and reinforces the superabundance of SMSC inherent in a rounded study of history, literature, art, music, RE, geography, MFL, maths, science, PE, PSHE and so on should provide this.

So if I’m all in favour of building procedural learning, opportunities to apply knowledge in critical thinking and with creativity, to emotional literacy and moral responsibility, and experiences that broaden horizons as well as education that builds semantic memory, what exactly is my problem and why am I banging on about a prosaic curriculum?[2]

This is because there is a world of difference between planning a set of experiences that consciously addresses the specific kind of narrowness that a school’s particular context creates, and believing that excitement per se is a good enough reason for inclusion on the curriculum.  Providing experiences just because children might find them exciting and enjoyable is not a great reason to allocate them precious curriculum time. (Which is not to say they can never happen, just that they should be the rare exception rather than the rule).  Nor is this to say lessons must be dull and uninspiring. That’s just as bad.  There is a middle ground between a curriculum that panders to a craving for ever more excitement and is preoccupied with novelty and gimmicks, and a dismally boring, dry-as-dust snooze-fest: something solid and prosaic, something with enough cognitive challenge to be absorbing. The engagement comes from the subject matter itself and the satisfaction one feels after a bit of struggle.  Easy success isn’t rewarding; earned success is motivating.

When I use the word prosaic, I am not using it to mean dull and boring, but to mean ordinary, everyday, usual, familiar, regular, customary, typical, bread-and-butter – stuff that isn’t ‘sexy’ or glamorous or flashy, but that forms the bedrock of what we do in schools. Some of this is the stuff that forms the foundation upon which more interesting stuff depends. Learning to read, to add and subtract, learning number facts and times tables, to use punctuation and spell correctly, handwriting; basic, humdrum everyday stuff, no bells and whistles, the stuff of learning, this should be at the heart of our curriculums because without it, nothing else is possible. Sometimes derided and sneered at, often looked down upon, let’s hear a cheer for the workaday workhorse of education.

I sometimes see on Twitter teachers moaning that phonics is tedious and the decodable early readers are boring. Which is to completely miss the point – they aren’t intended to be great works of literature, they are intended to teach children to read (so that they can go on to read great works of literature).  Teaching phonics is as tedious as you want it to be. Young children love learning to do ‘grown up’ things like reading, and if you show how excited and impressed you are that children can now blend p-i-n, then they will be excited and impressed with themselves too. That’s where the engagement comes in, with the success. The lesson isn’t meant to entertain the teacher after all, it’s meant to develop the child, and some of the things that help a child develop, especially early on, are pretty prosaic. The sentence ‘Clap, clap, clap on the big, red bus,’ is not, of itself, desperately interesting. However, being able to turn all those squiggles into actual words that make up a sentence is amazing! Criticising a decodable reader because of its limited storyline is like telling a babbling baby their conversation is boring. To the ‘phonics is boring’ brigade I say, ‘stop raining on the children’s parade!’ If you are not delighted and enthused by helping young children take their first steps on the reading journey, then you are in the wrong job. (Or wrong phase, perhaps).

Then there is also the content, the knowledge, the substance that we want to teach; knowing where countries are on a map of the world, or what a force is, or what the industrial revolution was, or what the 5 pillars are, or what irrigation means. Content that is taught and practised and revisited so that the learning is durable, so that it can be transferred to different contexts and used in critical thinking. If we want children to think critically about arguments around immigration, it helps to know where different countries are in relation to one another. If we want children to think critically about the engineering challenges inherent in a mission to Mars, it helps to know about force and gravity. If we want children to think critically about the advantages and disadvantages of industrialisation for an economically developing country, knowledge of the industrial revolution will provide a useful way in. If Britain developed in the 19th century by exploiting our resources and workers, are we right to condemn other countries for doing the same now in the 21st century? If we want to have an informed understanding of Islam rather than one tainted with ill-informed Islamophobic histrionics, understanding the importance of the 5 pillars for Muslims is a necessary but not sufficient starting point. If we want children to think responsibly about natural resources, knowing about water use and the benefits and pitfalls of irrigation is vital.  The knowledge we teach forms the “teeth” in the gears of understanding. Without knowledge, understanding cannot gain any traction.  Such content is inherently interesting in the hands of a skilful teacher; it does not need sugar-coating with gimmicks in order to make it palatable.

There is a misconception that this entails a ‘lecture’ form of lesson, with children meekly listening for long periods of time to their teacher. This is not at all what I am advocating. Rather, I am suggesting (courtesy of Greg Ashman) that the majority of lessons have four main features:

  • They are planned and led by the teacher, who makes conscious choices about the sequence of learning.
  • The content is broken down into small steps, with children learning how to do each individual step well before the steps are brought together into tasks that require the sub-steps to be integrated all at once.
  • Concepts are fully explained – children do not have to ‘discover’ it all by themselves.
  • Teaching is highly interactive, with everybody required to participate throughout the lesson.[3]

I’d also add that such lessons frequently revisit previous learning with regular retrieval practice, so that memory of previously taught content is strengthened. Such lessons are usually calm rather than dull or whacky.

Towards the end of a sequence of such lessons, I’d advocate opportunities to apply what has been learnt.  At this stage, the child integrates the sub steps in some way with less explicit teacher direction. There are a myriad of ways this could be done, from writing an essay to goal free problem solving to pursuing one’s own line of inquiry to doing a test to making a model or creating some art work.[4]

However, anything can be done to death.   Having a template lesson structure is one thing, clinging to it come hell or high water is another. Occasionally mixing up the structure – for example – using a Mantle of the Expert approach once in a while or doing some sort of whizzy experiment or workshop provides variety and counterpoint.   Just don’t confuse this with the prosaic core.




[1] Though you could also contrast over-excitement with calmness, gimmicky innovation with tried and tested methods and superficial engagement with the medium of the lesson as opposed to focused absorption on the core content.

[2] Yes, it’s beginning to sound a bit like the ‘what have the Romans ever done for us’ sketch from the life of Brian.

[3] This list comes from chapter 5 of Greg Ashman’s excellent new book, The Truth About Teaching, Sage Publications, 2018.

[4] Though I’d be wary if the final project ate into too much curriculum time. I’d go for an 80:20 or 90:10 balance. So making a claymation video about embalming mummies would not be a good use of time, unless you believed learning how to do claymation was in itself an important part of the art/computing curriculum. Great for an after-school club though.

In praise of a prosaic curriculum

Responsive teaching, responsive leadership

Based on my presentation at the Medway Network of the Chartered College of Teaching

Inaugural Conference on Culture, Wellbeing, Workload.

Over the last couple of years, I’ve made a concerted effort to question how we do things. Just because something is the accepted way of doing things, doesn’t mean it is the most efficient or effective. The two main things we’ve really changed are how we mark and how we observe lessons.  Reading the research around marking and feedback and reading various blogs quickly convinced me that marking was a very inefficient use of staff time and had limited impact on learning. Similarly, having termly graded lesson observations was an inefficient use of SLT time and had limited impact on improving teaching or learning. Instead of marking, we now have responsive teaching; instead of termly observations we now have responsive leadership.

Kluger and DeNisi’s research into written feedback found that students often learn less when teachers provide written feedback than they do when the teacher writes nothing.[1] 38% of feedback interventions made learning worse! Yet the marking mania epidemic still has teachers throughout the land double or triple marking work, with teachers, or maybe just their senior leaders, convinced that this is a touchstone of good practice. Assessment for learning has become a ritualised performance believed to magically invoke learning, as long as the prescribed coloured pens, fans, gestures or whatever are used in the liturgically correct way.  No wonder Dylan Wiliam declared he wished he had called formative assessment ‘responsive teaching’ instead.

So what is responsive teaching?  It’s really very straightforward. It’s

  • Looking at pupils’ work, either within the lesson or after it, and responding to what you find out.
  • Gathering feedback for the teacher, about the pupil
  • Using this feedback to decide what to teach next

Shorn of its ritualistic associations, we can embrace a simple definition of feedback as ‘information about reactions and/or performance which is used as a basis for improvement.’ So scanning the classroom to see what pupils’ facial expressions are telling us is one commonplace – and very useful – way of gaining feedback. If half the class look clueless, there is probably a problem that needs urgent responding to!

When we look at pupils’ work after a lesson to gather feedback about what to do next, we are looking for the bottleneck: the main thing holding pupils back in their learning. We then teach that, regardless of whether or not it is on the scheme of work for that year group. If many pupils still can’t use full stops and capital letters correctly even in year 3/5/10, there’s no point in moving on to fronted adverbials or whatever, despite what the official plan says. Fix that first. It’s not going to fix itself. Instead of writing in pupils’ books what their next step is, the next step is…the next lesson. Respond to what pupils don’t yet know by teaching them.

Tom Sherrington outlined 5 ways teachers might respond to feedback. I’ve played around with this a bit and come up with my own version of 6 ways, more geared to a primary context (though I don’t see why these wouldn’t work with secondary pupils too).

1. Reteach – they don’t understand this. I need to reteach with different examples

Sometimes you look at books after a lesson and realise that an awful lot of pupils have got the complete wrong end of the stick. So instead of pressing on with the next lesson in the series, you go back and teach whatever it was again, only better.

2. Revise – they know something about this but we need to go over it again because otherwise they will forget it

Often you pick up either within the lesson itself or afterwards when looking at pupils’ work that pupils have begun to learn whatever it is you are teaching them. However, you don’t have enough evidence yet that this learning is really secure and won’t be immediately forgotten the moment you move onto something new. Better consolidate the learning by going over it again. Resist the siren voices telling you that if pupils can do something a couple of times they need to move on to new learning. That’s just madness. Vaguely understanding something is not the same as really knowing it. Progress does not always involve learning new stuff. Often it involves learning ‘old’ stuff more securely.

3. Redraft – they can do this better. I need to model how to improve it.

This sort of response is typically used when responding to longer writing tasks. But expecting children to be able to make things better without showing them how is pointless.  It’s a bit like rushing to do your photocopying only to find the ‘paper jam’ and ‘change toner’ lights flashing at you. The lights are giving you, the learner, very specific feedback, but unless you already know how to clear a paper jam, where the toner is stored and how on earth you change it, it is not going to be any use. In fact, it will probably just wind you up. Dylan Wiliam mentions some written feedback that told a pupil to ‘be more systematic in your scientific enquiries.’ To which the pupil’s response was, ‘if I knew how to be more systematic, I would have done it the first time.’

What we now do instead of marking longer writing tasks is to devote a whole lesson to whole-class feedback. During this, the teacher showcases small extracts – possibly just a sentence or two – where various pupils have done a particular thing very well.  So for example, they might share some well-punctuated speech, then some excellent description, a great use of rhetorical questioning and the deft use of a range of sentence lengths to build suspense. Immediately after a good example has been shared, pupils may (or may not) work together on improving fictional examples where that particular thing is lacking. The teacher will then go on to share (anonymously of course) some examples – probably slightly tweaked – of brief extracts that could do with improving in some way. Usually teachers type up these sentences rather than use a visualiser, because first of all it’s easier to read and secondly it gives the teacher the chance to correct all but the actual mistake s/he wants pupils to focus on. Otherwise, pupils might fixate on incorrect spelling when you want them to focus on mixing up tenses, for example. Pupils then get short examples with similar errors to practise improving (usually in pairs). Finally, after this sustained quality modelling and practice, the children redraft their own work.[2]

This works really well. The children love their feedback lessons. They make great progress. And teacher workload has been cut by at least two thirds when compared to marking, even factoring in the need to plan a feedback lesson. Whereas marking one set of books used to take three hours, the whole process now takes about an hour. And it’s more effective.

4. Practice – they can do this but it is not yet automatic

This is different from revising (see point 2 above).  This is about practising things we know how to do but have not yet learnt to automaticity. So this might include being able to use a standard algorithm to do vertical subtraction, for example, or times tables, or number bonds, or converting measures from one unit to another, or revising key vocabulary, or handwriting, or forming a stroke correctly when swimming.

5. Check – I need more information before I am convinced they really have this securely

This is when you want feedback about if pupils really know something securely, at some remove from the initial teaching. So it usually involves giving children questions to do at least a month after that original teaching has taken place. Can they still do fractions now we’ve moved on to area? Can they remember anything about the Romans now we are learning about the Vikings? And if not, what am I going to do about it? (If you are not going to do anything about it regardless, there is no point in checking, unless you like making yourself depressed).

And finally

6. Move on to something new

Sometimes feedback tells you good news. They’ve got it! We can move on to something new.

So that’s a whistle-stop tour through responsive teaching.  Harry Fletcher-Wood has just published a book called ‘Responsive Teaching: Cognitive Science and Formative Assessment in Practice’ that I would strongly recommend if you want to look at this in a lot more detail. But what does this have to do with the second part of my title? What has all this got to do with responsive leadership? Well it turns out that responsive leadership is a lot like responsive teaching. Like marking, doing termly high stakes lesson observations can actually make things worse, rather than better. The problem with this kind of lesson observation is that it leads to teachers showcasing compliance, rather than their warts-and-all, everyday practice. Instead, you observe lots of all-singing, all-dancing lessons quite unlike everyday practice that are therefore useless for helping people talk about how they might further improve their teaching. What a monumental waste of time!

Responsive leadership is

  • Looking at teachers’ work, either within the lesson or after it,[3] and responding to what you find out.
  • Gathering feedback for the leader, about teaching
  • Using this feedback to decide what professional development is most appropriate

In the context of lesson observations, feedback used to be something potentially ominous delivered to the observed teacher, some sort of label denoting their professional worth.  This is nonsense. In the same way that feedback about pupils should feed forward into planning future lessons, feedback about teaching tells leaders what should feed forward into planning future professional development. In the same way teachers need to think about what is the bottleneck holding children back, leaders need to reflect upon what it is that is stopping a teacher from teaching as well as possible, and then plan a course of action – usually in partnership with the teacher –  to help them improve.

And guess what, here are 6 possible ways to respond to what you’ve learnt from gathering feedback.

  1. Reteach

You know that feeling when you’ve delivered some inset on the latest whole school initiative and then you go and observe lessons and realise that almost everybody has either got the wrong end of the stick, or isn’t implementing the initiative at all? (No? Maybe just me then). Anyway, should that happen, here are some things to reflect upon

  • Maybe I didn’t explain this initiative properly
  • So either they don’t ‘get’ it…
  • Or don’t understand why it is important
  • Or possibly both
  • So I need to explain it better and persuade people why it is important
  • Did I explain what they can now stop doing?
  • If anyone is to blame, it is me
  • (Or maybe it’s an unworkable initiative or a seriously bad idea)
  2. Revise

This time when you see how the initiative is working out in practice, you find out that some people who were doing this really well seemed to have forgotten all about it. So ask yourself…

  • I need to remind them about this, and why it is important, before it fades away, like so many previous initiatives that were quietly forgotten, rather than explicitly stopped. I need to emphasise that this one is important. I really mean it this time!
  • Ask why it is fading? Forgotten? Not useful? Tricky? Logistics?
  • Spend quality time going over it again in a staff  or departmental meeting
  • Pupils aren’t the only ones who need time to work on remembering things. Retrieval practice is useful for teachers as well as pupils!
  3. Redraft – they can sort of do this, but it could be better.
  • They’ve half got this
  • We need to provide modelling of how to do it better and supportively coach people as they learn how to do this
  4. Practice

There are some techniques in teaching, such as giving clear explanations, specific behaviour management techniques, and various ‘Teach Like a Champion’ techniques such as ‘no opt out’ ‘cold calling’ etc. that are highly amenable to improvement through deliberate practice. With specific, deliberate practice, things become habitual, and no longer need conscious effort to remember to do. Some schools devote the first 20 minutes of staff or departmental meetings to deliberately practising specific techniques.

  5. Check

We need to ask ourselves about key initiatives

  • Is this really embedded across the school 3 months later, 6 months later, a year later?
  • Is this as secure as I like to think it is?

And finally

  6. Move on to something new

A cherished initiative is now part of the lifeblood of the school. Everyone does it, and does it well. If we need to, we could now introduce something else, without worrying that this will fade away.  But we need to remember newcomers to the school who missed out on the concerted effort it took to get everybody to understand why this was important and how to do it well. If it took a year for staff to really get this, don’t expect new staff to get it after a 10-minute briefing.  When something is automatic, or obvious, remember those who join the school sometime after the initiative was introduced. It probably won’t be obvious to them.

In the same way that ‘You need to be more systematic in planning your scientific inquiries,’ is not terribly helpful to a pupil who doesn’t know what this actually involves, teachers who aren’t doing the sort of things we want them to be doing probably just don’t understand what we mean and need some modelling and coaching, opportunities to observe others alongside someone who can provide a commentary, and time to practise.

And instead of termly high stakes lesson observations, try using more frequent, low stakes, developmental observations that are all about genuinely helping staff get better at what they do, rather than finding fault. Our model is as follows:

  • Senior leaders do low stakes, 10 minute ‘drop ins’ most weeks
  • Subject leaders do lesson-long modelling, team teaching and coaching
  • We bring books to staff meetings a lot
  • Also look at books in SLT meetings
  • Staff meeting timetable flexible and responsive
  • We revisit previous inset a couple of months after (spaced practice!)

I’ve written a blog about our approach before for Third Space Learning.

When I started in headship, in somewhat inauspicious circumstances, someone told me to remember that your staff team is just like your class. You will have well behaved ones and challenging ones, people who learn things really easily and ones that need more support and so on. Using feedback to help children learn is not really very different from using feedback to help adults learn.  Don’t rely on ritualistic responses; seek evidence with an open mind, have a good think about it and plan the best way to respond.

[1] Kluger A, DeNisi A. The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin. 1996;119(2):254–84.

[2] The process is a little more complicated than I’ve explained, as for brevity’s sake I’ve left out the finer details. We usually divide the lesson into a proofreading part (changing those things that make our writing harder for someone to enjoy reading) and then a longer editing session (making our writing even more interesting for a reader). In KS1, we mainly do proofreading with a hint of editing. In year 1, in the main, pupils proofread fictional ‘work’ written by the teacher rather than their own.  We also give fictional work to pupils of any age who have something very specific to work on, that few other pupils need (or are ready for).

[3] By looking at work and/or by talking to pupils


Cognitive load: a case study

This is a shortened version of the talks I gave at ResearchED Durrington and ResearchED Rugby

When we are taught something, the information our teacher is sharing passes first into our working memory. The working memory is the place where we think.  What many teachers do not realise is that the capacity of the working memory is fixed and limited; as a result, it can only think about a very small number of things at a time.  Once the working memory is full, it can only take on more information by ‘dropping’ something, in the same way that you might be able to juggle with two balls easily enough, but add a third into the mix and everything would go pear shaped. The technical term in cognitive science for ‘going pear shaped’ is cognitive overload. 

Fortunately, there is a work-around. Unlike the teeny-tiny working memory, the long-term memory is vast. I like to think of it a bit like the Room of Requirement in Harry Potter.  The long-term memory is the place where things go when we have thought hard about them. The great thing about this is, once something makes it to the long-term memory, we can bring that memory back into the working memory whenever we want to think about it. We can remember things. With things we have thought about over and over again, retrieval of memories can become completely effortless and automatic.  For example, you can read these words with minimal effort because reading for you has become automatic. This means you have cognitive capacity to spare in your working memory to think about what these words are actually saying. You don’t have to use any of your capacity trying to work out what the words say.

This cognitive architecture has implications for teachers. We will need to consider the cognitive load involved in what we are teaching  and be keenly aware of the limited nature of working memory. This means we will need to present information in really small steps. Another implication is that we will need to make sure that students have to think hard about what we want them to remember (rather than thinking hard about something else, like the format of the lesson).  A third implication is that because we want students to remember what we taught them, we will need to give them lots and lots of opportunities to retrieve what we have taught them from their long-term memories, as this will make the memories stronger.

Some things we learn form the building blocks of much of our later thinking so secure recall of these is vital. They must be practised over and over until they are so automatic, it is impossible to forget them. We need these tools to be available to us in our working memory whenever we want them, without any conscious effort. We don’t want to have to remember how to read before we can read anything  or have to resort to counting on our fingers in the middle of our maths GCSE. (For more about how we remember things, see here.)

However, we don’t always bear these implications in mind. For example, we don’t break things down into small enough steps because we are experts in the things we are teaching. Things seem easy to us precisely because various steps in the learning process have become so automated and unconscious that we don’t even recognise all the different things we are doing at once. Wieman called this ‘the curse of knowledge’[1].

I’m going to explore this using a case study approach. I’m going to explore how we learn to tell the time. However, since I am assuming that you probably can already tell the time using a conventional, analogue clock, I am going to teach you using a kind of clock I’m pretty sure most people who read this won’t be familiar with.   Please let me introduce the Fibonacci clock.

fib clock

The Fibonacci clock uses the Fibonacci sequence, rather than the more conventional numbers 1-12. To work out the Fibonacci sequence, start with 0 and 1, and add them together. This is equal to 1, which now forms the third number in our sequence of 0, 1, 1. To get the next number, add the last number in the sequence to the one before it. So the next number will be 2. The number after that will be 3, then 5 and so on. If you really want to get into the spirit of things, you might wish to pause and work out the next few numbers in the sequence for yourself. For ease of reference, I’ve put them here.[2]
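The rule ‘add the last number to the one before it’ is short enough to sketch in code. This is my own illustration, nothing to do with the clock’s makers:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 0, 1."""
    seq = [0, 1]
    while len(seq) < n:
        # Add the last number in the sequence to the one before it.
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

print(fibonacci(12))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```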

However, for the purposes of our clock, we only need the first 5 of these (the first 5 after zero, that is: 1, 1, 2, 3, 5). Another property of Fibonacci numbers is that if you draw squares whose sides equal the numbers in the Fibonacci sequence, you can arrange these squares into an ever-expanding spiral, known as the golden spiral or the Fibonacci spiral.

fib sprial numbers

For our purposes, we only want to look at the rectangle formed when 1,1,2,3,5 are placed together in this spiral formation. This rectangle will form our clock face.

fib 1 to 5

fib clock numbers

The panels on the face light up in different colours and the pattern of colours is what tells us the time. (The clock is only accurate to the nearest 5 minutes.) These are the rules for telling the time on a Fibonacci clock:

  • The hours are displayed using red and the minutes using green.
  • To work out the minutes just add up the green squares and multiply by 5

That seems simple enough, so let’s have a go (answers at the end as footnotes)


7 oclcok



6 30 1

That’s not so bad. The hours are quite straightforward. The minutes are a little bit more clunky to work out – worth remembering when we expect children to grasp that with the minute hand you also have to count in 5s.

However, it isn’t quite as straightforward as that. Here is the full set of rules.

  • The hours are displayed using red and the minutes using green.
  • When a square is used to display both the hours and minutes it turns blue.
  • So to work out the hours just add up the red and blue squares.
  • To work out the minutes just add up the green and blue squares and multiply by 5
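The full rule set can be written as a short sketch. This is my own illustration, assuming each panel is represented by one of the strings 'red', 'green', 'blue' or 'off' (the real clock has no such interface, of course):

```python
# Sides of the five squares on the clock face, in a fixed order.
PANEL_VALUES = [1, 1, 2, 3, 5]

def read_fibonacci_clock(colours):
    """Decode a list of five panel colours into (hours, minutes)."""
    # Hours: add up the red and blue squares.
    hours = sum(v for v, c in zip(PANEL_VALUES, colours) if c in ('red', 'blue'))
    # Minutes: add up the green and blue squares, then multiply by 5.
    minutes = 5 * sum(v for v, c in zip(PANEL_VALUES, colours) if c in ('green', 'blue'))
    return hours, minutes

# One way to show 6:30 - red 1 + blue 5 for the hours,
# green 1 + blue 5 = 6 for the minutes, and 6 x 5 = 30.
print(read_fibonacci_clock(['red', 'green', 'off', 'off', 'blue']))  # (6, 30)
```

Because several subsets of 1, 1, 2, 3, 5 share the same sum, more than one colouring decodes to the same time.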

Ok, let’s try telling the time now




6 30 2


6 30 3


There’s more than one way to display the same time on a Fibonacci clock.



925

I’m hoping that you are finding this a bit taxing. There’s a lot to think about and you are a good way off being able to ‘read’ the time in the same way you can read your watch without thinking.

Now let’s contrast the rules for telling the time on a Fibonacci clock with those for telling the time on an analogue clock.

fib rules

The analogue clock actually has more, and more complicated, rules. Yet we expect children to pick this up in a couple of three-week blocks in year 2 and year 4, and then wonder why so many of them can’t tell the time! Because there are 4 different rules that all need orchestrating simultaneously, the cognitive load is too high for many children, so learning fails. The ones that get it have probably already had a fair bit of practice at home, so some of the rules were already automated and didn’t need to be consciously worked through. This meant these pupils had more space left in their working memories to think about those rules that were new to them. So, extrapolating from telling the time: whenever children struggle with something, it is worth asking ourselves whether we have overwhelmed their working memory by underestimating how complex it is. More often than not, the answer will be yes.

If we really did teach the time using a Fibonacci clock, what would be an effective way to do it? We’d break it down into small steps, one rule at a time, practising each lots and lots before introducing the next rule. So we would start off just telling the time in hours, using red only. If we did this lots and lots, the children would start to benefit from what is known as the ‘chunking effect’. If we gave children plenty of time to practise each component separately, that step would become stored in the long-term memory as a ‘chunk’.

Have you ever tried to carry a large bundle of washing upstairs? First of all, you drop a sock. When you pick this up, you then drop some pants. Precariously balancing your pants on top of the pile causes yet more socks to cascade to the floor. Now consider the same load, packed into 5 carrier bags. You easily manage to climb the stairs without depositing underwear on the landing or hosiery in the corridor. A similar thing happens in our brains with chunking. The classic illustration of this effect is to ask someone to try to remember a sequence of letters or numbers. For example, look at this sequence for a few seconds (or even better, have somebody else read it to you), then look away and try to recall it.


T V S M T B B C N Q T I

Now try this sequence, which has exactly the same letters:

BBC ITV NQT SMT


British readers should find this much, much easier as the groups now form instantly recognisable chunks (for non-Brits, the BBC is the British Broadcasting Corporation, ITV is another TV channel, NQT stands for ‘newly qualified teacher’ and SMT stands for ‘senior management team’ – the leadership team in a school).

Each of these ‘chunks’ of meaning takes up only one slot in the working memory, so in the second example we only have to remember 4 things, not 12. We use chunking when we read a clock face. When we read a watch, we don’t count round in 5s, we automatically ‘read’ the time from the position of the hands. We can even do it when the numbers are missing!
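The arithmetic of chunking is trivial but worth making concrete: the same 12 letters collapse to 4 items once the groupings are meaningful. An illustrative Python sketch, assuming the BBC/ITV/NQT/SMT grouping above:

```python
# The same material, held two different ways.
letters = list("BBCITVNQTSMT")          # 12 separate items
chunks = ["BBC", "ITV", "NQT", "SMT"]   # 4 chunks, if each acronym is familiar

# Sanity check: the chunks really do contain exactly the same letters.
assert "".join(chunks) == "".join(letters)

print(len(letters))  # 12 slots of working memory needed
print(len(chunks))   # only 4, once each acronym is one unit of meaning
```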



In the same way that we no longer consciously sound out every letter when we read but can just ‘see’ what a word says, given sufficient practice children will be able to just ‘read’ a clock or watch. So now let’s practise reading our Fibonacci clock, sticking just to red for the moment. You may find you begin to recognise certain patterns if you do this a few times.




2oc 1


2oc 2




4 oc




5 oc 2



6oc 2


7oc 2






9oc 2







10 oc



11 oc





12 oc 2

When we were able to just read all these red clock faces automatically, we could move on to reading hours using a mixture of red and blue. When that was completely fluent we would concentrate on minutes, first of all just using green and when that was very secure, green and blue minutes. Eventually we would be in a position to put it all together.  This would take a lot of time and a lot of short but frequent practice.

If we translate this into how we teach children to tell the time using an analogue clock, it is little wonder that children find it so hard and teachers find it so frustrating to teach. We don’t break it down enough and we don’t do nearly enough practice once we’ve finished teaching the unit on time. In fact, it’s a miracle anyone learns to tell the time at all! If you want to find out about a better way of teaching time, I suggest you look at my blog here, where I advocate teaching using the hour hand only at first, and then subsequently teaching the minute hand separately. When both of these can be read fluently, read two clocks side by side, one showing hours, the other minutes. Finally, after all this practice, you can introduce a standard two-handed clock.

As I said earlier, there are some things we learn in the early years and key stage one that form the building blocks of much of our later thinking. If we want children to have the mental capacity to be independent, critical thinkers, we need to move heaven and earth to make sure that as many of these crucial building blocks as possible become completely automatic, so that precious working memory space can be used for more creative thinking. These key skills must be practised over and over until they are so automatic we cannot forget them and don’t need to think about them. Drivers may well remember how difficult it was when first learning to drive to change gear, steer, signal and read the traffic all at the same time. A year or so later, the process is so automatic you can arrive home without even remembering much of your journey. Instead, you’ve been able to think about other, more important things on the way home.

In the same way, our children have an entitlement to be given time and encouragement to commit the basic building blocks of thinking to their long-term memories. Primary schools owe it to the children they teach to make sure that, as a bare minimum, all of these are learnt to automaticity:

  • Number bonds
  • Times tables
  • Phonics
  • Handwriting
  • Telling the time
  • Full stops and capital letters.
  • Weeks and months
  • Recognising the map of the UK and beyond

Yet there is a reluctance to spend time practising basic skills. It is derided as ‘meaningless rote learning.’ Nothing could be further from the truth. What is really meaningless is condemning children to a lifetime of having to count on their fingers when we could have set them free from the bondage of counting by making sure they knew their number bonds to automaticity. What could hinder problem solving more than not being able to manipulate numbers effortlessly because you were never given the opportunity to learn your tables by heart, because your teacher described that sort of thing as ‘regurgitation’? What could be less creative than not being able to read fluently because your teacher thought phonics was boring? It is our duty as educators to ensure that we help children move as much information as possible into long-term memory, so that their cognitive capacity can be used on the fun stuff, the clever stuff, the important stuff.


[1] Wieman, C. (2007) ‘The “Curse of Knowledge”, or Why Intuition About Teaching Often Fails’. APS News 16, p.9.

[2] 0,1,1,2,3,5,8,13,21,34,55,89…


[3] 7 o’clock

[4] 6:30

[5] 5:45

[6] 6:30

[7] 6:30

[8] 9:25

[9] 1 o’clock

[10] 2 o’clock

[11] 2 o’clock (those annoying duplicates!)

[12] 3 o’clock

[13] 4 o’clock

[14] 5 o’clock

[15] 5 o’clock

[16] 6 o’clock

[17] 6 o’clock

[18] 7 o’clock

[19] 7 o’clock

[20] 8 o’clock

[21] 9 o’clock

[22] 9 o’clock

[23] 10 o’clock

[24] 10 o’clock

[25] 11 o’clock

[26] 12 o’clock

[27] This is also 12 o’clock. I forgot to tell you that rule, in the same way we forget to tell children that 12 is also zero on an analogue clock

Cognitive load: a case study

Shoreditch calling! Job opportunity.

school logo

KS2 class teacher vacancy

Salary:  MPS/UPS

Want a great job in a great school in a great location?

You will have to work hard here, but only on things that make a difference: we don’t do meaningless paperwork; we don’t grade lessons or monitor your planning.  Instead of marking we give feedback in lessons.  We are always trying to find smarter ways to do our work. We only recruit people who are passionate about making a difference to children’s lives, people who challenge themselves to keep on improving. This means our working relationships can be relatively relaxed and informal. People love working here; we are warm, welcoming, positive, supportive and forward looking.  Our innovative approach means that teachers from all over the UK and beyond come to see our work in practice, so expect to welcome visitors into your classroom.

You will be joining a 1 form entry Church of England school with an inclusive and multi-cultural ethos based in the vibrant Shoreditch area of Tower Hamlets. Our community includes pupils and staff from a range of different faiths as well as people with no faith – all are welcome. We work together to foster children who are compassionate, respectful, happy, tolerant, curious and collaborative as well as academically successful.

We are looking for a teacher who:

  • Has a proven record of success as a class teacher and expertise in empowering children to make rapid progress.
  • Demonstrates optimism about children and expects the highest possible standards
  • Nurtures pupils’ emotional wellbeing

We offer you:

  • Sensible and successful systems, based on research, with a commitment to reduce workload whenever possible
  • A positive, warm and welcoming working environment
  • Experienced, intelligent, lively colleagues
  • Family-friendly working practices
  • A fantastic location with great transport links

Closing date for applications: Friday 11th May 2018
Interviews: Friday 18th May 2018

Visits to the school are strongly recommended and can be arranged on 020 7739 8058 or by emailing Maureen Marlborough at admin@st-matthias.towerhamlets.sch.uk. This is also the address to use if you want an application pack. We reserve the right to close the selection process early.

St Matthias School is committed to safeguarding all children. Successful candidates will require DBS clearance and suitable references before commencing employment. We welcome applications from all sections of the community, regardless of gender, race, religion, disability, sexual orientation or age.


St Matthias School, Bacon Street, London, E2 6DY

020 7739 8058


Going data naked

Numbers don’t actually exist. There is no actual number three somewhere. It is not a thing. There is just ‘threeness’, a relationship between things that we learn to recognise; that this small cluster of cubes is similar to that small cluster of counters in a way we learn to call ‘three’.  The cubes themselves are not three; we declare their threeness when they are associated together in a certain way.  We learn what three means through repeated exposure to clusters exemplifying this relationship and thus come to learn what three and not-three look like.  But there is no spatiotemporally locatable prototype ‘three’ against which all other instances of three can be verified.

Pupil progress is a bit like that.  We tend to act as if ‘Progress’ is a real, tangible thing that really exists. Worse than that, we even believe that we can measure it.  This is an illusion.

It is, however, incredibly useful to have a word to describe ‘the process of gradually improving or getting nearer to achieving or completing something’ in the same way that it is even more useful to have the concept ‘three’.  So what’s my problem? Is this just an exercise in clever semantics?   My point is that progress isn’t a generalizable thing that exists independent of a highly specific context, a point that seems obvious. Yet the assumption that ‘Progress’ can be reduced to one, measurable thing that can or cannot be found hidden inside pupils’ exercise books or test scores is the basis of the panoply of accountability; all those graphs and charts and spreadsheets purporting to ‘measure’ something.  What then, we may ask, is the unit of measurement? The microGove perhaps[1]?

Of course we can look at pupils’ work over a period of time and see if they are getting better at the things we want them to get better at. Indeed, it is really important that we do, because if they are not getting better then there’s a problem of some sort that we need to get to the bottom of and then remediate. So we need to be clear about what we want them to improve. Generally, this is to do with either knowing more stuff or knowing how to do certain stuff or knowing how and when to do certain stuff rather than others.  So we will listen to pupils’ answers and read their work and set them tests to find out if what we are teaching them is sticking. And if it is we will be pleased that they are making progress, maybe even good progress.  But the improvement they make in their times table test scores and the improvements they make in knowing more about the water cycle or using fronted adverbials in their writing are just not commensurate.  That would be like trying to compare mass with colour intensity or length with electrical charge.

Even Ofsted High Command are trying to move away from the idea that you can ‘measure’ progress. The Ofsted Handbook, the report of the Commission on Assessment without Levels and the data management report from the Workload review group all say the same thing: you need to be able to show progress, but that does not mean you have to be able to quantify it.[2] Here’s a brief selection (courtesy of James Pembroke and Gaz Needle) from those listed above, saying just this.

sean h

Inspectors will use lesson observations, pupils’ work, discussions with teachers and pupils and school records to judge the effectiveness of assessment and whether it is having an impact on pupils’ learning.  They don’t need to see vast amounts of data, spreadsheets, charts or graphs.   –  Sean Harford: OFSTED National Director, Education, 2015.

From:  https://www.youtube.com/watch?v=H7whb8dOk5Q

Be ruthless: only collect what is needed to support outcomes for children. The amount of data collected should be proportionate to its usefulness. Always ask why the data is needed.

A purportedly robust and numerical measure of pupil progress that can be tracked and used to draw a wide range of conclusions about pupil and teacher performance, and school policy, when in fact information collected in such a way is flawed. This approach is unclear on purpose, and demands burdensome processes.

The recent removal of ‘levels’ should be a positive step in terms of data management; schools should not feel any pressure to create elaborate tracking systems

Focusing on key performance indicators reduces the burden of assessing every lesson objective. This also provides the basis of next steps: are pupils secure and can pupils move on, or do they need additional teaching?



‘Progress became synonymous with moving on to the next level, but progress can involve developing deeper or wider understanding, not just moving on to work of greater difficulty. Sometimes progress is simply about consolidation.’



“We want to see the assessment information you use as a school to identify how well your pupils are progressing in your curriculum and, crucially, how you use that information to improve pupils’ achievement.”  Sean Harford

And then today, Sean has both ‘liked’ and retweeted this Tweet of mine:

sean 2

However, some of Ofsted’s foot soldiers still appear not to have got this message. A report published on May 25th 2017 had as a key issue:

  • There is not enough emphasis on the measurement of pupil progress from individual pupil starting points.

But that was nearly a year ago. Maybe things have improved since then? To find out, I decided to read all the areas for improvement in Ofsted reports for primary schools published in March. However, that runs to over 70 pages, so I gave up after reading 7 pages’ worth of reports. With 10 schools per page, that’s 70 reports I read. To be fair, most of them seemed sensible enough, but I found a fair few recommendations that worried me. All of the following are recommendations from reports published in March 2018. I have highlighted in bold the problematic parts.

  • ensuring that success criteria regarding pupils’ progress and attainment in performance management documents and in the school’s development plan are measurable, to hold teachers more clearly to account for the achievement of pupils in their classes.


I’m not sure how this can mean anything other than reducing progress to a numerical score. As James Pembroke says, ‘numbers in a tracking system do not prove that pupils have made progress; they just prove that someone has entered some data into the system.’


  • assessment information is accurate and used alongside improvement plans that have precise objectives and clear measurable outcomes, in order for academy committee members to further hold leaders to account


  • leaders’ plans for school improvement and the use of the pupil premium have clear actions, timescales and measurable outcomes

Again, an emphasis on measuring the unmeasurable – a desire for the illusion of accuracy that measurement purports to bring.

  • outcomes of groups of pupils, no matter how small, are reviewed more precisely, so that leaders know whether their actions to raise standards are effective and represent good value for money
  • action plans contain precise success criteria, with specific targets for groups and cohorts of pupils, so that leaders and governors are able to check the impact of their actions on improving outcomes for pupils

With both ASP and the Ofsted dashboard moving away from looking at smaller groups, it is alarming to see this in recent reports.

  • they strengthen their analysis and evaluation of the progress of different groups so that they know how well different groups of pupils are progressing

Indeed, even this one bothers me. Why can’t we just check and respond on a pupil-by-pupil basis? How does it actually help any child do better if leaders are spending precious time analysing groups? Even bigger groups? Especially in-year. At the end of the year, then yes, I’d have a look at how pupil premium children were doing compared with non-pupil premium. And obviously at the end of a key stage a whole raft of data is produced. But I’d rather spend my time improving the curriculum and teaching than making pretty charts in Excel.

Then there is the question of whether ‘tracking’ really means ‘have a spreadsheet with numbers.’ See for example, these recommendations.

  • systems for tracking the progress of pupils in subject-specific skills across the curriculum in subjects other than English and mathematics are embedded
  • track the progress of pupils so that governors, middle and senior leaders are fully informed about the progress of groups of pupils, particularly across the wider curriculum.

So they want information about how different groups are doing in geography then, do they?

These two might not mean ‘have a spreadsheet for the other subjects’, but that’s probably not how it is going to be interpreted.

So much for being ruthless and only collecting what is needed to support outcomes for children!

Be that as it may, we are doing our best to go ‘data naked,’ by which I mean having the least data we possibly can, only resorting to numbers if they actually tell us something that will enable someone to do something that will make things better for the children as a result. I’m not sure we’ve got it all right and it is still very much a work in progress, but this is what we currently do.  I am not holding this up as a marvellous example for others to follow. We are currently due Ofsted, so, not quite holding my nerve, in September our assessment plan included more data than I really thought necessary. While I believe that Sean Harford means what he says, I get nervous about individual inspectors – so the plan included data as a sort of security blanket or lucky amulet to bewitch any data-besotted inspector. However, the plan did not survive contact with reality. Either that, or I just got braver.


We started the year intending to carry on from the previous year, using the PUMA standardised tests at the end of each term. The standardised scores from these were then entered into, yes, you guessed it, a home-made Excel spreadsheet, which was formatted to colour code certain ranges of scores, based on benchmarks suggested by the publishers of PUMA. The idea was that we could have a column with the previous scores from July alongside the December scores, and so be able to make useful comparisons over time. Is Abdul still ‘green’? Why has David gone from ‘orange’ to ‘red’? In other words, pseudo-levels.
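For what it’s worth, that colour coding amounted to nothing more than banding a standardised score. A hypothetical Python sketch of the idea, where the thresholds are invented for illustration and are not PUMA’s actual benchmarks:

```python
def rag_band(standardised_score, low=85, high=115):
    """Band a standardised score into red/orange/green.

    The cut-offs of 85 and 115 are illustrative defaults only,
    not the publisher's real benchmarks.
    """
    if standardised_score < low:
        return 'red'
    if standardised_score < high:
        return 'orange'
    return 'green'

print(rag_band(80))   # red
print(rag_band(100))  # orange
print(rag_band(120))  # green
```

Comparing December’s band with July’s band is then exactly the pseudo-levels move described above: a continuous score collapsed to a coarse label, inviting conclusions the underlying tests can’t support.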

However, come December, the year 2 and 6 teachers asked if they could do a previous SATS paper instead – which seemed like a sensible idea. That immediately meant that the December results could not be directly compared with the previous July ones, since the children were taking a test intended for several months later. These results were worthwhile though, and gave us a rough but useful indication of who was ‘on track’, ‘behind’ or ‘ahead’ given their KS1 or EYFS score. Everyone else did PUMA but came up against the obvious problem that when you take these kinds of tests in-year, they don’t necessarily test what you have taught. In other words, it was pretty meaningless except as a way for individual teachers to check whether the questions on things they had actually taught had been answered correctly. So any attempt to check progress from the previous July was futile. For year 1, the situation was even worse, as they were being compared with FSP outcomes. Nevertheless, we valiantly attempted to crunch data and report to our standards and curriculum committee. We even analysed groups – though only boys, girls and pupil premium vs non-pupil premium. However, by the time we’d explained for the umpteenth time that ‘you can’t really compare December results with July results’, the governors looked at us all funny and asked us why we were wasting time on in-depth analysis of something patently not suitable for such treatment. Then when we tried to talk about groups – and some of our classes are small, with only 18 pupils in – it got even more farcical. Governors and leaders together resolved not to waste any more time analysing stuff that was not properly analysable.

So this term, years 2 and 6 are doing another SATS paper, and everyone else is doing either PUMA or White Rose – whichever best fits what they have actually taught so far. But they are doing these assessments not so the SLT can analyse them and draw (highly dubious) conclusions; they are doing them to inform their own teaching, so they know what needs more revising and who might need more support. At our next pupil progress meeting we will have a conversation about each pupil, and how they did on whatever tool the teacher used will be a possibly useful starting point. Where pupils do not appear to be doing so well, we will have a look at their maths book to see if that sheds any light on the situation. I will also look at the tracker that tells me whether a child knows their number bonds and times tables. I will ask the teacher if there were any particular areas of maths where many children did badly, and if so, what they are going to do about it.

Then in July, everyone (except Early Years and years 2 and 6) will take PUMA (because by then, everyone should have taught the year’s curriculum, so the test:curriculum misalignment problem should not arise) and then I will enter those scores against last July’s scores. I can see a point of data tracking year on year.  I can see how that can flag up potential problems either for a child or teacher.  But within year, talking to the teacher about their class, looking at books, watching lessons and tracking acquisition of key number facts is much more useful than wasting hours with a spreadsheet.

I should add that, as an experiment, this year we bought into Star Maths (part of the Accelerated Reader package from Renaissance Learning) for years 5 and 6. This enables pupils to do a maths test in a matter of minutes, with no marking for the teacher, and results instantly available (along with an analysis of what each pupil can and can’t do). Apparently, according to @MrLearnwell, these results correlate very well with actual SATS performance: Renaissance Learning bought the anonymised SATS data from the government, matched (via UPN) actual SATS results with the performance of the thousands of children who use their product, and found a very high level of correlation. I will wait and see how this bears out for us when this year’s SATS results are out, but it may be that from next September we use Star Maths across the school. I don’t yet understand the product well enough to know how it gets round the curriculum:test misalignment problem that happens in-year. That’s something I need to find out more about.


We abandoned PIRA (twin sister of PUMA) this year as we didn’t find it helpful at all. It’s nothing like actual SATS papers, some questions are really odd and all in all, it’s not a good assessment. Several other people have contacted me via Twitter to express the same opinion. Instead, we use Accelerated Reader to find out all sorts of useful things. As well as getting a standardised score from Star Reader, it also gives us a fluency measure, a reading age and, best of all, how many minutes of independent reading each child is doing.  This kind of granular information is so much more useful than a test score and really helps us pinpoint what needs more attention. For children in Reception and KS1, (or for older children where appropriate) we also track their progress in phonics. As with maths, all of this information is discussed for each child in our pupil progress meetings and where there are problems, strategies are decided. Years 2 and 6 do previous Sats papers in December and March, in part to give children practice of the format.


Last year we bought into a tracker system that had every objective for the year. It took a lot of teacher effort for practically no impact on children. Indeed, by focusing on the objectives for that year, it drew teachers’ attention away from objectives in lower year groups that might urgently need attention. Yes, full stops, I’m looking at you. So this year we’ve invented our own really minimal writing objectives tracker for KS2. This starts with the year 2 interim framework objectives, then builds from there, with each year group having 4 or 5 further key objectives drawn from the national curriculum. Each KS2 teacher checks off the previous year groups’ objectives first, starting with the year 2 ones. It’s quick and makes sure teachers address learning gaps. On top of that, we are involved in the Sharing Standards comparative judgement project from No More Marking. This gives us a good measure of how well we are doing as a school in relation to other schools, as well as giving each child a scaled score. This scaled score is only based on one piece of work, but it is a useful starting point for discussion and enables us to target book looks on those children who seem to be doing worse than we would have expected, given their prior attainment. Added to that, it means every teacher has seen a piece of work from every child in the school from year 1 upwards, and I have instant access to that work from my computer.

History, geography, science, RE

Children do a multiple-choice quiz at the end of each unit. The score out of 10 (or 5 in KS1) gets recorded on a spreadsheet. Then a couple of months later (when that unit has long finished) they do another quiz on that subject. That score is also recorded. Then at the end of the year they do a quiz of quizzes, with questions from all the units that year. And guess what – that score gets recorded too, and goes on the end of year report. I was really worried about assessing the foundation subjects when this first became a thing, but actually, this system works really well, is quick and easy and has impact. It allows us to identify which questions children are finding harder and which children are not doing as well as they should. In order to assess children’s ability to apply knowledge, we have just started using stem sentences and ‘but, because, so’ to see if children can put their knowledge to work. For example, given the stem sentence ‘the River Nile used to flood each year…’, can the children carry on this sentence in 3 different ways, using but, because and so? For example: the River Nile used to flood each year but does not any more since the Aswan Dam has been built. Or: the River Nile used to flood each year so the land became very fertile from all the minerals in the floodwater. At the moment this is mainly formative, but we may also weave it into the end of unit assessment once children are more familiar with the process.

MFL is similar, with end of unit quizzes, but I haven’t got round to putting them on a spreadsheet yet.

Computing

This is still under development and not yet available for every class. Children start each coding unit with screenshots of various bits of code (usually from Scratch). They write what they think this code might do. Then at the end of the unit, they get the same screenshots and again write descriptions – which are of course then much more accurate and detailed. Nothing gets put on a spreadsheet. Ironically, the computing assessment is the most low-tech! The assessment helps the teacher see how effective they have been and which aspects were the least successful. Children like seeing how much they have learnt. So I am quite happy with this system.

In addition, we have a multiple choice quiz on online safety, which the children do every term. Yes, the questions are the same, because it’s not about progress, it is about keeping the children safe.

PE, art, DT and music

We have a PE coach who takes all PE lessons. He has a massive spreadsheet with 3 or 4 objectives from each sport, plus one for being a good team player. Hand on heart, I have no idea if it actually has any impact on children’s progress in the subject, but he said he had all that information anyway and was happy to do it.

We have a similar system for art and DT (though much shorter). I’m not wedded to the idea. We have also started doing simple assessments of children’s ability to copy patterns of increasing complexity – starting with just a line and getting progressively harder, the child stopping at the pattern they find difficult to copy. I think this is much more likely to be useful.

And as for music…er…I confess we don’t have a system yet for music.

Reporting to governors and parents

The great thing about graphs and charts is that they make complex information understandable. The downside is that they give the illusion of making flawed information meaningful. They enable comparisons, but at a cost: everything has to be reducible to a number. This is a cost I am no longer prepared to pay.

But while I think our present way of checking for progress is far superior to previous systems, without a doubt it is harder to report to others in terms of accountability. As you can see, we have different systems for different subjects; some information tracks discrete objectives or behaviours, some is comparative with other schools, some is strictly formative. I can’t reduce this complexity to a single numerical value. Governors have to bear with narrative descriptions of how we know about the progress our children are making. Some subjects have some numbers, but the score out of 10 in a history quiz is in no way directly comparable with, say, average reading age or the number of number bonds a child in year 1 knows.

And as for tracking groups – well – except at the end of each key stage, we don’t. It doesn’t add any value at all to the achievement of any child, so I simply refuse to indulge in such a meaningless ritual.

Reporting to parents, on the other hand, is much easier. Parents understand things like a reading age, a score out of 10, or a chart that shows how many times tables or spellings a child knows. That’s far more understandable than being told your child is 3b or ‘emerging plus’ or even ‘working at the expected level.’

And Ofsted?

Maybe I’ll just give them this to read?




[1] Yes, I know I’ve made that joke before. It’s good though, isn’t it, even though I say so myself.

[2] Read this excellent blog, which says everything I am saying, only better, and from which I have drawn extensively in this blog.
