How can we teach disciplinary knowledge?

The ‘knowledge-rich’ curriculum has been much discussed over recent years, and an increasing number of schools and departments have started to place a greater emphasis on the acquisition and retention of knowledge in their curriculum. The vast majority of this focus, however, has been directed towards substantive knowledge: the facts, dates, and formulae that we want our students to remember. Whilst this is an important step, substantive knowledge alone does not render a curriculum ‘knowledge-rich’ – it is important that alongside this students are also exposed to disciplinary knowledge.

Christine Counsell has described disciplinary knowledge as the ‘organising structures of the discipline’, the unwritten rules that shape each discipline as a tradition of enquiry focused on its own unique quest for truth. In the case of History, it refers to how historians think about the very nature of causation, or change and continuity, the unwritten rules of how historians construct a claim, the acceptable standards of evidence versus speculation in historical argument, and the conventions of academic writing. It’s the tacit knowledge that makes an individual a subject expert, that which allows them to participate in a discipline’s academic discourse.

In planning a curriculum, we should be seeking to introduce our students to this disciplinary knowledge, since it is fundamental to giving our students access to the conversations of our disciplines. Of course, introducing students to something so amorphous is no easy task. The challenge is made more difficult by the ever-present need to plan or update schemes of work, which tempts us into short-termism at the expense of thinking carefully about the long-term progression model of our curriculum. All too often, this leaves us hoping that students will simply ‘bump into’ knowledge about how to analyse a source, or write meaningfully about causation. The result is that students’ conceptual understanding develops haphazardly and inconsistently.

In the process of rethinking our curriculum, we decided that we wanted to think more pro-actively about the question of disciplinary knowledge, and how we can best go about teaching it. Although this is no straightforward task, we are lucky within the history subject community to have a rich tradition of thinking about precisely what this knowledge looks like. The basic foundations of disciplinary knowledge – those organising structures – are framed by our second-order concepts. The best history departments have long built their curricula around these concepts, using enquiry questions drawn from historical scholarship (Riley, 1998; Jenner, 2019). By putting these enquiry questions at the heart of what we do in History, we are able to sharpen our focus on the concepts that provide order and structure to the discipline of History itself.

Although we had been structuring our curriculum around such questions, we had definitely succumbed to the short-termism that comes with planning individual schemes of work, at the expense of the bigger picture of our curriculum. We therefore started to discuss how we might make more systematic use of our enquiries in the long term in order to develop our students’ disciplinary knowledge. We wanted to figure out whether we could more methodically plan for precisely how and when students encounter this kind of knowledge. If we could do this, we could ensure that by the time they reach the end of Year 9, they are equipped with a basic toolkit that enables them to participate in the conversations of the discipline, regardless of whether or not they choose to take History at GCSE.

As a test case, we decided to think about how we might plan to develop our students’ knowledge of causation, an area that has been particularly well explored by practitioners over recent decades. The first thing we needed to figure out was what this disciplinary knowledge ‘looked like’, so that we could think about how to go about teaching it. What exactly was it that we wanted our students to know about causation?

The first thing to come to mind was James Woodcock’s article on the role of precise causal language in improving students’ causal reasoning (2005). This quickly led us on to thinking about what our students needed to know in order to construct an argument – we wanted them to have experience of prioritising and grouping causes together thematically (or otherwise). We then thought about areas where our students had historically been quite weak, and quickly came upon the problems of counterfactual reasoning, as discussed by Arthur Chapman (2003) and Ellen Buxton (2010). Inevitability was a key concern too – we had already tried to deliberately introduce our students to the complexities of the notion of ‘inevitability’, but we needed to think more carefully about when and how to do this. We therefore turned to Gary Howell’s work on teaching the First World War (1998) to shape our approach to this in Year 9.

After identifying a rough outline of what disciplinary knowledge we wanted to teach our students, the next step was to try to sequence this across our curriculum. To do this, we crafted a series of enquiry questions that would create opportunities for our students to encounter this kind of disciplinary knowledge. A few examples are below:

Time | Unit | Enquiry/lesson | Disciplinary aim
Year 7 HT1 | Ancient Rome | Why did the Roman Empire collapse? | Teacher modelling causal argument
Year 7 HT2 | Anglo-Saxon England | What led to the development of an English nation by 1042? | Modelling of written causal argument
Year 7 HT4 | The Crusades | Why did Europeans go on Crusade? | Considering diversity of motive, challenging generalisations
Year 8 HT1 | The Reformation | Why was there a Reformation in England? | Prioritising causes
Year 8 HT3 | Civil War and the Stuarts | What was responsible for the rise of Parliament? | Develop precise use of causal language in written/spoken argument
Year 8 HT6 | Victorian England | Why did Disraeli pass the Second Reform Act in 1867? | Analyse the precise roles of causes in order to prioritise
Year 9 HT2 | The First World War | Was the First World War the inevitable consequence of the alliance system? | Grapple with the concept of inevitability

In some ways this is little different to the curriculum we had before – our enquiry questions remain at the heart of historical thinking. The real change, however, was that we were being much more explicit in identifying when and how to introduce students to new ways of thinking about causation, and precisely when to add an extra implement to their conceptual toolkit. The most significant improvement here was consistency – by explicitly incorporating how we want students to develop their disciplinary knowledge into the progression model of our curriculum, we hoped to ensure that all students would gradually acquire it, rather than relying on chance.

Once we had done this with causation, we began to wonder about the other second-order concepts. We had felt on secure ground with the question of causation because it has been so well theorised by the subject community – but could we develop disciplinary knowledge in the same way with something as tricky as historical interpretations?

To try and do so, we went through the same process, referring back to the discourse of the subject community in an attempt to identify precisely what tools we wanted students to have. We then planned when and where we wanted our students to encounter this knowledge. A few examples for interpretations are below.

Time | Unit | Enquiry/lesson | Disciplinary aim
Year 7 HT1 | Ancient Rome | Examining a statue of Boudicca | Introduce the idea that events/individuals can be deliberately presented in a particular light
Year 7 HT4 | Medieval Kingship | Lesson on interpretations of Magna Carta | Consider how interpretations can be different – one event can be viewed in different ways
Year 9 HT3 | Second World War | To what extent was Hitler an opportunist? | Encounter multiple interpretations of the same event in order to critique them
Year 9 HT5 | The Holocaust | What role did ordinary men play in the Holocaust? | Engagement with scholarship, considering why interpretations may differ

We were right to think that the disciplinary knowledge would come to hand less easily here – it’s much harder to identify specific components of our conceptual toolkit than it is when looking at causation. Nonetheless, we were still able to plot out the beginnings of a curricular narrative that can underpin the development of our students’ knowledge, and will help us to deliberately shape their progress in this area.

Of course, these plans aren’t perfect. They might not work at all. The cumulative nature of progression in disciplinary knowledge means we’ll almost certainly find numerous improvements that we could make to our sequencing. Moreover, we simply won’t be able to account for every way in which our students’ disciplinary knowledge develops – there will always be an extent to which its development lies beyond our control. But these plans do mark an important change in the way we are approaching the question of disciplinary knowledge. By focusing on the long term, and trying to deliberately set out when and where our students encounter disciplinary knowledge as part of our curricular narrative, we hope to move away from a reliance on chance encounters, and ensure that we are more effectively inducting all of our students into the conversations of the discipline.

How can we improve teacher radar?

We’ve all met those teachers who appear to have eyes in the back of their head. No matter what they are doing, they seem capable of spotting even the slightest off-task behaviour. They can notice and stop a student who is about to talk before they’ve said their first syllable, or correct the student about to look out of the window just as she begins to move her head to do so. Unsurprisingly, behaviour in their classroom is immaculate.

We call this ‘ability’ our radar. Having a good radar is fundamental to effective behaviour management. After all, if you don’t see the misbehaviour, you’re not going to be able to correct it. This is at its most obvious when we are observing trainees or early career teachers who don’t appear to have an effective radar. It’s easy to sit at the back of the classroom wondering how on earth our trainee hasn’t seen that Alex is staring out of the window, and that Sarah is trying to attract the attention of her friend in the corridor. 

It’s tempting in such situations to set our trainees a target to ‘work on their radar’. We do, after all, want them to become like those expert teachers who are able to notice and respond to the slightest distraction. But a target this vague isn’t at all helpful – we might as well ask them to ‘become better at seeing things’.

Part of the difficulty lies in the fact that we can’t easily explain how it is that we maintain our own radar. It seems to be something innate, something that we can’t remember not having. So how can we actually help our trainees improve theirs?

What can we do?

Although there’s no point setting a target as vague as ‘improve your radar’, there are a few things we can ask our trainees to do that might help them notice off-task behaviour in their classroom. In particular, we can get them thinking more carefully about how they position themselves so that they can better see their students. In Teach Like a Champion, Doug Lemov talks about using ‘Pastore’s Perch’ to position yourself effectively in the classroom. The idea is that it’s much easier to see all of your students if you position yourself in the corner of the room rather than the centre.

[Radar diagram]

Standing in this position will reduce the number of students outside of your field of vision at any one time, making it much easier to see – and therefore stop – any off-task behaviour. This can be further refined by getting our trainees to stand still so that they spend minimal time looking away from the class.

Unfortunately, however, this does not entirely solve the problem. Whilst better positioning increases the likelihood that our trainees will see off-task behaviour, it doesn’t actually improve their ‘ability’ to notice and respond to it. So what next?

Can we actually improve a teacher’s radar?

Because it seems such a difficult thing to improve, it’s very easy to fall into the trap of dismissing teacher radar as something that you either have or you don’t. I don’t think this is the case. Those expert practitioners who can maintain a hyper-vigilant radar are able to do so because they are experts. They can maintain an effective radar because they need to think less about the other processes involved in teaching.

Think about the number of things that a trainee teacher will be trying to hold in their working memory in a lesson. At any given moment, they could be thinking about:

  • The names of the students sat in front of them
  • The substantive knowledge required to teach a given lesson
  • How to draw connections with previous learning
  • How to lay the foundations of the next lesson
  • Whether there is enough time left for the lesson
  • How to support the SEN students in the room
  • Why Alicia has developed a particular misconception
  • How to phrase their next question
  • How to improve their lesson the next time they teach it
  • What notes their observer is making

All of this is before our trainee begins to think about looking at what their students are doing. It shouldn’t be a surprise, therefore, when they find it difficult to maintain an effective radar. Our expert teachers have automatised many of these other processes, so it’s unsurprising that they are able to give the impression of having eyes in the back of their head. They don’t need to think about their explanation as much because they have developed a mental script for explaining electrolysis. Nor do they need to think about the names of their students – they have taught them for years. This frees up their working memory, giving them a much greater capacity to keep an eye on what their students are up to.

In order to help early career teachers develop an effective radar, then, we are better off focusing our efforts on helping them to automatise the other processes that are a crucial part of teaching. That means setting targets that will help them move towards greater fluency in these tasks. This might include supporting them in developing their subject knowledge, or their understanding of the broader narrative of the curriculum. Or it could mean helping them to script questions in advance, and sharing our expertise in how to explain a particularly tricky concept. It doesn’t mean telling trainees to ‘work on their radar’.

How can we improve professional learning? Part 1

Professional learning has long been considered a standard part of teaching. From sporadic twilight sessions to INSET days, we have a healthy expectation that teachers continue to develop their practice throughout their career. In recent years, however, the increasing emphasis on evidence-informed practice has brought professional learning back into the spotlight. The success of education conferences such as ResearchED attests to the growing demand for rigorous, research-informed CPD across the country.

This spotlight has, however, raised significant questions about how we approach professional learning in schools. Although schools have always engaged in professional learning, many of us have experienced situations where it has ‘gone wrong’. We all agree professional learning is important – so why do some of us think it’s so bad?

Problems with professional learning

One common issue is that professional learning is often not considered a high enough priority to be allocated significant time in teachers’ busy schedules. Training sessions become sporadic and are delivered on an ad hoc basis, rather than being seen and treated as a professional entitlement. Not only can this negatively impact the quality of training, it also sends a clear message to staff that their development is not considered a priority.

A more fundamental problem, however, is that we often lack a meaningful vision for precisely what it is that we want professional learning to achieve. This is in no small part the result of the fact that staff training is often seen primarily as a tool for school improvement. Training, therefore, tends to focus on surface-level changes that seem to provide a quick-fix for our problems. Unsurprisingly, we quickly find ourselves suffering from initiative overload, as short-term priorities come to dominate.

Such a focus misses the point. Professional learning isn’t about making your school better. It’s about making your staff better. We can much more easily improve our schools by equipping our staff to better implement the things we already do than we can by asking them to introduce a new initiative.

Of course, in practice things aren’t so simple. Our schools are all made up of a team of practitioners who are at different points in their career, each with their own knowledge and experiences. They will have varying strengths and weaknesses, and catering to these makes delivering effective professional learning incredibly complex. We are understandably reluctant to train our staff to do things that, to all appearances, some of them can already do. It can therefore seem more sensible to instead introduce something new, something that might benefit everyone. When we do this, however, training quickly becomes a case of throwing new ideas at colleagues and seeing what sticks. This reinforces a defiant short-termism at the expense of long-term improvement.

What’s our vision?

Within our trust, we are keen to ensure that our professional learning offer does not repeat such mistakes. We want our staff to receive rigorous professional learning that is evidence-informed, and that prioritises long-term improvement over short-term gain. To do this meaningfully, however, we needed to identify precisely what that offer was. Saying we wanted to ‘improve teaching’ was too vague – we needed to be crystal clear about what we wanted to achieve. We needed a coherent vision.

At Bedford Free School, we believed we had such a vision. Over the past few years, much of our professional learning has been focused on ensuring that our staff are trained to effectively use our whole-school routines (based largely around Doug Lemov’s Teach Like a Champion). Staff training focused on regularly revisiting and reviewing the core routines that underpin our approach, so that all staff (old and new) can master them.

Although this was helpful in supporting our whole-school routines, it led us to obsess over the surface-level features of good practice, at the expense of the knowledge that underpins it. This is due in no small part to an historic tendency to think of teaching as a skills-based profession. We often have a clear idea of the sort of thing we want teachers to be doing in the classroom (such as our TLAC routines), and so we focus on that which is visible, rather than the knowledge and mental models that underpin it. What we needed to pay attention to was what we wanted our staff to know.

I’ve written before about how we believe that there is a core body of knowledge that all teachers should have access to. It’s the shared knowledge of ‘the best that has been thought and said’ that not only underpins their classroom practice, but allows them to meaningfully debate within their departments. This is an idea that we have already worked to put at the heart of our offer to trainees. It is, however, just as important for our permanent staff. If we want our colleagues to be able to meaningfully debate what constitutes best practice in their departments, we need to make sure that they have access to this knowledge, regardless of where or when they trained and their level of experience.

Place your bets

The problem with such an approach is that there has historically been little agreement on precisely what that knowledge is. Because of (primarily) pedagogical divisions within the profession, we have been very reluctant to try and categorically identify a core body of knowledge that all teachers should have access to (precisely the problem that the DfE’s Early Career Framework is intending to fix). Furthermore, we cannot simply give our teachers generic guidance on how to go and teach their subjects – they are, after all, the experts in their field.

These challenges, however, don’t absolve us from the need to make decisions about what we want our staff to know. Although we might not be able to definitively set out what knowledge is best, within our trust we can place bets on what knowledge we think is most likely to improve practice within our schools. What knowledge about assessment, for example, is most likely to enable our departments to improve their end of year assessments? What knowledge about curriculum design is most likely to enable them to meaningfully debate how to build students’ disciplinary knowledge?

By placing such bets within our trust, we can work to identify the knowledge that we think is most likely to improve practice within our schools. The goal of professional learning is to ensure that all of our staff have access to this knowledge. This, we hope, will empower them in turn to place their own bets within their departments about how best to teach their subject.

In my next post, I’ll explore the practicalities of implementing our vision at Bedford Free School.

Rethinking Point, Evidence, Explanation

In the process of updating our scheme of work for Year 7 on The Crusades, I’ve been doing a lot of thinking about how best to get students writing historically. I’ve found myself grappling yet again with a common debate faced by history teachers across the country – how can I support students to produce great historical writing without having it brutalised by the reductive limitations of a generic writing frame?

We have a rich tradition in the history subject community of debating how best to support students with extended historical writing, from Michael Fordham’s dragon slaying (2007), to Pate and Evans’ reflection on whether too much scaffolding can actually impede students (2007), and, more recently, Jim Carroll’s seminal work examining how the disciplinary traditions of historical writing can be translated into A Level lessons (2016).

Despite all this, many of us habitually return to PEE (Point, Evidence, Explanation) as a straightforward structure that ‘works’. I have certainly done so numerous times. The reasons are obvious – it seems to be a quick and effective way to get students to make clear, succinct points, and one that can easily lead to rapid improvement for the weakest students.

However, PEE remains highly problematic for a number of reasons. First and foremost, as a structure it does not model good historical writing – it’s far removed from the gold standard of academic scholarship that we should be aiming to introduce our students to. The improvement it produces tends to focus on surface-level features – the reason PEE appears to be a quick fix for student writing is that it is an easy way to get students saying the right kind of thing with sentence starters such as ‘this was important because…’.

Furthermore, PEE does not account for the different disciplinary dimensions of historical writing. Whilst PEE might help a student write a passable extended piece on causation, it is highly unlikely to enable them to produce a good piece of extended writing on change and continuity, where we are instead looking for what Kath Goudie and Rachel Foster have termed ‘analytic description’ (2017).

All of this gives us good reason to question the utility of using PEE as a support for extended writing. This is a daunting prospect, when complete abandonment of the idea can leave us facing something of a vacuum – how are we going to support the weakest students when it comes to writing for argument? Despite its flaws, when facing a disadvantaged intake of students with weak literacy, is PEE the best of a bad bunch?

A reasonable defence of PEE in this situation is that there is significant value in showing students what success looks like – and in giving them the opportunity to realise that they are indeed capable of producing a good piece of writing. However, this can lead towards an over-reliance on PEE as a quick fix, and a neglect of the complexity of great historical writing. It can mean that students don’t have the opportunity to recognise the tendency of historians to hide what Carroll calls the analytical ‘ductwork’ (2016).

Furthermore, the focus on surface-level features of writing frames can easily lead us to over-estimate our students’ writing ability. Where writing frames help students to replicate sentences that serve as a proxy for historical thinking, we can easily fall into the trap of over-estimating students’ conceptual understanding. This is a problem that Rachel Foster and Sarah Gadd noticed with their students – gaps in students’ understanding of evidential thinking led to a ‘lack of care taken over the selection and deployment of information as evidence’ (2013). This is the greatest danger of using PEE as a writing frame – it has the potential to mask how far students have genuinely grasped the process of constructing a causal argument, which could lead us to overlook it in our teaching.

It would be easy in response to this challenge to throw PEE out entirely, something that I had a tendency to do during my training year. Increasingly, however, I have realised that this is too much of a knee-jerk reaction. Provided that we recognise and respond to its limitations, PEE can be a useful starting point for students, so long as we ensure that the focus is on the conceptual thinking that lies behind the construction of written argument. It is by modelling and practising the conceptual thinking that underpins good writing that we can gradually begin to remove structured support for extended writing, and ensure that by the time our students reach GCSE or A Level, they are able to confidently produce good historical writing.

When planning our Year 7 schemes of work, therefore, I chose to use PEE not because of its effectiveness as a writing frame for our weakest students, but because of its potential value in demonstrating the relationship between evidence and a historical claim. When using PEE in lessons, we place the emphasis on the relationship between evidence and explanation, the process of how a historian selects evidence and then deploys it in order to answer an enquiry question. This requires a lot of teacher modelling – the students’ first essay, for example, is modelled in its entirety during the enquiry, so that the teacher has an opportunity to demonstrate how evidence is selected and then deployed in support of an argument.

It is this disciplinary understanding above all that will lead to excellent historical writing from students. By emphasising the disciplinary dimensions rather than specific sentence starters, we can teach students how to piece an argument together. We are then able to start gradually removing prescriptive writing frames towards the end of Year 7, and instead challenge students to think about historical writing much more freely. At this point, we begin to consider a basic paragraph as containing the following things:

  1. Signpost – signal to your reader what the paragraph is about
  2. Tell the story – tell the story using evidence relevant to the question
  3. Answer the question – this is where you present your argument (in other words – why does this piece of evidence matter?)

This is not entirely dissimilar to structures such as PEE, in that it provides the weakest students with some guidance as to what to write. Where it differs, however, is that it gives greater freedom for the most capable to experiment, because they aren’t restricted to a reductive structure. Because our students have already started to develop their understanding of the precise role of evidence, with continued teacher modelling they are able to begin writing with greater freedom, unconstrained by overly-prescriptive sentence starters. Teacher input continues to focus on modelling the conceptual thinking required, and on providing further examples and non-examples of how evidence is used to justify a conclusion.

It will be a long time before we have the opportunity to evaluate how successful this approach has been with our students, but so far the shift in emphasis has proved promising. Students are beginning to select evidence much more carefully in their arguments, resulting in better substantiated conclusions. This has also impacted on the way our Year 8s write as the scaffolding is increasingly removed, allowing our most able students to demonstrate their conceptual understanding whilst also providing our weakest students with the conceptual understanding needed to produce an argument.

There remains lots of work to be done in thinking about how we can support increasingly advanced writing, but by shifting the focus of PEE away from providing a writing frame and towards using it as a basic model of argument construction, we have found PEE to be far more useful. I’m not convinced that writing frames are the answer to the challenge of extended writing. But they could be a more productive part of it.

The Problem(s) with the Teachers’ Standards

For anyone who has been involved in ITT, it’s impossible to ignore the ubiquity of the Teachers’ Standards. In most training courses, they’re everywhere. For many, the Standards are one of the most memorable parts of their training year, countless hours spent trying to meet targets linked to specific standards, ensuring that lesson plans are clearly labelled with them, and collecting various forms of evidence to prove that they have been met. In their NQT year, many go through the same process again, meticulously demonstrating how they have met each Standard, printing and filing evidence of parents’ evenings, markbooks, extra-curricular activities and duties covered.

None of this, of course, is what the Standards exist for. Their primary function is to act as a barrier for entry to the profession, an aim which is clearly set out in the DfE guidance:

‘The [teachers’] standards define the minimum level of practice expected of trainees and teachers from the point of being awarded qualified teacher status’


This is an entirely sensible aim – having a common bar of entry helps to ensure that there is a minimum standard in the quality of teaching nationally, and it also serves an important function in providing a measure of professionalism to which teachers are accountable. However, rather than functioning as a benchmark for entry to the profession, they have quickly become its heart and soul. They have become the basis for grading lessons, been distorted into a progression model, and are used to brutally cut across subject-specificity, in a way that has had a highly problematic impact on the quality of teacher training.

The Standards as grades

The first problem is posed by how the Standards are used to grade teachers. The vast majority of ITT courses use the Standards to grade students at the end of their course, and many also grade their trainees on a termly basis. Some go yet further, insisting on giving a grade for every Standard every lesson. With such importance placed on this, the Standards very quickly become the main focus of a trainee’s practice.

Given that over the last few years we have started to call into question the practice of grading teachers as part of their performance management, it seems baffling that this practice has persisted in teacher training. Not only does this cause undue stress for trainees who already have more than enough to worry about in the classroom, the practice of grading obfuscates – for both trainees and mentors – what needs to be done in order to improve. The focus becomes getting up to a ‘good’ grade according to the Standards, rather than thinking about steps that could more meaningfully improve practice.

‘We need some evidence…’

This focus on grading trainees according to the Standards leads directly to another problem. Either as trainees, mentors or eavesdroppers in the staff room, at some point in our careers we’ve all heard the phrase ‘we need some evidence for TS…’ Rather than spending time practising questioning or teacher talk, trainees are sent on a quest to amass evidence. Of course, the idea of evidence in itself is no problem – if the Standards are fulfilling their intended role as a benchmark for entry to the profession, then at some point we are going to need evidence that they have been met, be it a comment from a mentor, or some collected documents.

What we have instead ended up with is trainees collecting huge dossiers of evidence: multiple lever-arch files of signed emails confirming that they have helped out with a parents’ evening, lesson plans that set out precisely how they differentiated for a particular Pupil Premium student, and countless print-outs from training sessions. Particularly when deadlines loom, trainees’ weekly targets quickly start to focus on collecting evidence that they have ‘promoted a culture of high expectations’, rather than thinking about how to do so. Once again, we find ourselves losing sight of what teacher training should actually be about – becoming a good practitioner.

The Standards as a progression model

The obsession with grading trainees leads to the training year becoming more about moving from ‘satisfactory’ to ‘outstanding’ than it is about developing as a practitioner. This is particularly problematic because of the way in which the Standards have been distorted into level descriptors. It takes only a quick Google search to find numerous examples of this, with each Standard subdivided into four – or in some cases seven or eight – different ‘levels’.

Where the Standards become our road map for improvement, we start to place too much value on surface-level features that are highly visible, rather than the thinking and methods that we know underpin good practice. We find ourselves viewing improvement in practice through the lens of vacuous, generic statements which cannot get to the heart of what good teaching really looks like. To take an example from a (random) set of level descriptors I found, the difference between ‘outstanding’ and ‘good’ in one section of TS4 is:

[Extract from a set of TS4 level descriptors]

If this is our progression model, what targets does it lead us to set a trainee if we want to get them from ‘good’ to ‘outstanding’? ‘Impart more knowledge’? ‘Consider how to impart knowledge more consistently’? ‘Use lesson time more effectively’? Such comments fail to encapsulate what progress actually looks like, and confuse a summative tool with formative targets. We are left with targets that aren’t really helpful in getting a trainee to make meaningful improvements to their practice, forcing them to focus only on the most visible features of their teaching.

To make matters worse, they also leave us wide open to the greatest threat of all.

The threat of genericism

Treating the Teachers’ Standards as a progression model and the exemplar of good practice forces ITT into genericism. The Teachers’ Standards all too easily become a blunt instrument that cuts haphazardly across all that makes a discipline unique.

Trainees are forced to spend their first year in the profession working on targets that focus primarily on the most visible, generic features of their practice, at the cost of engaging with their subject communities about how to get students thinking meaningfully about change and continuity, or how to get them using the conceptual language needed to analyse causation. It’s no surprise that this produces teachers who have to reinvent the wheel during their early career.

What do we do?

So, what do we do? To be clear, this isn’t a critique of any particular individual or institution. There are some amazing individuals already challenging these problems – Rachel Foster at the University of Cambridge, Will Bailey-Watson at the University of Reading, and Michael Fordham at the Inspiration Trust SCITT. There are also brilliant mentors across the country who work incredibly hard to focus on what matters whilst operating within the limits imposed by the over-emphasis on the Teachers’ Standards that they have to contend with.

However, there remains much to be done. We need subject communities to take a greater role in the provision of teacher training and mentoring, to provide both the expertise and institutional memory to prevent ourselves from reinventing the wheel. We also need a broader conversation about what we want teacher training to achieve (hopefully the ECF will be a useful first step towards this).

And we need to remember what the Teachers’ Standards are for. They’re a benchmark for entry to the profession, and nothing else. And that’s fine.

Teacher training at BFS

Last week a tweet about our approach to teacher training got a fair amount of interest, with several people asking for more details. I quickly realised that I couldn’t explain what we’re doing in 280 characters, so decided it made more sense to write a brief summary of what we’re doing to improve our teacher training offer at Bedford Free School.

Over the past term we’ve started to rethink how we approach the role we can play as a school to ensure that our trainees are receiving excellent training. We realised that there was much more that we could do as a placement school to complement and extend what trainees already receive from their course providers, through providing a research-based internal programme of professional learning and rigorous mentoring. In pursuing this aim, however, we decided to start anew, and think about precisely what good teacher training looks like.

One of the biggest problems is that teaching is so often seen as an entirely skills-based profession. This is partly the result of the over-emphasis on the Teachers’ Standards which, when subdivided into level descriptors, quickly become the heart and soul of the profession rather than a barrier to entry. Our contention is that this isn’t the case – that training shouldn’t just be a chance to practise and develop a series of generic skills.

We believe that there is a core body of knowledge that all teachers should have access to. Those of us who have been to ResearchED or another education conference will be aware that there is a group of ‘switched-on’ teachers who have the knowledge to meaningfully debate best practice. It’s a group of teachers who, when they collect in a summative assessment that Year 10 have just completed, are able to discuss with their departments whether the data is reliable or valid. It’s a group of teachers who are aware of the research on Cognitive Load Theory, the testing effect, and the limitations of working memory, and who can both discuss the potential implications of that research for classroom practice, and identify its potential limitations. It’s a group of teachers who have been exposed to, to borrow Arnold’s phrase, ‘the best that has been thought and said’, and as such have been inducted into the community of educated practitioners.

It’s this knowledge that we think is key. Our belief is that a key part of teacher training should be to induct trainees into the community of practitioners, giving them the knowledge to participate in these conversations. It’s the same knowledge that will help them avoid having to re-invent the wheel during the first few years of their career, and prevent them having to spend years working out the limitations of level-descriptors, or what approaches to differentiation are genuinely effective.

Our first step, therefore, was to try and identify this knowledge. We focused on the four broad categories of curriculum, pedagogy, assessment and behaviour, setting out everything that we thought an ‘expert’ teacher should know, before attempting to identify the ‘must haves’, those that were most important to a trainee. We then built a programme of professional learning designed to give our trainees access to this knowledge. A brief outline is below:

[Outline of the professional learning programme]

Of course, exposing trainees to education research does not mean handing down a rigid template of the best way to teach. Far from it.

Exposing trainees to the research on, say, Cognitive Load Theory, empowers them to take part in discussions about its utility in the classroom, giving them the knowledge to identify the potential benefits it can provide to planning. Furthermore, it’s exactly this knowledge that will enable trainees to, where appropriate, mount a critique of what the research says, and to identify potential limitations. To this end, following each reading, trainees will have a reflection task that involves identifying its main conclusions, but also noting any potential limitations.

Many of these limitations, of course, will come in the form of subject-specificity. One of the greatest risks of the move towards educational research is the poorly thought-through implementation of conclusions from a piece of research with scant regard for the distinctiveness of each subject. Here, our mentors will play a crucial role, supporting and challenging trainees to discern what does and doesn’t work for their subject, identifying potential benefits whilst maintaining the integrity of their discipline.

All of this will, we hope, play an important role in equipping our trainees not only to go on and become excellent practitioners in their own right, but to play pro-active roles in promoting meaningful debate and ideas, both within their first roles and more widely.

If you have any feedback or suggestions, then please let me know via Twitter. Likewise, if you have any other questions regarding what we’re doing, then do get in touch!