Research

The “Just Been to a Conference” Post

You know, this academic year I have attended a whole bunch of training. Some of it external, but much of it internal. Now, I have to admit that I don’t often get to engage with internal training events as a participant so I feel like I miss out sometimes. I’m a bit of a subject specific snob sometimes too – as soon as someone starts to share or discuss a technique which is highly linguistically demanding for learners then I’m afraid you have more or less lost me. I try, and I want to try, but you know, if I can’t see how I can apply the idea as is to my practice as soon as possible, then I’m really going to struggle to engage. Someone once observed that I was “too much of a specialist” but you know, I rather like being an ESOL specialist. It’s never going to score me much by way of a career, perhaps, both in and out of college, but I don’t think I really care. Becoming too generalised in mindset feels to me like selling out, in some weird, undefinable way.

So anyway, this all means that I rather like going to a subject specific conference, as I did on Saturday at the NATECLA National Conference, where I get to talk and think all things ESOL. There are a lot of people I only ever see at these things, which is lovely, of course, but it's also good when there is no need to filter concepts into an ESOL friendly format. Instead, I find myself taking on a whole bunch of new ideas and concepts, or realigning ideas, or just having ideas for simple classroom activities that I can do stuff with.

There were some recurring themes in the sessions I was able to attend, themes which also linked to my own. One of these was reformulation: taking a learner's inaccurate or incomplete utterance and repeating it back to the learner in the correct form. It is a fairly instinctive, natural method of error correction and functions as a sort of "on the fly" input for students:

S: I make my homework.

T: I do my homework.

The session I attended by Richard Gallen from Tower Hamlets College was on that very theme, and on the ways in which classroom conversations can lead to specific learning, and fairly early on he established that the simple act of reformulation, considered on its own, is largely ineffective. I'm sure, as well, that this wasn't news to me, but I can't remember where I picked that up from. However, it does make sense to suggest that simply repeating back the language to the learners is unlikely to lead to anything useful – there's nothing there to encourage the learner to act on the reformulation, no follow up for learners. No, the point is this: for reformulation to work, we need to make things explicit to the students – make sure that the learner notices the reformulation and actually attempts to assimilate it. The phrase that kept coming up during the session was language upgrades, which distinguished nicely for me this kind of conscious improving of language in situ from simply correcting errors. Richard suggested a number of ways to introduce this – recording the language on the board, then getting students to revisit the language in a follow up lesson, perhaps using a slightly different context. If you record all the language reformulations, you can then turn these into simple gap fills, for example, as an activity in the following lesson – to use my example above:

“I always ______ my homework after class.”

There were other things too. Timing is crucial for these language upgrades – it's no good getting the upgrade too late – and it needs to be just at the periphery of awareness: conceptually familiar, perhaps, but not completely linguistically familiar. In short, if you get the upgrade when you need it, "just in time" and "just right", then the language is more likely to stick. Richard quoted here from Leo van Lier's The Ecology & Semiotics of Language Learning, which I am adding to my reading list. There may be a confidence / fluency payoff here – such immediate upgrading is surely going to interrupt the flow of a learner's speaking, but if it makes the language stick, is this a worthy sacrifice? To interrupt fluency like this is a tough call for a teacher whose main focus is often communicative effectiveness, of which fluency is a major part. The challenge, I guess, is making that judgement call in the lesson, and this would depend very much on the learners themselves. There were some interesting insights into learner practices – students who took on the new vocabulary offered in an exchange tended to use that language with some sort of qualifying definition or statement. It was a genuinely interesting thing to see the transcriptions of the classroom conversations, and I really did wonder how practical such a thing might be for me to try one day.

There were plentiful other insights from Richard, things like the notion that grouping learners by similar ability, rather than mixed ability, is more likely to lead to learning because of the quality of upgrades they can offer: the lower level learner in a mixed pair is less likely to act on the upgrades offered, and is also unlikely to be able to offer appropriate upgrades to the higher level student.

What else? Learners remember more lexical feedback than grammatical, and in fact generally ask more questions about vocabulary, although this sort of questioning does tend to happen at higher levels rather than lower. The other humdinger moment for me was the revelation that our learners should be aiming at developing around 12-15 words a lesson in order to progress appropriately.

So I found myself thinking, as one does at these times, about my own lessons. I reckon that I'm pretty good at reformulating and am definitely one for letting language emerge "on demand" in the lesson rather than being overtly dependent upon "input" language. I'm also fairly good at recording the language that arises, usually informally, I think: the day before the workshop I was revisiting an old IWB file with a colleague and found myself wondering how a whole bunch of words had appeared on the slide, words which appeared to have only the most tenuous links to the main information. Where I know I need to do better, then, is the follow up work, the consolidation, if you like, something I want to be much, much better at next year. I think I do it in the lesson, and I've noticed students doing this sort of conscious application of new language in the moment, but as was discussed in the workshop, teachers need to actively promote this kind of emergent, negotiated language in order to enhance learning – students need to know that the language is there and do something with it.

This is, of course, going to appeal to me as a piece of research, and I guess when you sign up to sessions at a conference it is often a bit of an echo chamber – I'm unlikely to be going to sessions on, say, SMART targets, or engaging learners with learning outcomes, because I'd rather scoop out my heart with a spoon than listen to someone extolling cheap performance managed behaviourism, but I'm likely to be battering down the door to a workshop on conversation and emergent language. But then you go to conferences to find out more about things you are interested in, I guess: it's not a comprehensive education, so to speak. I'd have been deeply disappointed to find out about Richard's workshop second hand, whatever happened.

I’ve just seen the wordcount in the bottom corner creeping up towards 1500, so I think I should probably stop. This doesn’t mean I’ve nothing to say about storytelling from Jamie Keddie, just that this post is getting ridiculously long! In a lot of ways Jamie’s talk on storytelling, and ways to exploit videos in line with this, was similar – after all, these kinds of activities often build on language that emerges in reaction to, or as part of, the story – opportunities are presented for emergent language which can be capitalised upon and exploited in just the same way.

So it was a good day, and a good event – I’ve got a serious batch of ideas for next year, which is sort of the point, isn’t it?

Language & the ESOL image problem

Three things this week came together quite serendipitously. First was walking past a British Sign Language class, and seeing the tutor not only teaching BSL, but also using BSL to communicate ideas. The second was a conversation with two non-ESOL teaching colleagues about the SOLO taxonomy and the notion of using “higher order” questions. The third was a tweet from Scott Thornbury, “The problem with EFL/ESL teaching is that, unlike maths, history etc, there is no subject. So the language itself becomes the subject.”

So this set me thinking. You see, I think ESOL in a further education setting has a bit of an image problem. There’s a perception in some corners that we should fit in with everything else, that something which applies to sixteen year old joinery apprentices can be applied without modification to a group of beginner ESOL students, and that our reluctance to do so, or the questions we ask in order to make sense of it in ESOL terms, is seen as ESOL teachers and departments being awkward, stroppy, and obstructive. Don’t get me wrong, mind, because like any teacher, ESOL teachers can indeed be stroppy and obstructive, and I get that. However, there is a serious point here: there is a single and profound difference between ESOL and every other subject, which is that, with the exception, perhaps, of my colleague teaching BSL, every single other subject teacher in a college can communicate directly and unambiguously with their students.

Let’s take questioning as a good example of this. When teaching a subject through a shared language, one quick, effective way of challenging students is to ask questions which probe deeper into the subject, moving from straightforward knowledge of details (“Name three types of…”) to more complex, evaluative and critical questions (“What might happen if…”). This is generally seen as good practice, and, I think, quite right too. When I think of CELTA, for example, I might ask students initially to identify how to use the past simple, and then challenge them to analyse the problems faced by second language learners in using it, or what the barriers might be, or to compare how the past simple is used as a simple past reference and how it is used to describe a narrative. This sort of range of questioning or task-challenge works to push students into thinking beyond just knowing a fact. (For the record, however, you do need to know the fact before you can start to go beyond this. What is commonly referred to as “lower” order questioning is not necessarily worse or less important – if anything it is the most important type of learning, without which all the rest is impossible.)

Trouble is, all of this, every element of this, is entirely language dependent. It assumes on the part of the speaker and the listener a shared language with a fair degree of linguistic complexity. Don’t let snobbery get in your way here: my fictional joinery apprentices have access to an astonishing array of linguistic talents, even the ones who failed GCSEs. The fact that they can understand a question like “what might happen if you used an alternative timber for this?” is a demonstration of a fair amount of language skill.

So we have to consider carefully the value of time spent in training or reading about this when you remove that shared language skill. I simply cannot reliably ask my students “how would you change the verb if it is irregular?” Instead I have to get there a different way. The primary way I use questioning is not to expand in this way, but to apply successive “lower order” questions to build complex knowledge. “Read this sentence: I visited my sister. Am I visiting my sister now? Tomorrow? Before now? Good.” Then the next day I come back and start on irregular verbs, checking and eliciting concepts again using simple questions.

None of this means that ESOL students are incapable of thinking in those terms. Remember these are diverse classrooms on a scale incomparable elsewhere in FE, with teachers, doctors, university lecturers and civil servants sharing a room with hitherto uneducated housewives, farmers and factory workers, and none of these backgrounds can be used to make assumptions about language learning aptitude. To use terms associated with higher order thinking: synthesis, creativity, evaluation and hypothesising are required of ESOL students from the get go when they are challenged to use language in new and unique situations. It’s just that we, as teachers, can’t use the language as a means to get there.

So we have to critically evaluate everything that a generic trainer says. Teachers are pragmatic people, after all, and would like something useful that we can use in our day to day classrooms, and an interesting curio like the SOLO taxonomy has limited, if any, applicability. Ditto Bloom, although it could be used for task design, perhaps. Ditto Socratic questioning, flipped learning, negotiating learning targets, sharing and self assessing SMART lesson outcomes. These are language dependent concepts, and this is the key to everything.

Until you’ve taught an ESOL class, none of this will make sense to you. I’ve seen it in CELTA teaching practice, where a qualified teacher in another subject tries over-complex questions on a low level class and suddenly realises that they might as well have just whistled and farted for all the good it’s done. The good trainees are the ones who realise that they do have to change their paradigm, and alter their classroom behaviours accordingly. Because that is what we are talking about: for a generically trained teacher of a vocational subject, the nature of the ESOL classroom in a UK setting is radically different.

And this can indeed make ESOL teachers seem obstructive when it comes to implementing college-wide initiatives or training opportunities, but they are simply trying to make sense of it all, to take those initiatives and challenges and make them work in their context. And that context is different, profoundly and radically. It’s also what makes ESOL such fun to teach.

A Long Ramble on Evidence and Change. No, really, it’s long. 

I read with some interest a post on “Six Useless Things Language Teachers Do.” I like this sort of thing, and it’s why I read Russ Mayne’s excellent blog not to mention several other blogs, and numerous books around a general theme of evidence based practice, and on the theme of challenging sacred cows. I particularly enjoyed the “six useless things” post because it challenged some of my own holy bovines: recasts, for example, being largely ineffective. This error correction strategy is something we teach on CELTA, although not, admittedly, as a key one, and it’s definitely one I apply. I think that if I do use it, mind you, it’s as an instinctive, automatic response to a minor error, rather than a planned or focussed technique. 

More of a challenge for me was the second point: not so much the dismissal of direct correction of written errors, as this more or less chimes with my own stance on this. I’m not sure it’s totally useless, as the piece suggests, but I certainly don’t think it’s much good. The challenge to indirect error correction (using marking codes, etc.) is more of a tricky one. I agree, for sure, that students can’t be expected to know what they have done wrong, but I wonder if there are perhaps one or two errors that a student can self correct: slips, silly spelling mistakes, “d’oh” moments which they know on a conscious level but perhaps forget when focussing on fluency (present simple third person singular S for higher level students. I mean you). I wonder, as well, if there is a pragmatic aspect here. Most teachers are working with groups of students, not individuals on a one to one basis, and using an indirect marking strategy, combined with making students do something about it inside class time, means that you, as a teacher, are then freed up to go round supporting students with the mistakes that they can’t self-correct. Context also counts for a lot here: a group of beginners is radically different from a group of high intermediate students not only in their language level, but also in their meta-language level. Often, but not always, high level students have been through the language learning system a bit, have an awareness of meta-linguistic concepts, and, crucially, are used to thinking about language.

I could go on, but this isn’t about trying to pick holes, or a fight! It’s a naturally provocative piece – with a title like that, how can it not be? It’s also, as far as I’m concerned, correct on many of the other points – learning styles, of course, learning to learn, etc. – although on that latter one I’d be interested to know how much time should be spent focussing on learning strategies: I’ve got 90 hours, tops, to help my students gain a qualification. How much of that time can my students and I afford to spend on it? If a one-off session is minimally impactful, then I think I probably won’t bother.

What this shows you, and me, however, is that as a teacher I am terribly, horribly biased. I come to the job now with many years of courses, teacher training, reading, research, conference workshops, observing teachers, being observed, getting and giving feedback, in-house CPD, and, of course, a bit of classroom experience. This is bad. Bad bad. Because I have developed a set of routines, of practices, of “knowledge” which are, in fact, very hard to change. Oh, I may make lots of noise about research, about innovation, about challenges and being challenged, reflective practitioner, blah blah blah, but a lot of it, I worry, is so much hot air. 

Take one of my favourite bugbears: SMART targets for ESOL learners. Now let’s imagine that some university somewhere funded some formal research into SMART targets. And they did a massive study of second language learners in a multilingual setting which showed, without question, that students who used SMART targets to monitor their learning achieved significantly higher levels of improvement when compared to those who did not. Let’s imagine that a couple more universities did the same, and found very similar results. In fact, there developed a significant body of evidence that setting SMART targets with students was, beyond a shadow of a doubt, a good idea. Pow!

Now, in our fictional universe, let’s also imagine that I read these reports and am struck by the convincing nature of the evidence, which runs entirely at odds with my opinions, beliefs and understanding. I have to wonder whether, even in spite of this, I would be able to make the massive mental leap of faith and accept that I am wrong and the evidence is right. Could I do it? In a similar vein, if it turned out the evidence was all in favour of learning styles; that technology is, in fact, a panacea for all educational challenges; and that there is a fixed body of objective Best Practice in Education which works for all students in all settings all the time – if all this turned out to be true, could I align myself with all this because the evidence told me so?

Probably not. 

For one, if all these things turned out to be true, I’d probably have some sort of breakdown: you’d find me curled up in a ball in the corner of a classroom, rocking backwards and forwards muttering “it can’t be true, it can’t”. More importantly, however, what this shows is that evidence and facts can say what they want, but the pig-headed stubbornness of a working teacher is a tough nut to crack: it would take a long time for me to adjust, to take on the changes to my perceptions and to work them into what I do. It might not even happen at all: even in the best case scenario, I think I would probably want to cling on to my beliefs in the face of the evidence. 

Unless something chimes with our beliefs about our practices, unless we agree in our professional hearts that something should be true, then short of a Damascene epiphany in front of the whiteboard, it’s going to be extremely hard to embrace it. Let’s not beat ourselves up about it, mind, because that’s not going to help. And let’s not beat up others either: we are, after all, only human, and I have a suspicion that, regardless of our politics, one of the things that professional experience leads to is some form of professional conservatism. How do we get past this?

Expectation, probably, would be a good place to start: it’s too easy for leadership and policy makers to declare that a new practice, with an evidence base, of course, is good and should be enforced. How effectively that gets taken up depends on the size and the immediate visible impact of that practice. When I am leading a training session, I start with a very simple expectation: that everyone go away with just one thing which they can use with immediate and positive impact. It’s unrealistic to expect more, and if an individual takes away more than one thing, then that’s a bonus. To expect more than this from any kind of development activity is probably unrealistic, and actually, so what? If someone takes on a new idea and puts it into place, then that’s a success, surely? We can apply this also to evidence based practice: make small changes leading up to the big change, and the big change is much more likely to happen. This is often not good enough for some leadership mindsets, who demand quick, visible changes, but that is a whole other barrier to teacher development which I’m not going to explore.

Time, of course, would help, but given that FE in particular is financially squeezed and performance hungry, this time will need to come at the teacher’s own expense. No time will be made for you to read, discuss and understand research (and God forbid that you attempt to try anything new during formal observations) so that time must be found elsewhere. Quite frankly, however, even I would rather watch Daredevil on Netflix of an evening than read a dry academic paper providing evidence in favour of target setting. (Actually, I think I would read that paper; so, you know, when you find the evidence, do let me know: because I’m sure that ESOL managers and inspectors have seen this evidence and are just hiding it for some random reason. After all, why would such a thing be an absolute requirement?)

Deep breath. 

I’m sorry this has been such a long post: it’s been brewing quietly while I’ve been off and I’ve been adding bit by bit. But there’s a lot that bothers me about evidence based practice. Things like the way learning styles hangs on in teacher training courses, and therefore refuses to die. Things like the rare and too easily tokenistic support for teachers in exploring evidence and engaging with it. Things like the complexity of applying a piece of evidence based on first language primary classrooms to second language learning in adults. Things like the way the idea of evidence based practice gets used as a stick (“You’re not doing it right, the evidence says so.”) while at the same time being cherry picked by educational leaders and policy makers to fit a specific personal or political preference. Not to mention the way that the entire concept of needing any evidence can be wholeheartedly and happily ignored by those same stick wielders and cherrypickers when it suits them. An individual teacher’s challenge with evidence which runs counter to their beliefs is a far smaller one than when this happens at an institution or policy level. A far smaller challenge, and an infinitely less dangerous one.

I Hate Training Days

You probably have training days in some form or another, if you are a teacher, and you probably spend several days of your academic year student free and attending various workshops and presentations about one thing or another. Now, let’s be perfectly honest, in your life, how many of these have genuinely stuck with you? How many of the workshops have been so strikingly informative that you can measure their impact in your day to day practice?

I am, of course, being deliberately provocative here, and definitely link baiting with the title of this post. After all, there has almost certainly been impact from your training events, and there has almost certainly been stuff, useful stuff, which has stuck with you. But I bet there have been plenty of corporate PowerPoint presentations and workshops involving flip chart paper and post it notes, too.

Mea culpa. I have run these workshops. I have made PowerPoint presentations. I have given you post it notes. I have made you brainstorm ideas on flip chart sheets (and incidentally there is nothing wrong with the term brainstorm, so can we stop calling it “ideas storming” because that’s just a rubbish word). I have even, on one occasion, been left in charge of a half day session for some 40+ people, and I made such an excruciating balls up that I still wake up at night in a cold sweat. Despite this, however, I will probably continue to run workshops and deliver training events in some capacity for some time to come.

Generally, the process for setting up these events is in response to some external or top down driver: internal quality processes identify a gap, and the training day aims to fill that gap. I think there is a need for this kind of thing, and for procedural training and standardisation activities. After all, we need to know how to manage certain processes, and in the nicest possible way, very few individuals would actively go “ooh, I need to go off and learn about equality and diversity” without at least a little prompting. And yes I know you don’t need to do all that stuff, because you never do, but someone probably does.

This is always going to be part of the problem: “it doesn’t apply to me”. Anything based on generalised cross college data is never going to quite fit some people’s needs. But I’ll tell you what: having experienced giving teachers a choice in what they want to have at a staff training day, teachers can be pretty rubbish as well. In some corners of the FE world there exist people who, when confronted with the question “what do you want on training day?” would reply “I don’t know, you tell me.” before then going on to complain that the event didn’t meet their needs.

So here is my answer. It’s quite long, so you may want to pop off and get a cup of tea.

Got one? OK.

Let’s say that you have four days as standard across the whole institution dedicated to staff development days. One of those days is to be divided into two half days, and dedicated to procedural/systems training. This is as a norm. Make it five, if you like, with two days for procedural stuff. Just leave me with three. Two days and a half day if you must.

These three are divided as follows.

Day one, near the start of the year. Every teacher has to devise and submit a proposal for something they wish to research in their own practice. They are given a whole summer of notice for this, so they can think about it well in advance. They can come to the day with an idea, or they can be inspired on the day, but by the close of play, everyone has a reasonably good research question and a number of things to do to explore this. This should, being focussed on classroom practice, involve some sort of peer observation and support group. The support group is there to provide more research based support as much as peer observation, and to make a space for discussion and negotiation of the research aim.

Day two: sometime around January. It’s cold and dark so let’s have some nice warming cakes and mulled wine/hot chocolate at the meeting. By now, various bits of activity should be in place, maybe even finished, and certainly the research should be in the process of being peer observed, shared and discussed. The day would take the format of supported working groups, where everyone shares their findings in a non-critical and supportive context. Next steps are suggested, discussed and agreed. An outline for the final report is shared, to give focus to future activity.

Alternatively day two could be split into two half days, with one of the half days coming a little earlier and being an opportunity to get someone in to talk about some of the things that people are researching. There is, of course, clear expense involved here, so this would probably be impractical. However, it’s a thought.

Day three: Easter, or just after. Either that or at the very end of term. This is so that nobody is worrying about exams and stuff, and has a clear mind. This is the opportunity to share findings, disseminate ideas and practice, and generally look at each other’s work.

This is also supported as an ongoing thing through sections at staff meetings, and informal drop in sessions where anyone can come in and ask questions.

By way of protecting myself from a hundred “yeah but what about…” questions, the version in my head is much more carefully planned and supported, and there is a lot more detail to it than this, but that, essentially, is it. There would, of course, be barriers to such an idea ever taking off, not least of which is teaching staff themselves, some of whom may feel they need to be told stuff, and who still believe in the bonkers notion of the super-duper-expert teacher who can cascade their greatness to the masses. There may also be members of the managerial teams who are reluctant to relinquish such control over what is being done, especially in the face of inspections and the like which insist on knowing what is being done about which issue. However, I’d be willing to bet that a lot of the time would be spent researching stuff which would be applicable to all sorts of inspection-based development outcomes. As for the super duper cascading mega-teacher? They never existed anyway, just a load of clever self-marketers: there is greatness in all teachers. All we need is a better opportunity to find this and share it.

Imagine what the impact of having hundreds of people working on research could be. There would be interdepartmental crossover with themes and topics as well. Teachers with similar or related subjects could be encouraged to work together to support one another, and if the groups were of three or four people, the research could be codified and formally written up, published and taken off to external conferences and events, raising the profile of the institution.

The other thing that this sort of activity promotes is a shift in the mind of the teachers. By becoming teacher-researchers they also become learners, potentially more open to new ideas and experimentation, to discovering things about their practices and their learners. Institutions become learning institutions at every level, from a Principal to an hourly paid tutor. None of that can be bad, can it?

Technology, Bias, and Vested Interests

I promise this is my last critical post about technology and learning, I really do. But I’ve been annoyed by something, an article about a report on how FE is lagging behind in the technology stakes.

There are two issues here.

The first is the accuracy of the claim. Do a search for reports and research into the benefits of elearning and, well, you can’t move for the damn things. Technology, if you believe everything you ever read, is the magic bullet that will engage learners with learning and create a fully functioning digital universe. But hang on: if we look a little more closely at these claims, we discover that actually they are just that – claims. Claims that technology benefits learning. Not evidence. Not hard data, that is – like, say, teaching two groups of learners, one with and one without tech, and seeing who does better in a test. I hope that when you read that you will be sneering at my simplistic test description there, saying “ah, but education, it’s just too big and complex to study it in that way” (although you’d be wrong: it would be perfectly possible to devise a fairly straightforward trial of technologised learning in that way, and if anyone would like to give me some time and money, I’ll do it, because I’m interested). The reason I hope you’ve said that is because it underlines the fatal flaw in your argument as well. If you want to argue that you can’t measure education that way, because it is complex, multi faceted blah blah, then you also have to ask yourself the question “how do I know technology works?” Your evidence is as flimsy and as questionable as anyone else’s. That gives us a no score draw at best.

Ok, the second issue. Where evidence does exist, or is claimed to exist, it is often in the reports of people with a vested interest in education taking up the use of technology. Take this one, published by Microsoft and Intel, the suppliers of much of the hardware and software used across the vast majority of FE colleges in the UK. Colleges which, perhaps, are upgrading a little less often than a few years back, buying fewer computers, that sort of thing. From a business perspective, then, persuading colleges and their learners to invest in their products is potentially very lucrative for two multinational corporations who care not one jot for the young people and adults of the UK, apart from in their capacity as consumers. Profit is rarely a wholesome motive, even less so when dressed up as the public good.

Vested interest is a dangerous thing: in medical science, for example, research into the effectiveness of a given intervention, when funded or carried out by people with a vested interest in said intervention working, is generally peer reviewed and checked for accuracy in its modelling. It’s not a flawless process, but it does cultivate a culture of scepticism and questioning. Alas, in education, we are generally too soft and lacking in confidence for this sort of thing, especially when policy gets behind whatever random claim is being made. Indeed, claims for rigour and robustness suddenly seem laughable in education when we still, as a profession, from teachers on the ground floor all the way to senior managers, don’t engage in enough research to understand how much of what is presented to us as fact, from e-learning to Bloom’s Taxonomy to hell-in-a-handcart learning styles, is entirely open to discussion and questioning. Nothing is sacred: most scientists can tell you that. Unfortunately, education sometimes seems to be founded on the kind of journalistic psychology not unlike that found in tediously aspirational magazine articles.

I think what I would like is to find some neutral, unbiased studies of actual learners in an actual context who actually end up being more successful than others as a result of engaging with technology. I don’t see it happening any time soon, mind you, so that wish will go into my special internal box of wishes, right next to the one for solid evidence in favour of target setting for ESOL learners. Both practices are policy driven, rather than learner or indeed reality driven, but the difference for me with technology is that I do think it has some positive impact. I’d like to do that research, because I have a suspicion that the technology would be proven right. I don’t think it would be as amazingly wonderful as people would like to believe and I certainly don’t think that teachers can be replaced with hole in the wall grannies or “learning coaches”. I just think we need to redefine what a teacher is, and how they interact with their learners with technology, which is perhaps where my next post is going.

Pascal’s Bicycle Helmet

I have recently started wearing a helmet again when I ride my bicycle. Anyone who reads or thinks anything about riding a bike beyond “it’s a handy way of getting about” will come across the quietly furious debate in cycling circles over helmet wearing. I stopped for a while because
a) on average, the chances of a helmet helping me when some idiot (male or female: they’ve both had a go) in a Mercedes-BMW-Range Rover* sideswipes me are probably pretty small,
b) the UK road system needs improving, as it is designed primarily for the benefit of car-brained idiots, and essentially antithetical to everyone else; and
c) I look an utter dork when I wear one.

But then I had a bit of a think, and, dorkiness aside, the chances of the average British driver ever giving two hoots about any other person on the road are very slim indeed, and certainly no government is going to annoy the oil companies for the time being by actively discouraging car use. I also know that while, statistically, a helmet is unlikely to save you, it doesn’t hurt to wear one. I’m not super sporty and so not put off by a few extra grams or lack of streamlining, and, crucially, absolutely crucially, I reasoned that wearing one may or may not protect you, but not wearing one will absolutely not protect you. So it makes logical sense to wear one. (But not obsessively. If I’m popping to the shops and for some reason I can’t find or forget the helmet, I’m not going to be distraught.)

Reasoning like this is a cycling helmet version of Pascal’s Wager. Now, for those of you who have not yet read the appropriate text, Pascal’s Wager was dreamed up by the French philosopher Blaise Pascal (obviously) in the 17th century as an argument against atheism. It runs like this. If you don’t believe in God and you are wrong, you will end up in Hell. If you don’t believe and you are right, nothing happens either way. If you do believe in God and are right, you end up in Heaven. If you do believe in God and are wrong, then it doesn’t matter. In short, the wager says that you may as well believe in God, because the odds of you ending up in Hell are reduced. There are flaws here, but you’ve got to admit it’s got some nice logic.
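If you like, the wager can be laid out as a little decision table. This is just a toy illustration – the payoff values are my own (deliberately crude) stand-ins, not part of Pascal’s argument:

```python
# Pascal's Wager as a toy decision table. The payoffs are illustrative
# assumptions: infinitely good (Heaven), infinitely bad (Hell), or neutral.
payoffs = {
    ("believe", "god_exists"): float("inf"),      # heaven
    ("believe", "no_god"): 0,                      # doesn't matter
    ("disbelieve", "god_exists"): float("-inf"),   # hell
    ("disbelieve", "no_god"): 0,                   # nothing either way
}

def worst_case(choice):
    """The worst possible outcome of a choice, across both states of the world."""
    return min(payoffs[(choice, state)] for state in ("god_exists", "no_god"))

# Picking the option whose worst case is least bad favours belief:
best = max(("believe", "disbelieve"), key=worst_case)
print(best)  # believe
```

The helmet version works the same way: wearing one has a worst case of “no help”, while not wearing one has a worst case of “definitely no help”.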

There’s a link to teaching here as well. Teachers do this sort of thing a lot. We are, by our natures, hoarders and thieves, professional magpies, and we won’t, as a rule, let go of something, just in case. Not just materials, mind you, but also methods. Teachers hang on to techniques and methods and neuro-bollocks, again, just in case. So learning styles inventories persist at the back of the popular mind, people default to treating ILT as “good” regardless of whether or not it is actually making the blindest bit of difference, and, of course, despite most evidence to the contrary, we continue to set SMART targets for ESOL learners.

But, you say, where’s the harm? Let’s include some stuff on learning styles, some ILT, set some targets, because, well, it’s not going to hurt. Let’s spend ten minutes on brain gym. It might do something, but even if it doesn’t, it hasn’t actually done any damage.

Well, actually, no, you’re wrong. Dead wrong. A hundred and fifty percent wrong, in fact, as wrong as my maths in the first part of this sentence, as wrong in my proposition that this, technically, is a sentence.

Spending classroom time on non-learning bollocks like learning styles is, in fact, damaging. For the bicycle helmet, the logic holds, particularly for a fattie like me: with the uncomfortable levels of force which would be applied to my head, any kind of protection is likely to do some good. For Pascal, the logic fails because it ignores a deity’s omniscience (don’t you think that God might notice your craven, self-serving pretence?) and, frankly, is a bit of a wussy cop out.

The neurobollocks one also fails because promoting anything like this in the classroom, even if it is just to get the learners thinking about their learning, only succeeds in perpetuating the myth. You want to talk about different things learners could try to help their learning? Great, I’m with you all the way, but why on earth do you feel the need to pigeonhole the learners first? And think about the morale of a learner who does your little learning styles bollocks test, who discovers that they are mostly an auditory learner, but who is studying Joinery, which is going to involve lots of doing stuff with your hands, and much less listening time. And then, just to cap it all off, you say “well, don’t worry, because all I wanted you to do was have a think about your learning”, which totally negates the last one and a half hours you spent on the whole sorry business.

You run the same risk of damage with very established, yet entirely unproven, practices like setting SMART targets with ESOL students (see Chapter 14 here, and the hitherto unanswered challenge laid down here, almost ten years ago). Maybe I’ve misinterpreted what SMART targets are about, but if they are to be used for planning and as evidence of learning, then I think we are on very rocky ground. Target setting, unless I’ve been reading the wrong research, simply does not link to the way in which language develops in adults, whatever view of learning you take. A learner’s language necessarily goes round in cycles, where bits develop not in directly evidenceable and logically sequenced stages, but in more of a circular, waltzing variation of two steps forward, one step back, with the waters muddied by issues like first language transfer. Achievement in the language classroom doesn’t always immediately become ability in the wider world: it’s not that simple. So when you say to learners “let’s track your learning” through simple little activities like “write five sentences about your house using adjectives” or “talk about my friend’s day using verbs correctly” or “use articles correctly in a short paragraph about yourself”, you are constructing a set of false expectations. Learners are being encouraged to think that when they have achieved that target they have therefore learned that thing, and the chances are pretty high that, for a while, anyway, they won’t be able to reproduce it in a different context, or indeed the same context, on demand and as required. (Incidentally, if all ESOL teachers genuinely waited until learners could reliably do whatever it is we are teaching, then there would be uproar as success rates plummeted.) As soon as learners realise that they can’t do it, I wonder what happens to their morale?

In short, then, there are practices which we adhere to for whatever reason which are not just pointless, but potentially damaging to learners in the longer term. Sometimes we stick to these because we ourselves are wagering, mistakenly, that the potential benefits outweigh the potential drawbacks. Sometimes I suspect we stick to these because we are told we should, because again, someone, somewhere is making the same wager.

So maybe we need to approach all practices in education not so much with an open mind, but with a questioning mind. I’m not saying we should ever immediately dismiss anything out of hand, or become hardened sceptics, but we should always, always ask those challenging, difficult questions of the person telling us about it.

***

* I have a theory that whenever anyone buys a large car of some high status marque, they also have a kind of lobotomy which removes common sense, care and respect.

Why Learning Styles are important

Don’t worry, I haven’t cracked. Not yet, anyway. But learning styles serve as a nice object lesson in why evidence is, and should be, important.

In the last few weeks, the issue of evidence based practice has been floating around my mind, mainly since Ben Goldacre published, on behalf of the DfES, and at the request of the delightful Minister for Education, a short and accessible paper on how randomised controlled trials (RCTs) in particular, and research and evidence in general, can benefit teachers, and teaching and learning.

The main paper is published here, with a summary as a Guardian article here; a response, of sorts, published by those fine folks at the Institute of Education, can be found here. There have been a number of other responses as well, all of them interesting: from the Guardian blog, an evaluation by a scientist, which includes a link to a response from the British Educational Research Association – a response so charmingly sniffy and redolent of old-fashioned academia you can practically see the tweed and the half-moon glasses perched partway down the writer’s nose as they type it out on an old manual typewriter in an oak panelled study.

But what came out of it for me was the importance for teachers of being able to draw on solid evidence and research to support their points of view, and to empower them to have a more even and fair debate when discussing their own practice, rather than relying on the “vegetable cures cancer!” type of science you find in articles in the Daily Mail, or “psychology” articles in that particularly aspirational class of magazine aimed at women* (“Live life more fully! Revitalise your neurones with daily mushroom meditation!”). This kind of pop psychology seems to inform a lot of what teachers do, and it’s often sold as such to us, using the same kind of chatty journalistic writing, snazzy graphics, and so on. But so rarely do teachers turn around and say “yes, but…” And for me, research and study, of any sort, gives weight to your arguments, especially when you are being presented with some smug trainer type in Converse trainers and a goatee beard saying things like “research says…” Hey, trainer, we should be saying, tell me what your evidence is, give me the chuffing reference!

Anyway, all this set me to wondering, and discussing online (again), not just learning styles, but also their spiritual cousin, the wonderful cobblers which is Neuro-Linguistic Programming. (I had an awkward moment the other week when a teacher told me she was interested in how NLP could help her teaching. I’m a nice person, so I nodded and smiled, and absolutely resisted the urge to say “and while you’re at it, you could also look at using fairies, aliens and the One Ring.”)

You would think that by now the message would finally be sinking in about learning styles, but as this sort of not-even-pseudo-scientific tosh demonstrates, the idea still clings on, temptingly, sexily wooing people with phrases like “optimal learning styles… customize the perfect opportunity for children to grow… strategizing lesson plans…”.

Even when people admit there is no evidence base for VAK, they still somehow want to use it as a justification for using different modes of delivery – sometimes using visuals, sometimes using audio, sometimes doing physical activity, etc. It’s as if they want the justification, the pop-science support for their own ideas. Rather than drawing, as teachers do, on their own experiences and the experiences of others, it’s as if we need the comfort of something science-y sounding to justify what is common sense: it’s more interesting to do stuff in different ways. That, basically, is the appeal of learning styles as a theory, it’s nothing more complex than that. No matter the evidence base, if you get learners to spend two hours faffing about with cards or posters, or listening to a lecture, it is going to be boring as hell. We don’t need to pretend that it’s about meeting individual needs, yadda yadda (it’s not, anyway): we do stuff in different ways because it’s interesting to change approach.

The learning styles/NLP discussion is important because it shows us that teachers do need to engage with evidence, read it critically, and do need to be able to say “this works because…” when discussing practice. When we don’t, no matter how well meaning we are, we spend twenty years peddling lies to learners. So can we stop. Now. Please. I may pop otherwise.

***

*you know these; they usually have inspirational stories of how a wealthy, middle class woman in her 30s gave up her 9-5 job as senior partner in a law firm to set up a boutique candle making company in the Cotswolds, struggled with the changes, but was luckily supported by her husband, an astonishingly well paid company director.

Evidence: something I’ve learned from being a parent

Having babies is a life changing experience, I can tell you this now with some conviction. The benefits, for most people, balance out the assorted deprivations (sleep, money, sanity). It’s fab, and there is no feeling in the universe like having your son or daughter come up to you with the seriousness and honesty that only a small child can manage, and say “I love you” and “I’ve just snotted in your hair. Hee hee.” And watching them grow and change and do all the brilliant children things which adults have forgotten. Cheesy, yes, but also undeniably wonderful. No disrespect meant, but other people’s children really don’t cut the mustard here.

But why am I getting all sentimental here, on my usually cynical and questioning blog, you may ask? It’s because of one of the worst things about being a parent. It’s not the children; the problem is that everyone, and I mean everyone, including people whose only interactions with babies and children were brief ones 20 years ago, has an opinion. Everyone thinks they know how to do it, what advice to give, what is best for children, what is worst, what to avoid, and so on. So you will get (I kid you not) dopey statements like:

“ooh, stop rocking her to sleep, you’ll make a rod for your own back”
“make sure you trim that fringe out of her eyes, she’ll go blind”
“don’t you think that she’s getting too attached to you?”
“Don’t let them into your bed, you’ll never get them out”
“Of course he does that, he’s a boy.”
“You’ll just have to get used to the pink stuff, she’s a girl”

All of which, like most children-based theories and discussions (cf. the almost violent arguments you hear about breast “vs” bottle), are based not on the individual finding out about proper research but rather on personal anecdote, urban myth or “what I read in the paper”. (Or more often: “what the people on TV said when they talked about the headlines in the paper”.)

This is not unlike advice you get about teaching (including the point about TV). Very, very few people read research into education, including those who should know better (including me, sometimes, I have to be honest). I have yet to meet someone in FE who can cite much beyond Geoff Petty (for whom, I should add, I have the greatest respect). And practitioners get sniffy about academic research: “what do they know, they don’t have to do it every day.” So battle lines get needlessly drawn, we ignore research-based evidence, and take on practices placed on us without any supporting evidence offered – sometimes where no evidence exists beyond anecdote and hearsay.

There’s nothing wrong with anecdotal evidence. A lot can be learned from this “it worked for me, why don’t you give it a go?” approach, as long as the practice can be rejected if it doesn’t suit. As long as it doesn’t get dressed up as “best practice” or worse an essential component of practice because it suits a particular ethos or political viewpoint.

The other challenge with evidence is that it takes time to read it. But I would strongly recommend it. Don’t just read “Evidence Based Practice”, as good as it is, and believe everything, but look through the references, and see which apply or are analogous to your own context of teaching. While there are indeed parallels between the learning of 16 year old ICT students and 30 year old ESOL students, and while there may be something to learn from this, these parallels are never absolute nor applicable in every context.

And we need to read the research well. Avoid the “scientists say” lazy thinking applied by newspaper articles. Read it critically. Certainly before you start throwing bricks at established practices, find some good evidence to add weight, particularly if your intention is to smash windows. Anyone can write up an extended anecdote as a case study but it needs to be read as such, in the wider context in which it is written.

Then there is the issue of bias. I’ve always been blasé about bias, especially in education research, assuming always that educators are writing to make a better experience for learners. However, education research is just as dogged by bias, particularly articles in wider, less scholarly texts. I’d want to question anything funded by a government or a government agency which almost certainly will have a bias. Bear in mind that no government quango or department, using their own publication channels, is going to happily publish research which goes against what it has been saying for the last x years, after all. And even if an independent, thorough, critically reviewed piece of research came out which demonstrated that some key element of the Common Inspection Framework was wrong, then it would take years before anything changed. Look how long it took to get learning styles out of the inspection regime. (Note for readers in the 22nd century: learning styles was a bonkers idea cooked up by businessmen in the 1980s which lots of people slavishly followed in a misguided attempt to meet individual needs)

Technology in education is another area where bias might easily be an issue: who paid for, or who is promoting, the studies into the effectiveness of interactive whiteboards? If it’s SMART Technologies or Promethean, then you have to read very, very carefully. It’s very seductive to read something like this http://www.fenews.co.uk/fe-news/the-mobile-e-volution and think “hmm, interesting, authoritative, well written”, and yet the writer clearly has an interest in getting us to engage with a product which will enable us to do the very thing he is telling us about. This fact makes any claim or suggestion made therein highly suspect: he’s hardly going to exhort us to abandon the ideas that drive the software he is promoting. In the nicest possible way, because I do love them, an organisation like JISC is unlikely to support research which goes against its main raison d’être. The same line of argument could be applied to published materials: a coursebook publisher is unlikely to argue for materials-light learning (unless, of course, they’ve just published a “how to teach materials light” book…).

Evidence is sometimes sadly lacking when it comes to influencing much of what we do, which is ridiculous when you think about it. After all, educational progress and achievement are regularly and reasonably thoroughly tested, and form a fairly well researched and understood area of education, so assessing the impact of a given measure is not always going to be that hard. You would have to be very careful with things like control groups, and with making sure that teacher perceptions are taken into account, for example, but it could be done. But also research, in its basic form, is learning, and really, learning is what we are all about.

***

For more on evidence and so on, try reading these, http://media.education.gov.uk/assets/files/pdf/b/ben%20goldacre%20paper.pdf and http://thesuttontrust.wordpress.com/2013/03/13/evidence-is-just-the-start-of-finding-what-works/ both arguing not just for better evidence, but also discussing what to do with the evidence once you have it.

The Wobbly Path to Teacher Development

[Graph: two contrasting paths of teacher development over time]

Have a look at the graph above (the reference escapes me, but do comment if you would like to remind me to pop the reference up there). It’s an illustration of two different types of teacher and how they develop. Some teachers develop to a point, then stagnate. Other teachers continually experiment and develop and although (as a result of the experimentation) that performance may dip below a particular standard from time to time, the overall trend is for continual improvement above and beyond that of the stagnated teacher.

Which is better? The official line here is that the better person is the person who improves the most, but this is a line which would like that improvement not to drop or wobble, but to improve and develop in a clear, preferably predictable straight line.

Alas, this is not how it happens. The path to professional development requires experimentation and, as anyone with a basic grasp of science can tell you, one of the points of experimentation is to identify that which doesn’t work. Experiments fail, and are modified, reflected upon, reported upon, and tried again.

So for a practising, developing teacher, what is the practical impact of this? Naturally there has to be some recourse to the standard: after all, we are not dealing with chemicals and micro-organisms but real young people and adults. However, we can’t assume that at the end of any teacher training course the people who pass that course are as full as they can be of the required knowledge and experience! There has to be improvement from that point onwards.

But if that improvement doesn’t, as I think, happen in a straight line, what if that teacher’s performance is observed as being below that line? Even though their average and eventual performance will be better than that, the perpetual focus on the once-a-year snapshot of teaching and learning means that there is every chance that you get observed on a down dip.

So what’s the answer? Do nothing in your observation window which could go wrong? OK, and I’ll admit I’ve done that, especially where there is a lot at stake, but by and large I explore ideas and alternatives in my classroom practice across the year. However, what message does this send to anyone who is not inclined to experiment?

It could be argued that the one lesson, once a year model of classroom observation actively encourages mediocrity. The coaster’s argument, with some logic, is that if they are better playing safe during observation week, then why bother doing anything else? I’m not saying they are right, but rather than necessarily challenge the coaster to improve, the standard model of observation suggests simply that they are better to play safe.

And raising the bar for acceptable performance, as OFSTED did recently, isn’t the answer, either. Coasting can happen at any level of performance. One can coast nicely on the back of a grade 2, or at a grade 1. It doesn’t mean that teacher is innovating, pushing boundaries or exploring alternatives.

The danger with a simple depiction like the graph, of course, is that it suggests that there is a ceiling on development. If I were to redraw the graph I might still lessen the angle of the improvement towards the top to represent the move from big developmental leaps (such as those made by the beginner teacher) to very small, incremental developmental steps, made by the very experienced teacher. There is a gap between what I know (and do) and what there is to know, but it is a lot smaller than it was 13 years ago. That gap is unlikely to ever close, as each group of learners, indeed each lesson I teach, brings countless small (and sometimes large) challenges and changes to my learning as a teacher.

10,000 hours

I attended some training the other day on teacher development resources, all of which were wonderfully backed up with evidence (oh, happy day), and was reminded of two professional development ideas. The first was the concept, popularised in a whole bunch of books but which I first encountered in Outliers by Malcolm Gladwell, that expertise arises as a result of 10,000 hours of deliberate practice. The second was what I think of as the wobbly line of professional development, to which I shall come back in a later post.

The 10,000 hours theory is quite a tempting one. The idea states that in order to become an expert, one must engage in 10,000 hours of deliberate practice. You can cite all sorts of informal case studies on this: Bill Gates, Mozart, Tiger Woods, and so on. But how does that apply to being a teacher? We’ll ignore for now that Gladwell rather snidely, to my mind, points out that subject specialists who don’t become experts often become teachers, a rather dressed up version of the “those who can’t do, teach” saying. But when we consider how one develops as a teacher, as my colleague pointed out to me, we don’t get 10,000 hours practice. We get 10,000 hours doing which isn’t always the same thing. A concert pianist or ballet dancer can, and does, spend significant amounts of time practising alone, or with isolated support. A teacher isn’t afforded the same luxury of time and space. Imagine if the concert pianist had to perform for up to 5 hours every day, 5 days a week, in front of an audience, with very little opportunity to stuff up, lest the audience write a negative report on their performance, which could lead to their eventual dismissal.

I would also challenge the assumption that 10,000 hours of deliberate practice is everything. A lot of theories now about expertise, and indeed about learning, seem to be dismissive of the notion of talent and innate skill. It’s an uncomfortable idea, that a person may, through a combination of temperament, background, beliefs, and more, just not be suited to being a teacher. (It’s about this point I get accused of being a fascist.) I would even be so bold as to say some parts of a person’s genetic makeup could have an impact on this. (This, by the way, is not me saying that there is a teaching gene, simply that genes can have an influence on some of the things which will make you a good teacher.) I have practised and practised, for example, but I acknowledge my own poor hand-eye coordination: cutting a straight line with a saw on a piece of wood is something that bamboozles me, as is fixing bicycle brakes. I’ve been shown, read advice, watched other people, had on-the-job feedback, and I’ve tried them out plenty of times, but I’m not even close to being able to do it, never mind being an expert.

Other factors also need to be taken into consideration, like motivation. The mediocre joiner who decides that they have had enough of running their joinery business and would like less stress (ha ha) is less likely to become a great teacher than a mediocre joiner who has been inspired to be a teacher after working with several trainee joiners at work. Age also has an impact: for whatever reason (there are several, and I’m not going to discuss them here), you can’t get away from the fact that, by and large, the older you get the harder it is to learn a language. You also can’t factor out class and money: 10,000 hours is an awfully long time, and time is an expensive luxury restricted to the middle classes. Bill Gates and Tiger Woods may not have come from wealthy backgrounds, but they certainly weren’t poor. There’s also the issue of task complexity. Managing a classroom, playing a piano, playing tennis, computer programming: these are all fairly complex tasks. But I learned to make good baked beans from a tin pretty quickly. 10,000 hours of practice isn’t about to make me a better baked bean maker.

In a fair society, concerned with equality, this is uncomfortable, and opportunities do need to be made for all members of society to succeed. But to say that 10,000 hours of deliberate practice is going to make anyone expert, and that’s that*, is a hopelessly naïve view of learning and development. The idea is simply too trite and simplistic to be anything other than an observation of some people in some cases, from which we can gain some useful insights about the value of practising a new skill generally – although “to get better at something you need to practise it” isn’t really something which needs a great deal of research.

The final thing that occurs to me is this. If it does take 10,000 hours of deliberate practice, and we focus this on classroom practice, then me teaching for about 13 years, averaging, let’s say, 22 hours of class time a week for 36 weeks of the year, gives a total of 10,296 hours. Hurrah, I am now an officially sanctioned “expert” teacher. Well done me. Let’s have a party. But where does this leave the novice teacher? Can she and I be measured fairly against the same criteria? We are, of course, but how can she best develop if she is expected to be performing at a certain level the minute she steps off the teacher training course? And development from here is the theme for my next post!
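For the sceptical, the back-of-the-envelope sum above checks out (the 22 hours and 36 weeks are, of course, my own rough estimates):

```python
# Back-of-the-envelope check of the classroom-hours arithmetic above.
# The figures (22 hours/week, 36 weeks/year, 13 years) are rough estimates
# from the post, not measured data.
hours_per_week = 22
weeks_per_year = 36
years = 13

total_hours = hours_per_week * weeks_per_year * years
print(total_hours)  # 10296 - just scraping over the fabled 10,000
```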

*********

*it’s worth noting that most writers on the subject do pick up on these criticisms and acknowledge the interplay between different factors, but it’s usually “inspirational” speakers of the Dale Carnegie mould, business leaders and politicians who whittle the complexity of learning down to funky “headline figures”.