Evidence

A Long Ramble on Evidence and Change. No, really, it’s long. 

I read with some interest a post on “Six Useless Things Language Teachers Do.” I like this sort of thing, and it’s why I read Russ Mayne’s excellent blog, not to mention several other blogs and numerous books around the general theme of evidence-based practice and of challenging sacred cows. I particularly enjoyed the “six useless things” post because it challenged some of my own holy bovines: recasts, for example, being largely ineffective. This error correction strategy is something we teach on CELTA, although not, admittedly, as a key one, and it’s definitely one I apply. I think that if I do use it, mind you, it’s as an instinctive, automatic response to a minor error, rather than a planned or focussed technique.

More of a challenge for me was the second point: not so much the dismissal of direct correction of written errors, as this more or less chimes with my own stance. I’m not sure it’s totally useless, as the piece suggests, but I certainly don’t think it’s much good. The challenge to indirect error correction (using marking codes, etc.) is a trickier one. I agree, for sure, that students can’t be expected to know what they have done wrong, but I wonder if there are perhaps one or two errors that a student can self-correct: slips, silly spelling mistakes, “d’oh” moments which they know on a conscious level but perhaps forget when focussing on fluency (present simple third person singular -s, higher level students: I mean you). I wonder, as well, if there is a pragmatic aspect here. Most teachers are working with groups of students, not individuals on a one-to-one basis, and using an indirect marking strategy, combined with making students do something about it inside class time, means that you, as a teacher, are then freed up to go round supporting students with the mistakes that they can’t self-correct. Context also counts for a lot here: a group of beginners is radically different from a group of high intermediate students, not only in their language level but also in their meta-language level. Often, but not always, high level students have been through the language learning system a bit, have an awareness of meta-linguistic concepts, and, crucially, are used to thinking about language.

I could go on, but this isn’t about trying to pick holes, or a fight! It’s a naturally provocative piece: with a title like that, how can it not be? It’s also, as far as I’m concerned, correct on many of the other points: learning styles, of course, learning to learn, etc., although on that latter one I’d be interested to know how much time should be spent focussing on learning strategies. I’ve got 90 hours, tops, to help my students gain a qualification: how much of that time can my students and I afford to spend on it? If a one-off session is minimally impactful, then I think I probably won’t bother.

What this shows you, and me, however, is that as a teacher I am terribly, horribly biased. I come to the job now with many years of courses, teacher training, reading, research, conference workshops, observing teachers, being observed, getting and giving feedback, in-house CPD, and, of course, a bit of classroom experience. This is bad. Bad bad. Because I have developed a set of routines, of practices, of “knowledge” which are, in fact, very hard to change. Oh, I may make lots of noise about research, about innovation, about challenges and being challenged, reflective practitioner, blah blah blah, but a lot of it, I worry, is so much hot air. 

Take one of my favourite bugbears: SMART targets for ESOL learners. Now let’s imagine that some university somewhere funded some formal research into SMART targets, and that they did a massive study of second language learners in a multilingual setting which showed, without question, that students who used SMART targets to monitor their learning achieved significantly higher levels of improvement than those who did not. Let’s imagine that a couple more universities did the same, and found very similar results. In fact, there developed a significant body of evidence that setting SMART targets with students was, beyond a shadow of a doubt, a good idea. Pow!

Now, in our fictional universe, let’s also imagine that I read these reports and am struck by the convincing nature of the evidence, which runs entirely at odds with my opinions, beliefs and understanding. I have to wonder whether, even in spite of this, I would be able to make the massive mental leap of faith and accept that I am wrong and the evidence is right. Could I do it? In a similar vein, if it turned out the evidence was all in favour of learning styles; that technology is, in fact, a panacea for all educational challenges; and that there is a fixed body of objective Best Practice in Education which works for all students in all settings all the time; if all this turned out to be true, could I align myself with it because the evidence told me so?

Probably not. 

For one, if all these things turned out to be true, I’d probably have some sort of breakdown: you’d find me curled up in a ball in the corner of a classroom, rocking backwards and forwards muttering “it can’t be true, it can’t”. More importantly, however, what this shows is that evidence and facts can say what they want, but the pig-headed stubbornness of a working teacher is a tough nut to crack: it would take a long time for me to adjust, to take on the changes to my perceptions and to work them into what I do. It might not even happen at all: even in the best case scenario, I think I would probably want to cling on to my beliefs in the face of the evidence. 

Unless something chimes with our beliefs about our practices, unless we agree in our professional hearts that something should be true, then short of a Damascene epiphany in front of the whiteboard, it’s going to be extremely hard to embrace it. Let’s not beat ourselves up about it, mind, because that’s not going to help. And let’s not beat up others either: we are, after all, only human, and I have a suspicion that, regardless of our politics, one of the things professional experience leads to is some form of professional conservatism. How do we get past this?

Expectation, probably, would be a good place to start: it’s too easy for leadership and policy makers to declare that a new practice, with an evidence base, of course, is good and should be enforced. How effectively that gets taken up depends on the size and the immediate visible impact of that practice. When I am leading a training session, I start with a very simple expectation: that everyone go away with just one thing which they can use with immediate and positive impact. It’s unrealistic to expect more from any kind of development activity, and if an individual takes away more than one thing, then that’s a bonus. And actually, so what? If someone takes on a new idea and puts it into place, then that’s a success, surely? We can apply this to evidence-based practice too: make small changes leading up to the big change, and the big change will be much more likely to happen. This is often not good enough for some leadership mindsets, which demand quick, visible changes, but that is a whole other barrier to teacher development which I’m not going to explore.

Time, of course, would help, but given that FE in particular is financially squeezed and performance hungry, this time will need to come at the teacher’s own expense. No time will be made for you to read, discuss and understand research (and God forbid that you attempt to try anything new during formal observations), so that time must be found elsewhere. Quite frankly, however, even I would rather watch Daredevil on Netflix of an evening than read a dry academic paper providing evidence in favour of target setting. (Actually, I think I would read that paper; so, you know, when you find the evidence, do let me know, because I’m sure that ESOL managers and inspectors have seen this evidence and are just hiding it for some random reason. After all, why would such a thing be an absolute requirement?)

Deep breath. 

I’m sorry this has been such a long post: it’s been brewing quietly while I’ve been off, and I’ve been adding to it bit by bit. But there’s a lot that bothers me about evidence-based practice. Things like the way learning styles hangs on in teacher training courses, and therefore refuses to die. Things like the rare and too easily tokenistic support for teachers in exploring evidence and engaging with it. Things like the complexity of applying a piece of evidence based on first language primary classrooms to second language learning in adults. Things like the way the idea of evidence-based practice gets used as a stick (“You’re not doing it right, the evidence says so.”) while at the same time being cherry-picked by educational leaders and policy makers to fit a specific personal or political preference. Not to mention the way that the entire concept of needing any evidence can be wholeheartedly and happily ignored by those same stick-wielders and cherry-pickers when it suits them. An individual teacher’s struggle with evidence which runs counter to their beliefs is a far smaller challenge than when this happens at an institutional or policy level. A far smaller challenge, and an infinitely less dangerous one.


Faith and Stuff

“We weren’t supposed to be, we learned too much at school, now we can’t help but think the future that you’ve got mapped out is nothing much to shout about.”

Pulp – Mis-shapes

You will be glad to know that I have no plans to come over all Dawkins at you: your faith, like the absence of mine, is an entirely personal affair, and none of my business. That’s a hint, by the way, for anyone ready to save my soul with a comment…

No, this isn’t a post about that kind of faith. This is a post about teacher faith. You see, not so long ago I was a keen, enthusiastic little trainee on a part-time training course, and there were all these people telling me stuff. Stuff that would make me a good teacher, stuff that I needed to do to pass the course. That sort of stuff. Then I did DELTA, and learned shitloads of stuff. And I believed every single word, referenced or not. Because they were trainers and they knew their stuff, right? And then later on I started working in the public sector and managers told me stuff, and people said that certain stuff was best practice and that I should do that stuff because OFSTED said it was good stuff. I was a true believer. I listened, I absorbed, I followed the True Path of the Righteous.

I can pinpoint, to within a few months, the arrival of my professional scepticism. It was around September 2004, and it was the insistence that setting targets helped ESOL learners learn English. It was my first encounter with the “because it’s good practice” non-argument, and when pushed for a better explanation, nobody could come up with anything. (Still waiting for the research to prove it, by the way.) Suddenly all these authority figures saying stuff began to sound, well, unauthoritative. I asked for evidence, politely, and was politely rebuffed. So I tried to find out for myself, and I found a big fat heap of nothing. Lots of “best practice guidelines”, lots of advice, and a tiny little bit of not very informative “how to” guidance, but nothing that said “this works because this research and this study said so.” I started to look around at all the other stuff I’d been told over the years, looking for evidence for all sorts of things, and by golly was it interesting. Some of it was there, some of it wasn’t, but an awful lot of what I’d been told was good practice had little or no evidence base. The whole house of cards started to look very rickety indeed.

We place a lot of faith and trust in our teachers. Necessarily so. ESOL students trust that we are telling the truth when we say that we use some for positive statements and any for negatives (I have some apples, I don’t have any bananas), even though it later turns out that this is not really the rule as it is used (I like most pop music, but I don’t like some of it; I like any coffee, I’m not fussy). It’s an uncomfortable way to phrase it, but sometimes teachers lie to learners in order to make a complex thing less complex, more easily understandable, and this is what happens on initial teacher training courses: we simplify and tell lies so that a basic level of understanding can be established before the teachers go off and discover that more or less everything they’ve learned is not wrong, as such, but is almost certainly not much more than a useful guideline.

Asking difficult questions like “who says so?”, however, tends not to make you very popular. Nobody likes a smart alec, after all. You get accused of all sorts when you ask questions: of being disrespectful, of being a cynic, of mocking, of not being aspirational, as well as getting quiet but stern reminders of your place in the grand scheme of things. People who ask questions make life difficult. I have had trainee teachers in the past who asked awkward, challenging, perceptive and generally brilliant questions. So far, these trainees have been, without exception, the strongest trainees on their respective courses, and the most successful subsequently. But when they ask those questions it’s bloody annoying, and so it should be. After all, you are getting your beliefs challenged, and that’s hard, but the benefits are endless. It forces you to go away and examine your position much more carefully and thoroughly, and either you come back stronger, with greater evidence or support, or you come back humbled and your mind broadened. The worst thing you can ever do to people in this situation is dismiss them with “it just is” statements like “it’s good practice”, because that’s deeply patronising. You may as well just pat them on the bottom and say “don’t you worry your pretty little head about it”.

Like I’ve written before, there’s nothing absolute in teaching, nothing fixed, although absolutes are something which new teachers might find reassuring. Perhaps atheism and religious belief are not the right parallel here: both depend on absolute beliefs. Perhaps agnosticism is the better parallel: we can never fully know for sure, and we are always learning and changing as teachers in the face of the evidence as it occurs before us. Any faith we do have must be a flexible faith, one which is open to new thoughts, new developments and interpretations. We must never assume that something is right, at least not on face value, and even where the evidence does exist we must still analyse it and think about how well it can apply to our own contexts of teaching and learning.

Evidence, Anecdotes and Bike Helmets Again

Last week, Chris Boardman, Olympic cycling champion and general ambassador for cycling in the UK, went on British TV to talk about the benefits of cycling, reducing the barriers to it, and the rest. The piece showed him and the presenter riding sedately round London, with the BBC presenter wearing the full cycling monty: helmet, hi-vis, lycra, the works. Chris Boardman, who has made his career on two wheels, was wearing not only normal clothes, but no hi-vis gear and, crucially, no helmet. Naturally various social media went off on one, complaints were made to the BBC about Mr Boardman setting a bad example, and with gloomy predictability the calls for helmet wearing to become a legal requirement followed. Sensing a nice bit of pent-up outrage, the BBC followed this up with an online poll about cycling with headphones on, in which a majority said cyclists shouldn’t be allowed to do so.

Well, OK. I’ve talked about helmets before, and there is an excellent site devoted to the statistics for and against helmet wearing. By and large, the statistics suggest that a) the public health benefits of cycling far outweigh the impact of wearing a helmet; and b) where helmet wearing has been made a legal requirement, this has led to a decrease in cycling uptake. The cycling fraternity is usually quick to point out as well that in enlightened countries like Denmark and the Netherlands, helmet use is unusual and the number of cycling head injuries is lower, because the infrastructure in those countries is far better suited to a more sustainable transport economy.

The pro-helmet lobby is a bit rubbish with its statistics and prone to a nice bit of cherry picking, especially when one considers that far more head injuries in the UK occur when people are driving or walking, and yet nobody is suggesting that pedestrians or motorists wear a helmet or stop listening to music. (Statistically, especially in the case of pedestrians, helmets might not be a bad idea…) The most pervasive and engaging pro-helmet argument, of course, is the personal anecdote: genuine, sad stories along the lines of “if my son had been wearing a helmet…”. My possibly slightly callous answer to that is to say “yes, but if your son had been riding in an environment where the culture and road infrastructure were less aggressively pro-motorist, he might not have had an accident at all.” Unfortunately you can’t fit that sort of thing into a neat headline or Facebook meme.

All well and good. I still wear a helmet, if I remember. I encourage my children to wear one. Sometimes I forget and cycle a few miles without one, but when I do I get it in the neck. For some people this is tantamount to just walking out into the street and lying down under the nearest bus.

This is all about evidence and the use of evidence. When it suits a particular social or political viewpoint, evidence is triumphantly pulled out and waved under the noses of everyone who cares to listen. When it doesn’t, however, it gets quietly ignored in favour of emotional anecdotes.

We can see a parallel here with teaching. When a piece of evidence exists in favour of a particular viewpoint, it gets quickly trumpeted across various media. This happened with the Sutton Trust report “What Makes Great Teaching?” last week. Here we have a fairly big study which is generally pretty careful in its claims and suggestions. It is pretty measured stuff, but the only bit that made it into the media was the big claim that the evidence suggests discovery learning is less effective than direct instruction. It’s worth noting that this claim is made carefully in the report. First up, the direct instruction it describes is telling people stuff, yes, but also questioning them, finding out what they remember, and generally involving them rather than just talking at them. What it isn’t is the two-hour lecture that comes to mind, and which suits a nostalgic conservative view of education in the Good Old Days. And actually, discovery learning done well can be effective, but it needs lots of support and scaffolding from the teacher. Done badly, all educational interventions are rubbish.

Much the same happened with the FELTAG report. This made all sorts of good suggestions, but the only one which appears to have made any inroads into education is the suggestion that 10% of all courses be delivered online. It’s been almost amusing to watch the handwringing from various commentators on social media about this. “But FELTAG is about more than the 10% thing! It’s about teacher development and sharing and all sorts of good things!” And they are absolutely right, but I’m not sure in which naïve universe the handwringers are living if they think that a sector under increasing financial pressure is not going to focus on this as an opportunity to cut costs. The only time anyone ever hears of FELTAG is when it gets cited as the reason behind the 10% of courses going online. Pretty much all the good ideas about supporting and training teachers have been forgotten, because they are expensive ideas. Because the FELTAG report has an evidence base, or at least claims to, it gets used as a stick with which to beat staff into getting on board. I’m lucky: we have regular training days at work where we have the whole day to explore these things, and some serious time has been given over to preparing and supporting staff. I’m also pretty technologically confident, so my learning curve hasn’t been that steep. But that’s not true for every teacher in the land, not by a long shot.

The other challenge with the 10% rule is that, on reading the report, there appears to be no evidence to suggest that it works. The whole of FE is basically taking part in a massive experiment. It’s not been sold that way, of course, but there doesn’t appear to be any evidence, nationally or internationally, that this is going to work. For me, viewed that way, this actually becomes more exciting, because UK FE can be the big study cited in the rest of the world about how this worked really well on a massive scale. I hope.

One way or the other, it does fit a particular need at a particular time; it fits trends and fashions and what people are interested in. Ditto the cherry picking of the Sutton Trust report, and much of the discussion around helmet use for cyclists. Evidence is there, yes, but it’s usually filtered through the censors of fashion and policy, and so needs to be viewed as sceptically as the rest.