Just a quick note before I start – this is about the UK ESOL context, rather than the international context. Anyway, the thing with exams is that they exist for a range of reasons, and rarely do they put the learner first. Actually, that’s not fair – they’d like to, but they can’t. An exam board is pulled in several different directions. Firstly, it is an organisation in the business of persuading your institution to run its exams. Competition in the UK college sector is pretty fierce, and I wouldn’t be far off the mark if I said that Cambridge ESOL seem to have it pretty much sewn up, with Trinity in second place. Part of this is historical. In the pre-SfL days you could get learners to do what they wanted, including nothing, and this was fine (the teaching being equally hit and miss). So Cambridge ruled the roost with their suite of international exams (KET & PET, easy; FCE, harder; CAE & CPE, don’t even bother). Trinity had a few, the other boards had a little stab at some, but none really reached Cambridge heights.
Then came the Skills for Life crackdown. Suddenly international exams weren’t fundable (arguably because lots of EU students were being mixed up with the more traditional ESOL learners, and unscrupulous EFL departments were using Skills for Life funding to add to their coffers from international learners). So there was a gap. Trinity got there first, running the first SfL exams in 2005, but these felt, to be honest, a bit half-baked. Cambridge followed a year later, and most of the colleges I know suddenly decided, en masse, to drop Trinity and hook up with Cambridge.
Things got more complex a year or so later, with the introduction of the ESOL for Work qualifications and some of the other exam boards finally catching up. Some of these experimented with portfolio assessment – quite a clever way of getting a weaker learner through entry level courses, and thus boosting your success rates (achievement × retention divided by the square root of the cost in pence of the principal’s car). City & Guilds took this route, and have been quite successful even against the mighty behemoth that is Cambridge.
So: exams. What are the pros and cons? First, assuming that your learners are successful and pass, exams are a real boost to an individual’s self-esteem, and, as they are tied to the national qualifications framework (hence the levels), they should, in some small way, contribute to that learner’s employability and overall standing. Then there is the benefit to a learner applying for citizenship or ILR, where they are required to prove advancement from one level to the next – something that has recently been tightened up. This requirement has an inevitable impact on learners wanting exams, and in some cases on their overall attitude to them. Other positives: they create an aura of professionalism in the class – the learners are not just there for a bit of a chat, but to actually achieve something.
The flip side to exams – and this could be said of any kind of exam – is that you almost inevitably end up teaching to the exam. There is a term for this: the backwash effect, meaning the effect that a given assessment has on the course being taught. This can apply to formative assessment too, considering the way we teach a lesson with learning objectives or outcomes. If a lesson has outcomes which are required to be measurable, it follows that you need to assess those outcomes. So you need to devise assessment activities, which in turn affect the way the lesson is taught (you need, for a very simple example, to build in time for the assessment activities). This impact can be quite subtle – having decided your learning outcomes, you need to assess the learners against them. How do you do this? You monitor, get feedback, give feedback, use peer checking, self checking, error correction, all sorts of things. God forbid you might even go through the learning outcomes with the learners at the start of the lesson and then devise some sort of final activity to assess the learners on these points.

(Incidentally, the lamest way of doing this is to tell the learners the outcomes at the start (probably on PowerPoint with nippy little animations) – “In this lesson we will…” – then “review” the learning outcomes by going back over them at the end, saying “In this lesson we have…”. This is rubbish and is no use to any learner, especially if you fill them with all the codes and core curriculum speak. It doesn’t actually involve the learners, or help them engage with the learning. So don’t do it like that. Be creative: for example, give the learners some questions at the start of the lesson which reflect the learning outcomes, and see which they can answer; then, at the end of the lesson, go through the questions again and see whether they can answer them now. You might need to spell the purpose of this out on your lesson plan, lest an ignorant observer thinks you haven’t shared or reviewed your learning outcomes. This is quite possible, especially if they are from the “We will… / we have…” school of PowerPoint thought.)
Anyway, backwash. At its most dramatic, this can involve the whole content of a course being skewed to fit the needs of the exam. A simple example is the choice of topics: rather than asking your learners to come up with their own, you might give them a list of topics from which they select a few. Ideally this is wrong – the course should revolve around and be drawn from the learners, not the exam syllabus – but in practice it’s the best workaround. To be fair, the topics of the ESOL exams are pretty reasonable (shopping, health, homes, family, etc.) and generally do reflect the main interests and needs of the learners. However, the potential is there for this not to be the case (imagine, for example, a level 2 group of graduates from eastern European and African countries who might be more interested in global and international affairs than in local issues, or in the issues covered by the exam topics). The exams make major assumptions about the learners, which may not be borne out by a given group, and this creates a tension between what you need to cover for the exam and what the learners need or want to cover. It goes to another level when you consider that pretty much 75% of the last term of any given year will involve enormous amounts of exam-related material and exam practice. And you could argue that a good portion of the rest of the academic year involves preparation for exams too – indeed, perhaps all of it. Backwash is enormous, and important, and we need to be aware of it.
There is a big “however” here. We can’t do much about it, other than adapt to it as best we can. Exams are where funding is drawn from (success rates, etc.) and therefore are the vehicle through which our learners can keep their free/subsidised places. They have benefits in terms of motivation and giving your course a real respectability, validity and authority in the eyes of learners and in the eyes of the wider community. Not an evil as such, but neither are they as bad as they could be. But necessary they definitely are, whether we like it or not.