Assessing Children’s English – Again

If I asked teachers to ‘agree or disagree’ with the following statements, what would most say?

‘The purpose of assessment is to allow students to demonstrate what they know’.
‘One function of assessment is to increase student motivation to learn’.
‘Assessment should give students a sense of achievement’.
‘Assessment should encourage creativity’.

My hunch is that most teachers would agree with such sentiments. Translating them into practice, though, remains a challenge, and the way students’ English is assessed is very often in stark contrast to such ideas. I reflected on this in an earlier blog called ‘Why do teachers assess English the way they do?’. A recent test paper again highlights important issues in the assessment of children’s English, and I would like to comment further.

I will take some examples from a recent English test for 13-year-olds in Slovenia (though I am sure the issues highlighted apply more widely). Overall, about 90% of the test involved grammar-focused gap-filling work, with some vocabulary and translation exercises. Here is one example:

The question tested students’ knowledge of phrases following the pattern ‘That looks/sounds …’. In this case the student provided three reasonable answers, two of which were awarded zero marks because of spelling errors. There is no recognition of the student’s knowledge of the target expressions, and the focus on spelling implies that accuracy is all that matters.

The test also contained a gap-filling vocabulary exercise. Here is an extract:

This looks like a free exercise, but students were in fact expected to remember words used in their textbook. So, while ‘went to the USA’ makes reasonable sense and might deserve half a mark, ‘emigrated’ was the expected answer. And, once again, spelling errors are heavily penalised, with zero marks awarded for ‘vulcanic eruption’. I am sure that many creative answers to the item about Scotland were possible – but ‘parliament’ (spelled correctly) was, I suspect, the only one that would have been marked correct.

Some of the translation items (Slovene to English) were also interesting. One was this:

Question: Vojska Edwarda I. je v 13. stoletju osvojila Wales.
Translation: The army of Edward I conquered Wales in the 13th century.

The rationale for such content is that it appears in the textbook and students should therefore know it. But there must be more relevant and meaningful language contexts in which to use translation as an assessment tool. In the example I reviewed, the student scored zero marks on three items of this kind. Overall, the student received the lowest possible grade, and it is unlikely that the experience will have done anything to stimulate their interest in learning English.

How teachers assess is often largely a matter of tradition – things are done the way they always have been, and new teachers entering the system are quite quickly assimilated (see my earlier blog on assessment in pre-service teacher education). Changing traditions of this kind is only possible through education, informed leadership, collective engagement by teachers and, importantly, greater overall awareness of the impact (positive and negative) that assessment can have on children’s attitudes to learning English.

Much information is available online to support the development of teachers’ assessment literacy. I would once again recommend the British Council’s Teacher Assessment Literacy project, including this video on assessing young learners. This Handbook of Assessment for Language Teachers is also an excellent resource.

If you are involved in supporting the assessment literacy of pre-service or in-service teachers of English, please consider using the above examples (and those I discussed earlier) and relevant online resources to encourage teachers to reflect on their assessment practices and, very importantly, the impact these have on students’ attitudes to English.
