Adapting Assessment for Learning techniques for an IFP Mathematics classroom

by Jess Thomas

As many of us working on an International Foundation Programme (IFP) know, the challenges we face in our classrooms can differ markedly from those in other teaching contexts. After joining the University of Bristol from teaching Mathematics on an International Baccalaureate (IB) programme at an international school in France, I started to think about how I could adapt my teaching to better suit my new context here at the Centre for Academic Language and Development (CALD). I began by reflecting on some of the key differences and challenges within the classroom, then considered them in light of some of the literature on formative assessment. Finally, I turned my thoughts to planning and delivering my lessons.

Challenges in an IFP Mathematics classroom

The main challenges in our IFP Maths classes include the large variation in students’ prior knowledge, combined with the rapid pace and large volume of content in the courses. Both are markedly different from the school context I worked in previously. I also noticed that some IFP students were not very accurate at gauging their own understanding of a topic: they were overconfident while still misunderstanding some key points. Another challenge I wanted to address was students being afraid of making mistakes and reluctant to discuss their work with each other. Alongside these challenges, I was looking for ways to increase student engagement and active involvement during my lessons, particularly during the explanation of examples.

Formative Assessment / Assessment for Learning literature

The seminal work of Black and Wiliam (1998) has shaped thinking within education around assessment. According to their definition, assessment only becomes formative when the evidence leads to adaptations in teaching which address students’ weaknesses. Through the analysis of a wide range of studies, they showed that strengthening formative assessment in the classroom led to significant improvements in learning (Black and Wiliam, 1998). However, as teachers began to implement this more and more within lessons, Wiliam noticed that it was at times becoming conflated with the practice of summative assessment. With hindsight, he suggested that ‘responsive teaching’ might have been a better name, making its purpose more explicit (Wiliam, 2013). I have certainly noticed how frequently students in our Centre use the word ‘formative’, generally with a different association from that intended by Black and Wiliam. Christodoulou (2016) emphasised that ‘formative assessment’ should not become a high stakes tracking tool; instead, it should focus on low stakes diagnostics. Rather than ‘formative assessment’ or ‘responsive teaching’, these techniques now tend to be referred to as ‘assessment for learning’ in the literature.

“If formative assessment is about more frequent, assessment for learning is about continuous. If formative assessment is about providing teachers with evidence, assessment for learning is about informing the students themselves.”

Rick Stiggins (2005)

Choosing Assessment for Learning techniques, and why I like to use Plickers

Throughout my teaching career, I have enjoyed incorporating different assessment for learning techniques into my lessons. These have ranged from direct questioning and mini-whiteboards to using summative assessments for formative purposes. When it came to selecting a tool for use with my classes, I wanted something that would also address some of the other challenges I had noticed – such as students’ fear of making mistakes and unwillingness to discuss answers. I had tried using multiple choice questions (MCQs) to generate discussion in maths classes before, and I thought that combining MCQs with a tool that allows anonymous voting would not only help students feel more secure about making mistakes but would also generate more discussion around the questions we were working on. This is where the ‘Plickers’ tool came in.

Each student is given a unique card (similar to a simplified QR code). If a student – let’s call them Chen – holding card #1 wants to vote for multiple choice option B (e.g., as their answer to the question in Figure 3, below), they hold their card up so that the B is uppermost (as shown in Figure 1). Other students cannot easily see which option Chen has chosen (as A, B, C and D are written in a very small, light grey font), so students are unable to influence each other’s choices. The teacher then scans the students’ cards with their phone, and the Plickers software records how each student voted (e.g., Card 1. Chen: B).

Figure 1: A Plickers card (#1) with options A, B, C, D written in a small, grey font
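To make the mechanics concrete, here is a minimal Python sketch of the idea that a single printed card can encode four different answers through rotation alone. The card IDs and the rotation-to-answer mapping below are hypothetical illustrations, not Plickers’ actual recognition logic.

```python
# Illustrative sketch only - not the real Plickers software, whose
# image recognition is far more sophisticated than this.

# Hypothetical mapping: which answer letter is uppermost at each clockwise rotation.
ROTATION_TO_ANSWER = {0: "A", 90: "B", 180: "C", 270: "D"}

def record_vote(card_id: int, rotation_deg: int) -> tuple[int, str]:
    """Return (card_id, answer) for a scanned card held at the given rotation."""
    return card_id, ROTATION_TO_ANSWER[rotation_deg % 360]

# Chen holds card #1 with 'B' uppermost:
print(record_vote(1, 90))  # -> (1, 'B')
```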

For anyone who is interested in learning more about this tool, the link below provides more details about how it works and how to set it up with your own classes: What is Plickers? – Plickers

The application promotes engagement from all class members and, I felt, helped address a key point from the Assessment Reform Group (2002) about the importance of the active involvement of students in their own learning. I hoped that – alongside other benefits – it could also counteract the moments of disengagement I had been noticing, particularly when the class was noticeably flagging during a 4-6pm workshop!

Using Assessment for Learning techniques effectively

For assessment for learning techniques to be effective in the classroom, the way in which they are implemented is just as important as the techniques themselves. Figure 2 below (Cauley and McMillan, 2010) shows how the same characteristics can lead to low-level or high-level formative assessment depending on how they are implemented. One aspect that particularly interested me was the importance of when the techniques are deployed, and especially that this should happen during instruction. This influenced my thinking when working out how to use the ‘Plickers’ tool most effectively in my classroom.

Figure 2: Table from Cauley and McMillan (2010)

The importance of the timing of a formative assessment technique was also highlighted by Cowie and Bell (1999, p.32) and Kahl (2005, p.11), who specify in their definitions of formative assessment that it must occur during the learning (i.e., as a mid-stream tool).

In light of the pace of the mathematics content on the IFP, these aspects of the definitions were particularly interesting. We do not often have the luxury of spreading the learning of skills or content over multiple lessons. Reflecting on the number of learning points within a single lesson, I realised that, for assessment to occur during the learning, there were multiple points in the lesson where it would be beneficial to incorporate it, over and above where I would traditionally have done so. I also found the idea of a mid-stream tool particularly interesting, as in the past I would wait until the end of a learning point to assess students’ learning. By embedding the ‘Plickers’ tool throughout my lessons, and even within the introduction of some examples, I increased the effectiveness of this formative tool.

In an environment where we are bridging the gap between school and undergraduate studies, I have also become more mindful of developing students’ independence. I found that using the ‘Plickers’ tool helped the students identify areas of misconception and understand better what they needed to work on themselves.

Examples and reflections on using Assessment for Learning techniques earlier in lessons, in the context of an IFP Mathematics classroom

When I first introduced the ‘Plickers’ tool in my classroom at CALD, I quickly noticed a positive reaction from my students and increased engagement with the questions I was posing. As I have developed my use of it further, it has been a helpful way of shifting the classroom culture towards one where it is okay to make mistakes, as well as encouraging more discussion around the problems I was setting. The live feedback not only informed my teaching but also gave students instant feedback that helped them see where they had not yet achieved full understanding. The fact that the data is stored on the website is also useful, particularly for identifying any students who might benefit from follow-up discussions.

In light of the literature discussed above, I also began to explore using the tool within an example itself, rather than only to check understanding after I had explained an example (see Figure 3 below). One of the main things I noticed was that this led to more active involvement of the students in the example I was giving. Each student was given the opportunity to think about the question, and a high proportion were able to work out the next step when the example was broken down in this way, without my explaining it first (as I would usually do). Having live information about students’ misconceptions also helped me encourage conversations between students that would facilitate their learning, and helped me know which students might feel prepared to explain the thinking behind their answers to the whole class.

As a development point, I began to notice that when students were unsure between the four multiple choice answers, they would sometimes just take a guess, which gave me slightly less useful information. I therefore changed the fourth multiple choice option to ‘please explain’. Although there are times when it doesn’t get used, I have found it provides much more useful information than students guessing at random. It also gives students a justification for why I might be taking time over an explanation before moving on, if they can see this has been requested by several people. Being able to display the range of (anonymous) answers to the class has been really useful, not only for this, but for facilitating discussions around the misconceptions that lead to wrong answers being chosen.
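As a rough sketch of that final step, the Python snippet below (hypothetical helper names, not Plickers’ own code) shows how a round of scanned votes can be turned into the anonymous percentage display shown to the class (cf. Figure 3), without ever revealing who chose what.

```python
from collections import Counter

def vote_percentages(votes: dict[int, str]) -> dict[str, float]:
    """Map each option (including D, 'please explain') to the percentage of
    students who chose it; individual card IDs are never displayed."""
    counts = Counter(votes.values())
    total = len(votes)
    return {option: round(100 * counts.get(option, 0) / total, 1)
            for option in "ABCD"}

votes = {1: "B", 2: "C", 3: "C", 4: "A", 5: "D"}  # card ID -> chosen option
print(vote_percentages(votes))  # {'A': 20.0, 'B': 20.0, 'C': 40.0, 'D': 20.0}
```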

A pile of potatoes weighing 100 kg is put in the sun. 99% of the weight of these potatoes is made up of water. After a day, some of the water evaporates. Now 98% of the weight of the potatoes is made up of water. What is the new weight of the pile of potatoes? A: 99 kg; B: 98 kg; C: 50 kg; D: I'm not sure how to start - please explain!
Figure 3: A multiple-choice question taken from Alex Bellos’ 2016 book Can You Solve My Problems? This was the example shown in CALD’s Cross CoP Sharing event. Using the Plickers tool, the teacher can reveal the percentages of who answered what (e.g., A: 26%; B: 27%; C: 31%; D: 16%) before colours are added showing correct (C in green) and incorrect answers (A, B, D in red).
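For readers who want to check why C is the correct answer in Figure 3 (this is the well-known ‘potato paradox’), the key observation is that the non-water weight never changes:

```latex
% Initially: 100 kg in total, 99% water, so 1 kg is dry matter.
\text{dry matter} = 100\,\mathrm{kg} \times (1 - 0.99) = 1\,\mathrm{kg}
% After evaporation the dry matter is unchanged but now makes up 2% of the new weight W:
0.02\,W = 1\,\mathrm{kg}
\quad\Longrightarrow\quad
W = \frac{1\,\mathrm{kg}}{0.02} = 50\,\mathrm{kg}
```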

One of the unintended benefits I have found from engaging students early with accessible answer options, as in Figure 3 above, is that whether or not a student has reached the correct answer independently, they have had time to think about and process the question before I begin an explanation. This reduces students’ cognitive load during my explanation of the solution. For our students, who are learning in a second language, I think this is particularly important. The technique naturally slows the examples down, giving the time needed to understand the question in an engaging way. Students are also made aware of whether or not they got the answer correct before I begin further explanation – which sometimes reminds students who may have been overconfident in their understanding that they need to pay attention to the explanation…!

It has been great to hear a few stories from those of you who have tried this yourselves in lessons, and I hope my thoughts here are useful to those of you who would like more information.


References:

Assessment Reform Group (2002). Assessment for Learning: 10 Principles. Cambridge: University of Cambridge School of Education.

Bellos, A. (2016). Can You Solve My Problems? Guardian Faber Publishing.

Black, P. and Wiliam, D. (1998). Inside the Black Box: Raising Standards through Classroom Assessment. GL Assessment.

Cauley, K.M. and McMillan, J.H. (2010). Formative Assessment Techniques to Support Student Motivation and Achievement. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 83(1), pp. 1–6. https://doi.org/10.1080/00098650903267784

Christodoulou, D. (2016). Making good progress? The future of Assessment for Learning. Oxford University Press.

Cowie, B. and Bell, B. (1999). A model of formative assessment in science education. Assessment in Education: Principles, Policy & Practice, 6(1), pp. 101–116.

Kahl, S. (2005). Where in the world are formative tests? Right under your nose! Education Week, 25(4), p. 11.

Stiggins, R. (2005). From formative assessment to assessment FOR learning: A path to success in standards-based schools. Phi Delta Kappan, 87(4), pp. 324–328.

Wiliam, D. (2013). Example of a really big mistake: Calling formative assessment formative assessment rather than something like ‘responsive teaching’. [online] Available at: https://x.com/dylanwiliam/status/393045049337847808 [Accessed 7 Sep. 2017].

2 thoughts on “Adapting Assessment for Learning techniques for an IFP Mathematics classroom”

  1. I really enjoyed this post. Thanks, Jess! I have been conducting a lot of polls with my online class this summer, and I’m now wondering if I should use Plickers with my face-to-face classes during the rest of the year. We have a lot more ambiguity in EAP, which means open questions sometimes work better for us, but I do like to vary the pace throughout my lessons, so this could help.
    I am also inclined to agree with Wiliam (2013). Especially in the age of AI, calling things “assessment” when they aren’t summative risks students getting help from chatbots when they don’t feel entirely ready.

    1. Thanks Julia! There is an option within Plickers, where you can use ‘survey’ instead of ‘question’ – meaning there is no right answer. Feel free to ask me anything about it when you see me around the centre 🙂
