Abstract: thoughts about the difficulties and benefits of negative questions and experimental results relating to anchoring, cognitive load, age and ability and curiously not squeezing in any references to Rogue One which is out TOMORROW
Last week I delivered some CPD which partly covered the use of multiple-choice quizzes to reinforce content. I use multiple-choice quite regularly for homework (and of course it is now part of the A level testing regime, so students need to be familiar with the style), constructing the quizzes myself to better target misconceptions and errors I pick up from assessments. While preparing the CPD I came across some suggestions that were pretty useful to consider when building a test, but the one that struck me was about using negative questions, for example:
Which of the following is not an endocrine gland?
I like this kind of question because I think there is often more value in thinking about why something does not belong to a group than why it does (the latter can often be simple recognition of a word rather than anything involving more advanced cognition). The advice suggested avoiding this kind of negative question for important concepts, presumably because it could increase the chance of incorrect answers becoming associated with the question. I wondered whether my tendency to use this type of negative questioning was actually unhelpful.
In a content-heavy subject like science there aren't many opportunities for new ideas to be prone to bias from our intrinsic models: few people have any prior investment in how chlorophyll absorbs light. It can be taught, demonstrated and accepted by students with little chance of them bringing prior experience and personal internal models into the mix to create a competing idea. Some concepts do suffer from being hard to model without imperfect analogies (I'm looking at you, electricity), and some subject areas do have a more 'intrinsic knowledge' feel to them, such as reproduction, where students do bring their (invariably incorrect) ideas to the subject. I suspect this is one of the reasons students claim physics is harder than biology: they have an intuitive 'feel' for biology because they can build their models on something they feel is concrete, such as owning their own body, in a way that forces and energy topics don't really allow.

Knowing that students may build the architecture of their models on pre-existing ideas makes me lean towards teaching the correct (or as near correct as possible, given age limitations) versions of content, rather than risking a castle built on the sand of incorrect assumptions. Granted, by the time they are completing multiple-choice questions they shouldn't be at the construction stage but consolidating and reinforcing correct knowledge. Yet for many students it is the repeated exposure to the content that matters, and presenting ambiguous questions about whether something is correct or not may not be helpful for lower-ability students. I suspect there is also an element of anchoring going on here: the tendency to rely on initial information to form judgements. If you are trying to build firm mental models, then the introduction of wrong information could well be a problem.
A few weeks prior I had done an experiment with Year 8 students using washing powder, comparing biological and non-biological powders on stains at different temperatures. I knew what I expected to happen, only the actual results were not so clear. The bio washing powder appeared to be working at high temperatures, in direct contradiction to what I wanted the students to understand from the experiment, namely denaturing. Presumably the hot water alone was sufficient to remove many of the stains, rendering the presence of the denatured bio powder irrelevant. Experimentally this is interesting because it presents the issue of disproving a hypothesis, speculating on what went wrong, and designing an experiment to test a new hypothesis. This is all bread and butter stuff for science. It turns out to be very difficult for students to grasp, though. It's a very time-intensive process to tease these ideas out of students, particularly with younger or lower-ability groups. Trying to build this sort of discrete knowledge through an investigative or enquiry approach, or even a 'get students to think like scientists' approach, is difficult (though not impossible), but it would certainly be very time-consuming. Most students simply don't have the necessary breadth of knowledge to tackle the question from a scientific standpoint; they can't recognise what is important from among the distractions. A key difference between novices and experts is experience, which tends to manifest as the ability to recognise the underlying idea.
Another experiment where this difficulty is obvious is the old classic, food tests. There is such an expectation that a result will 'show something' that students start to see colour changes in reagents which aren't there. Negative results are viewed as 'not working' rather than as a valid experiment demonstrating the absence of something. The cognitive load of experiments (physical manipulation, observation, recording, safety…) is so high that it can be very difficult to be sure students are really thinking about what you want them to learn. It's hard enough for a student to keep track of all that without also considering the reverse possibilities and outcomes.
It's such an important idea in science to be able to test the limits of an idea, to ask the negative side of a question: "what would I need to see for this NOT to be true?". It's not a natural way of thinking and I'm not really sure of the best way to develop students into thinking this way. I think for higher-ability students (A level, Triple) there is mileage in using these negative questions and negative results, because I assume their understanding of the underlying concepts is firm enough to avoid cognitive overload. At what point I can use these strategies with lower-ability and younger students, though… I don't know. Bad models are harder to remove than day-old fried cornflakes from a bowl. Simply telling students the answer may turn out to be the best approach in many cases.
One final point on negative answers. Our SCITT student has just (as in last week) successfully passed her PhD viva. I invited her into a Year 12 lesson a few weeks back to give the class a taste of what is involved in research, PhDs and so on, and to talk about what she'd studied for her thesis (pathogenic bacterial communication, which linked nicely to what the class had learnt recently). During the Q&A I asked what her hypothesis was for the thesis and whether her results had supported it. After laughing for a moment, she said that she'd not found what she'd expected, and that 'further research was needed'. Essentially a negative result. Viva la science!