Thanks to everyone who came to our latest Faculty Resource Workshop. Lindsay Schwarz (Pharmacy), Tony Frankino (Biology), Donna Pattison (Biology), and James Garson (Philosophy) spoke on “The Critical Multiple Choice: Using Multiple Choice to Foster Critical Thinking,” and the audience included faculty members from every discipline. Some were interested in how to improve their multiple choice (MC) questions, but others were looking for new ways to use MC.
Lindsay Schwarz started things off with a question: Where do we learn to write multiple choice questions? By taking MC tests, of course. And because so many of us think of MC as information recall only, we must have taken some bad tests. Lindsay, therefore, led us through best practices that link MC questions to course objectives, explaining how to move up Bloom’s taxonomy of learning domains. Lindsay recommends creating a test blueprint that maps out how much lecture time is devoted to each topic and then writing MC questions that mirror those time ratios. Even the types of questions can be based on the test blueprint: if she asks students to do critical thinking on three of the test topics, then those topics should use critical thinking questions. The point is that if instructors use good questions that connect to course lectures and objectives, then students will see that the course met its objectives.
Lindsay then went through methods of analyzing test data received from University Testing Services. The extended item analysis shows how the top, middle, and bottom performing students scored on each question. The analysis can help explain whether a question is valid and which distractors were easily eliminated.
Lindsay ended by mentioning the advantages and disadvantages of MC questions. One disadvantage, she said, was that MC cannot be used to assess writing. The rest of the room took up this idea: one participant argued that it is not appropriate to assess something that was not specifically covered in class lectures, so unless a class actually teaches writing, it shouldn’t assess writing, and MC’s inability to assess writing isn’t really a disadvantage. Another participant claimed that assessment of writing is different from testing content, and that engineering assesses writing in many different classes that don’t actually include lectures on writing.
Tony Frankino then spoke about how he uses MC questions to foster critical thinking through his use of graphics to teach evolution. Instead of asking students merely to interpret or recall a formula, he asks questions that force students to recall, identify, interpret, and apply their knowledge. He uses CASA, as well, to switch response orders, but he retains distractor groupings. He reuses questions on the final exam, but students won’t see the same question twice because he has different versions of the same questions, each with a different answer. He may have four different versions of each question. One participant asked about using different-level questions from Bloom’s taxonomy for each topic, but Tony claimed that such a tactic penalizes students twice if they don’t know a single answer or topic. Tony also uses “None of the above” for all of his MC questions, and versions of each question include “None of the above” as the correct answer. So it is always a choice, and it can’t be eliminated quickly.
Donna Pattison then led the group through a series of poorly written MC questions. The questions didn’t link to goals or objectives of the course, included opinions, used different structures or lengths for answers and distractors, or included clues such as article usage in the questions themselves. She then went through some best practices and mentioned why she doesn’t like the questions included in textbook question banks. Those questions don’t sound like the professor, so students are at a disadvantage when they have been taught by someone other than the person who wrote the question.
Discussion after Donna’s presentation moved into ESL students and how much professors should tailor their questions for them. Donna mentioned that she encourages asking questions about words and language during the tests themselves, but another participant claimed that some students have been “beaten up” for asking questions, so they don’t. Tony Frankino then mentioned that one disadvantage of using CASA for his tests is that he can’t be there to answer questions. The discussion then branched into whether professors give exams back to students, and there was no consensus: some do, and some don’t.
To wrap things up, James Garson presented briefly about his use of MC questions in a class on critical thinking. He asks students to make a diagram and then asks multiple choice questions about their diagram. In essence, he has made the rubric for the diagram into a question, and students are, in fact, self-reporting on their diagram. He uses MC questions in a way that suggests that the answer isn’t the point; instead, the critical thinking skill used to arrive at the answer is what is important.
Lindsay then brought up using MC questions with clickers, where students are polled in class and shown the correct answers, and another participant mentioned www.polleverywhere.com and Site44.com, which allow real-time polling, as well. Another option that allows for discussion of MC questions is http://peerwise.cs.auckland.ac.nz/, where students discuss questions asynchronously.
Overall, it was a great workshop, and everyone seemed interested in more workshops on writing multiple choice questions and innovative methods for their use. A future Faculty Forum will take up the topic again, we’re sure.
Chad A. Wilson