Exploring How To Ask The Right Questions And Measure Success With Patti Shank
Patti Shank, Ph.D., is listed as one of the top 10 most influential people in eLearning internationally. She has written numerous books, and her articles are found all over the internet. With a strong business background and long experience as a trainer, Patti Shank applies learning and related sciences to improve results from instruction and performance interventions. Today, she’s discussing effective assessments, learning objectives, and bringing skills development into the digital age.
In your opinion, what’s the most common mistake Instructional Designers make when designing multiple-choice questions for eLearning assessments?
Research shows a slew of typical mistakes when writing multiple-choice questions, but the biggest mistake in my view is not assessing the right things. Most question-writers write questions by going through the content and finding easy things to assess. What we typically need to assess, however, is whether people can make correct decisions and adequately solve problems involved in critical tasks described by the learning objectives.
If the learning objectives don’t adequately describe the critical tasks and outcomes, this mistake is intensified. In this case, our assessments are likely to be invalid, which sets organizations and participants up for failure.
When we write multiple-choice questions to measure relevant decision-making and problem-solving skills, we are measuring the higher-level skills needed to do the tasks described in the learning objectives. This is what we should be assessing.
Other common mistakes made when writing multiple-choice questions include making it easy to guess the correct answer (No! Don’t do it!) and making questions and answer choices hard to understand (No! Don’t do it!). Nedeau-Cayo et al.’s 2013 study of thousands of nursing certification questions, for example, found that 84% (!) of the questions had one or more significant flaws!
Poorly written questions typically don’t assess the intended knowledge and skills, leading to less valid or invalid assessments and to frustration. If you use multiple-choice assessment results to make decisions—such as who can proceed—poorly written questions can open your organization to legal risk.
One of your deepest areas of expertise is writing learning objectives to describe needed outcomes. What do you think is the most challenging aspect of creating measurable objectives and why?
Too many learning practitioners don’t think knowing what participants need to be able to do is critical to their job. I’ve been told that learning objectives are old-fashioned and not a valuable use of time. But here’s the thing. If you don’t know what your participants need to be able to do, how will you know what to teach? Or if instruction “works” to help them do it?
When learning objectives are written well, they provide clear and critical guidance for designing and assessing instruction. Without them, we are building “content” and not “instruction.” If you’re supposed to help people build needed skills, content alone is usually insufficient. (And if you don’t know what people need to do, the content itself may be wrong or irrelevant.)
Too many learning practitioners lack adequate knowledge about what the people they are training need to know and be able to do. They don’t necessarily understand the jobs they are building training for. How can we build training when we don’t know what participants need to do?
Learning objectives are clear statements of intent: people need to be able to do a specific task with a specific outcome. We need this information!
Think of it this way: correctly written learning objectives are like a GPS. When we input a location or address into a GPS, we are telling the system where we need to go so it can get us there. Learning objectives are like inputting that location or address. They provide clear information about how to get there (what and how to teach) and whether we did, in fact, make it to the correct destination (assessment).
Can you tell our community a bit more about your course “Write Better Multiple-Choice Questions to Assess Learning” and who it’s intended for?
Write Better Multiple-Choice Questions to Assess Learning is a brief but effective online course targeting the skills needed to do a good job of assessing whether instruction achieved the desired results.
Most people think they write acceptable multiple-choice questions. I did until my mistakes were pointed out to me many, many years ago. I was humiliated and resolved to improve my skills, and have spent years reading the research on how to do this. I’m thrilled to help others perform this critical skill well!
Research shows many multiple-choice question writers don’t know what multiple-choice questions should assess. As a result, many multiple-choice questions measure the wrong things.
Writing clear, valid multiple-choice questions that measure the achievement of the learning objectives is a critical skill for people who design instruction. The learning objectives for this course are:
- Write well-written learning objectives to inform your multiple-choice questions.
- Analyze learning objectives and multiple-choice questions to identify flaws that must be fixed.
- Write relevant multiple-choice items that measure the achievement of your learning objectives.
If you want more information, visit this page. If you have a team to train, you can reach out for group discounts or to set up this course specifically for your team. We offer discounts for 3+ team members enrolling in the course at the same time or 10+ team members enrolling in a private team course.
You can also join my email list (I never, ever share it) by filling out the form on my site to get notified of discounts.
You’ve written numerous articles for eLearning Industry over the years. Is there a particular article you’re most passionate about that you’d like to highlight for our readers?
I love writing for eLearning Industry! The articles I write teach me a lot about what evidence people in our field most need and what that evidence implies. I love sharing my analysis with my colleagues.
I think my favorite article is the five-part article series on when and how to use asynchronous and synchronous learning tools. The articles are called (The Right) Learning Modalities To Deliver Digital Learning: (Parts 1-5). You can find them here.
I’m now working on a multi-part article series on digital video and enjoying it a lot. I should have the first article out soon!
What excites you most about the future of Learning and Development?
I’m super excited to see how AI and other technologies will help us learn and grow critical skills. Due to continual workplace change, maintaining and growing skills is becoming increasingly critical. In many ways, though, we’re still teaching people and working using 1940s methods.
I’ve read about how the right intangible assets are now needed to make companies viable and competitive. These assets come from innovation, which typically comes from human work. Maybe technology can help us become better innovators! For example, advanced, complex cancer diagnoses and treatments are coming from AI combined with human insight. I’m wondering how these kinds of innovations will help us learn and innovate in the workplace.
We greatly appreciate Patti Shank participating in our Thought Leader Q&A and sharing her expertise with us. If you’d like to learn more about Patti, please visit her author profile page, where you can find the articles she’s penned for eLearning Industry, as well as her social media links.
Nedeau-Cayo, R., Laughlin, D., Rus, L., & Hall, J. (2013). Assessment of item-writing flaws in multiple-choice questions. Journal for Nurses in Professional Development, 29, 52–57.