This blog/video focuses on tips for writing questions aligned with Bloom’s Revised Taxonomy. You can read the blog below or see the video. Both contain the same content.
To learn more tips about writing effective quiz questions, do check out our top-rated course “Writing Effective Quiz Questions” on Udemy.
Bloom’s Revised Taxonomy is one of the most recognized learning theories in the education industry. Bloom’s traditional taxonomy framework was published by Benjamin Bloom in 1956 under the title Taxonomy of Educational Objectives. Most educators write their learning objectives for a course or curriculum based on the various stages of Bloom’s Taxonomy. It is split into various categories arranged in a rising hierarchy. The key stages are Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation.
- During the Knowledge stage, a learner is taught to recall specifics, universal facts, processes, methods, patterns, structures, or settings.
- At the Comprehension level, learners are taught in a manner that enables them, after the training, to understand the meaning of the content. Examples can be used to facilitate interpretation of the key facts, definitions, and processes being taught. This goes beyond simply recalling the content from the training.
- Application refers to the ability to use the knowledge and comprehension to solve real-world problems.
- A training achieves the Analysis level when learners are able to identify the different components and parts of the concepts and then establish relationships between them. For example, if your training explains the steps of a process, then to help learners achieve the Analysis level, the training should teach them the relationships between the different stages of the process and help them identify the organizational principles involved.
- Synthesis refers to the ability to put together elements and parts to form a whole. That is, having the ability to use the learnings from a training to create something new. This may involve publishing new research, a blog, or a video that demonstrates how the components from a training can be implemented in a real-world environment.
- Evaluation is considered the highest level in Bloom’s Taxonomy; on achieving this level, a learner is able to critique existing processes, content, or any other form of material.
Traditional Bloom’s Taxonomy vs. Bloom’s Revised Taxonomy
In 2001, a group of cognitive psychologists, curriculum theorists and instructional researchers, and testing and assessment specialists published a revision of the traditional Bloom’s Taxonomy. They titled the new study A Taxonomy for Learning, Teaching, and Assessing. This revision uses action-based verbs, in gerund form, to define the different levels or categories. Knowledge was changed to Remembering, Comprehension to Understanding, Application to Applying, and Analysis to Analyzing, while Synthesis was reframed as Creating and moved to the top of the hierarchy, and Evaluation became Evaluating, now the fifth level.

Remembering
At the Remembering level, you set objectives and teach content such that on completing the training, the learners can retrieve, remember, recall, or recognize the required knowledge from long-term memory. The exercises and interactivities in such courses are aimed at presenting information and facts to aid recall. When writing objectives for such courses, you are recommended to use verbs like cite, define, describe, identify, label, list, match, name, outline, quote, recall, report, reproduce, retrieve, show, state, tabulate, and tell.
To test the remembering aspect of any course, you can include a wide variety of question types like multiple-choice questions, multiple response questions, true or false, etc. The questions can be simple and ask learners to answer based on straightforward facts or information from the training.
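If you manage quiz items programmatically, here is a minimal sketch of how a recall-level multiple-choice item could be represented and auto-scored. It is illustrative Python only; the class and field names are assumptions for this example, not the schema of any particular quiz tool or LMS.

```python
# A minimal, tool-agnostic sketch of a recall-level multiple-choice item.
# The dataclass and its field names are illustrative assumptions, not the
# schema of any specific quiz or LMS platform.
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str               # the question text shown to the learner
    options: list[str]      # answer choices
    correct_index: int      # index of the single correct option
    bloom_level: str = "Remembering"

    def score(self, chosen_index: int) -> bool:
        """Return True when the learner picked the correct option."""
        return chosen_index == self.correct_index

# Usage: a straightforward fact-recall question.
item = MultipleChoiceItem(
    stem="Which of the following is the most used international currency?",
    options=["U.S. Dollar", "Euro", "Yen", "Yuan"],
    correct_index=0,
)
print(item.score(0))  # True
```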
Example: Remembering
For Example:
Which of the following is the most used international currency?
A) U.S. Dollar
B) Euro
C) Yen
D) Yuan
In this sample question, you are simply asking the learners to identify the most prevalent international currency.
Understanding
To achieve the Understanding level, learners should be able to organize, compare, translate, and interpret the key ideas from a course, and write or present descriptions and explanations of them. Courses that aim to inculcate understanding include examples and exercises that enable learners to understand and explain the key topics from the course.
The verbs to use for writing objectives for this level are abstract, arrange, articulate, associate, categorize, clarify, classify, compare, compute, conclude, contrast, defend, diagram, differentiate, discuss, distinguish, estimate, exemplify, explain, extend, extrapolate, generalize, give examples of, illustrate, infer, interpolate, interpret, match, outline, paraphrase, predict, rearrange, reorder, rephrase, represent, restate, summarize, transform, and translate.
To test the understanding level of key topics from a course, you can write quiz questions that require learners to explain, define, categorize, or classify the key topics. Such questions should evaluate whether the learners can choose the correct definition, category, classification, or explanation for any given topic.
Example: Understanding
For example,
How would you classify animals into different biological kingdoms?
A) Vertebrates and Invertebrates
B) Birds and Mammals
C) Animalia, Plantae, Fungi, Protista, and Monera
D) Carnivores and Herbivores
This sample question asks learners to choose the correct classification of animals into different biological kingdoms.
Applying
The next level in the new Bloom’s hierarchy is Applying. To ensure your learners reach this level, write course objectives and content with tips on applying what they learn to real-world problems. This goes beyond simply presenting information. At this level, you need to include examples and practical tips that will facilitate application. For example, design and implement activities in your courses to simulate real-world problems, so learners can practice and implement solutions in the given situation. In software training, this is achieved with simulation practices. In managerial or soft-skills training, this can be achieved with the help of activities, such as role plays, group discussions, debriefing sessions, etc.
The verbs to use for writing objectives for courses at this level include apply, calculate, carry out, classify, complete, compute, demonstrate, dramatize, employ, examine, execute, experiment, generalize, illustrate, implement, infer, interpret, manipulate, modify, operate, organize, outline, predict, solve, transfer, translate, and use.
The questions that test the Applying level go beyond asking learners to identify facts or key points. These questions test whether learners can apply the acquired knowledge, facts, processes, methods, rules, and standards to solve real-world or new problems. Let’s examine some examples of such question types.
Examples: Applying
Here is a scenario-based example that tests the Applying expertise, which learners have achieved for a particular topic:
Two employees who report to you don’t get along. They have regular conflicts, and the resulting ineffective communication impacts the quality of their projects. Furthermore, there are significant delays in the completion of key tasks. Which of the following actions is most appropriate in this case?
A) Let the conflict resolve itself at its own pace, as both employees are experienced professionals.
B) Terminate employment of both employees due to violation of policies.
C) Move one employee to another project to reduce their communication.
D) Host a conflict resolution meeting to find a solution to their problems.
In this case, the question asks learners to identify the most appropriate action they would take to resolve a conflict between two members of their team. The scenario provides enough context, so learners can choose the correct action based on their learning from the course. The scenario in the question stem also helps to simulate a real-world problem and then asks learners to apply an appropriate solution. Also, notice how the options are all feasible decisions that a manager of conflicting employees might make in a similar situation, so a learner will need to apply their learning effectively.
Do note that it is not necessary to have a scenario in every question for testing whether a learner can Apply what they have learned. Let’s examine another example in which we don’t use a scenario, but still ask learners to Apply what they’ve learned. The question is:
How would you modify the preferred language in Microsoft PowerPoint?
A) Click the current language on the bottom left bar in PowerPoint, and then select and apply the new preferred language.
B) Restart Microsoft PowerPoint in Safe Mode.
C) Change the preferred language in your Operating System.
D) Contact Microsoft Support or your helpdesk to learn the required procedure.
In this case, to answer the question correctly, learners need to recall and then apply their knowledge to modify the existing state of the software. So, they are not merely answering whether the language can be changed in Microsoft PowerPoint but are applying this knowledge to identify the correct way to update it.
Analyzing
To start a course that aims to build analyzing capabilities in your learners, write objectives stating that learners will be able to categorize, arrange, organize, compare, investigate, and interpret ideas and concepts related to the topics taught in the course. This is because, after taking the course, your learners should be able to inspect the taught topics and break them into logical sections by recognizing the patterns and relationships between them. For example, if your course is about the different categories of the animal kingdom, then learners must be able to classify different animals into their respective categories.
The verbs recommended to use for writing objectives in such courses are: Analyze, arrange, break down, categorize, classify, compare, connect, contrast, deconstruct, detect, diagram, differentiate, discriminate, distinguish, divide, explain, identify, integrate, inventory, order, organize, relate, separate, and structure.
To test the analyzing capabilities of learners after taking a course, frame questions to evaluate whether learners can classify, categorize, organize, and differentiate the key topics taught in the course. This level allows you to use a variety of question types, such as multiple-choice questions, match the columns, sequencing, etc. Let’s examine some examples.
Examples: Analyzing
In this matching question, the learners are asked to drag and drop a color to classify it as either a warm or cool color. This is a simple classification question, which lets learners analyze the different options shown on the screen and then answer the question based on their understanding.
Let’s examine another example.
ACME Enterprise’s onboarding process doesn’t include a walkthrough of the payroll structure. As they hire experienced staff, they assume that all hires will understand the payroll structure. What changes would you recommend to ACME’s onboarding process? Select all that apply.
A) Provide a handout to new joiners explaining the payroll structure.
B) Have your payroll supervisors explain the structure during the onboarding meeting.
C) Explain the structure via a self-paced video or eLearning module.
D) Ask employees to contact their bank when their first salary is credited.
In this question, learners are presented with a scenario that outlines the onboarding process at a fictitious company called ACME Enterprise. To answer the question, the learners need to analyze the current process and then choose appropriate enhancements to it.
Evaluating
At the Evaluating level, learners develop the ability to critique and form opinions on the presented information or data. Here, the learner also questions existing processes, content, or any other form of material and draws a new inference with justification. The verbs for writing objectives at this level include assess, critique, determine, evaluate, judge, justify, measure, recommend, appraise, argue, attach, choose, compare, defend, estimate, predict, rate, score, select, support, value, debate, decide, consider, and relate.
The questions that you write for the Evaluating level should help you to determine whether a learner can evaluate and justify their decisions. The learners can be asked to confirm their decisions using question types, such as open questions eliciting descriptive responses, multiple-choice questions, match the columns, sequencing, etc. If you are using MCQs to test the evaluating capabilities of your learners, you need to ensure that the quality of your distractors is high. Ensure that the incorrect options or distractors in your question make the learners think and evaluate what they’ve learned.
Examples: Evaluating
Let’s examine a sample question that tests the evaluating capabilities of learners. The question is:
Which mobile phone is better, Android or iPhone? Justify.
Every individual has some opinion on this debatable topic. To answer this question, the individual must recall, understand, apply, and analyze the information they possess on the given topic. Hence, a question like this falls under the Evaluating level, as the learners need to evaluate all that they know and then answer the question.
Another example is asking why movies and web-series are easily binge-watched, whereas people struggle to finish an educational video. Before answering, the learners will draw conclusions from their experiences and acquired knowledge. At first glance, all four options seem correct, but the learners should revisit each option and rethink whether it gives a strong reason why people watch one kind of video but not the other. The learners need to evaluate every option through the lens of their existing knowledge before choosing the correct answer. Below is the sample question:
Why do people find it difficult to watch an educational video on the internet, whereas they binge-watch web-series and movies for hours at a stretch?
A) Because nobody likes studying. Everybody wants to enjoy and have fun.
B) Because educational content is always boring. Movies and web-series have entertainment.
C) Because movies and web-series have engaging storytelling whereas educational videos are monotonous.
D) Because educational videos are followed by assignments, while nobody asks questions or gives marks for movies and web-series.
Creating
At this level, the learner puts together all the parts of what they have learned to form a new whole. It is the ability to create something new with the help of existing knowledge. This level encourages creativity, innovation, and originality. Learners at this level can be expected to produce solutions to real-world problems, formulate new structures and patterns, and propose alternatives to existing solutions.
Recommended verbs for writing effective creation level course objectives are: create, organize, produce, modify, invent, substitute, propose, imagine, rearrange, predict, improve, arrange, combine, integrate, assemble, collect, compose, construct, design, develop, formulate, manage, organize, plan, prepare, set up, and write.
To test the Creating level, curate questions that encourage learners to create, design, or develop something new. You can use question types such as open-ended questions that require descriptive answers, multiple-choice questions with plausible distractors, or hypothetical situations that challenge the learners’ problem-solving ability and creativity.
If you want to use multiple-choice questions to test the creating capabilities of learners, consider giving options that ask learners to validate the steps to follow for creating something new. In software training, the Creating level of learners is tested in the form of lab exercises. These lab exercises are designed to let the learners create something new based on what they’ve learned.
Examples: Creating
A good example of testing creation is assigning a task like asking learners to design and present a case study on emerging trends in Social Media Marketing. This question demands some prior knowledge of social media marketing. Here, the learners must gather information from multiple sources and put it together for their presentation.
Example Question: Design and present a case study on emerging trends in Social Media Marketing.
Another example to test learners who’ve learned Microsoft PowerPoint is to let them create a new template based on the branding guidelines of their enterprise. In this exercise, the learners will use their knowledge to create something new.
Example Question: Create a new PowerPoint template that aligns with the branding guidelines of your enterprise.
Let’s look at another example in multiple-choice question (MCQ) format to assess the learners’ creating ability.
Stan, a hardworking colleague and a good friend of yours, has been struggling to deliver effective presentations in front of senior management. As you are a fluent and confident presenter, Stan has reached out to you to help him develop his skills. He has asked you to write the slide notes for his upcoming presentation. What would you do?
A) You will respectfully decline, as you have plenty of tasks to complete.
B) You will sit with him for hours and write everything for him.
C) You will share the relevant resources with him and ask him to write on his own.
D) You will ask him to make the first draft and will help him to edit and polish the presentation.
In this example, the learners are tested on how they will approach the solution. Writing slide notes is something the learners already know how to do, but here they are asked to write for someone else, so they need to think and create an approach that is feasible for both.
Thank you for reading this blog on using Bloom’s Revised Taxonomy for objectives and quizzes. To learn more about writing effective quiz questions, do check out our course on Udemy.
Enhancing Assessment Over Time
Assessment should never be a static process—think of it as an ever-evolving toolkit rather than a dusty old lockbox. By consistently reflecting on and refining your approach, you can make assessments smarter, fairer, and far more meaningful for learners.
Here are a few timely tips for innovating assessment practices:
- Diversify Question Types: Don’t rely solely on multiple choice or true/false. Incorporate open-ended questions, scenario-based MCQs, and practical exercises like case studies or project assignments. This not only addresses different learning preferences but also allows for a deeper evaluation of understanding.
- Gather Feedback: Regularly solicit input from your learners about what’s working and what isn’t. Short surveys after quizzes or informal check-ins can uncover blind spots in your current approach.
- Review and Revise: Periodically revisit your assessments. Are your questions still aligned with your learning objectives? Are they challenging enough, or too challenging? Replace stale examples with fresh, relevant scenarios—perhaps you could draw inspiration from rapidly changing fields like technology or trending topics on platforms like LinkedIn.
- Leverage Analytics: Many learning platforms, such as Udemy or Coursera, provide analytics on how learners interact with your quizzes. Analyze question-level stats for patterns—are many struggling on a particular item? That might signal a confusing question or a gap in instruction (a minimal analysis sketch follows this list).
- Promote Real-World Application: Whenever possible, design assessments that connect back to realistic problems or professional contexts. For instance, ask learners to create marketing plans for a brand like Nike or develop a social media campaign for a local nonprofit. This not only tests knowledge, but also fosters job-ready skills.
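To make the analytics tip concrete, here is a minimal sketch of question-level analysis, assuming you can export quiz responses as a CSV file. The file name and column names (question_id, is_correct) are hypothetical placeholders; adapt them to whatever your platform actually exports.

```python
# A minimal sketch of question-level item analysis from an exported CSV.
# Assumed (hypothetical) columns: question_id, is_correct (1 or 0).
import csv
from collections import defaultdict

def item_difficulty(csv_path: str) -> dict[str, float]:
    """Return the share of learners who answered each question correctly."""
    correct = defaultdict(int)
    attempts = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            qid = row["question_id"]
            attempts[qid] += 1
            correct[qid] += int(row["is_correct"])
    return {qid: correct[qid] / attempts[qid] for qid in attempts}

if __name__ == "__main__":
    stats = item_difficulty("quiz_responses.csv")  # hypothetical export file
    for qid, share in sorted(stats.items(), key=lambda kv: kv[1]):
        flag = "  <-- review: many learners struggled" if share < 0.5 else ""
        print(f"{qid}: {share:.0%} correct{flag}")
```

A question that only a small share of learners answer correctly is worth a second look: it may be genuinely difficult, poorly worded, or covering material the course never taught clearly.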
By taking a reflective, creative, and data-informed approach, you’ll ensure your assessments continue to grow alongside your learners.
How to Brainstorm, Write, and Evaluate Learning Outcomes
When it comes to developing effective learning outcomes, there’s a bit of an art—and a bit of science—to the process. Here’s a practical approach for educators and trainers:
- Brainstorming Outcomes: Start by identifying what you want learners to walk away with. Ask yourself: What should learners know or be able to do after this course? Jot down ideas, drawing inspiration from industry standards, real-world scenarios, and any recurring gaps you’ve noticed among previous learners.
- Writing Effective Outcomes: Use actionable, specific verbs tied to Bloom’s Taxonomy. Instead of vague goals like “understand the material,” aim for clarity—such as “describe key concepts,” “apply theories to case studies,” or “analyze and compare different methodologies.” Make sure each outcome is measurable and observes the hierarchy from basic recall to higher-order thinking.
- Evaluating Learning Outcomes: Once drafted, assess whether your outcomes are clear, achievable, and aligned with course objectives. Peer review can help; ask colleagues or even students for feedback. As a final check, imagine a practical assessment that would measure each outcome—if you can’t think of one, it’s time to clarify or revise.
Using this method ensures your learning outcomes are meaningful, actionable, and pave the way for both effective teaching and powerful student learning experiences.
Approaches to Assessing Learning
There are a variety of approaches educators use to gauge whether learning objectives have been met. Assessments may range from formal tests and quizzes to more interactive and practical formats, depending on the desired learning outcome.
- Traditional Tests and Quizzes: These are often used for quickly checking recall and comprehension. Multiple choice, true/false, and short answer questions can efficiently determine if foundational knowledge has been absorbed.
- Projects and Presentations: For objectives that stretch into application, analysis, or synthesis, learners may be asked to complete projects, deliver presentations, or build portfolios showcasing their understanding and creativity. Think of a student using what they’ve learned in a chemistry class to design a safety protocol or present their findings in a simulated boardroom.
- Practical Demonstrations: Especially in skill-based training—whether programming, culinary arts, or first aid—demonstrating a process or technique often reveals learning far better than a written test.
- Peer Review and Group Work: Collaborative exercises can help assess abilities in analysis, synthesis, and evaluation, as learners critique each other’s work and defend their decisions. It’s like conducting a mock trial to evaluate arguments or working together to design a product prototype.
- Reflective Writing: Asking learners to reflect in journals, essays, or discussion forums encourages deeper analysis and self-evaluation, highlighting how well they’re internalizing and integrating the material.
Choosing the right assessment method often depends on the stage of Bloom’s Revised Taxonomy you aim to address. For example, a multiple choice quiz might be best suited for checking recall, while a research project is better for gauging synthesis or creation.
By blending these different approaches, you can tailor your assessments to genuinely reflect your learners’ progress at every level of the taxonomy.
Steps in the Assessment Process
The assessment process in education follows a deliberate sequence, helping educators ensure that learning objectives are met and continuously improved. Here’s how the typical cycle unfolds:
- Define Purpose and Objectives: Begin by clarifying the overarching goals of your course or curriculum. Specify what you expect students to know or do by the end of the instruction. This step involves distinguishing between broad curricular goals and more specific learning outcomes.
- Select What to Assess: Not every objective needs to be measured at once. Determine which projects, skills, or areas warrant close assessment this cycle. Prioritizing is key—think of it as triaging for maximum impact.
- Map the Curriculum: Curriculum mapping helps visualize where in the program each learning objective is taught, practiced, and assessed. This ensures alignment and reveals any gaps or redundancies.
- Develop Assessment Strategies and Rubrics: Choose appropriate assessment tools—tests, portfolios, presentations, or peer reviews—to capture students’ progress toward the objectives. Rubrics are especially handy for maintaining consistency and clarity in grading (a minimal rubric sketch follows these steps).
- Gather and Analyze Evidence of Learning: Collect data from assignments, projects, and exams. Analyzing this evidence illuminates trends and areas where students excel or struggle.
- Document and Report Results: Summarize your findings in a format accessible to faculty, administrators, and stakeholders. Reporting closes the loop and sets the stage for continuous dialogue.
- Use Results to Inform Improvement: Finally, take stock: What worked? What needs tweaking? Use these insights to enhance the curriculum, instructional techniques, or even the assessment tools themselves—ensuring a cycle of ongoing refinement, not just a stamp of approval.
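To illustrate the rubric step above, here is a minimal sketch of a rubric kept as plain data with a small scoring helper, so every grader applies the same criteria. The criteria, levels, and point values are illustrative assumptions, not taken from any particular framework.

```python
# A minimal sketch of a scoring rubric as plain data. Criteria, level
# descriptors, and point values are illustrative placeholders.
RUBRIC = {
    "Alignment with objectives": {3: "Fully aligned", 2: "Partially aligned", 1: "Not aligned"},
    "Evidence and analysis": {3: "Well supported", 2: "Some support", 1: "Unsupported"},
    "Clarity of presentation": {3: "Clear and organized", 2: "Mostly clear", 1: "Hard to follow"},
}

def total_score(awarded: dict[str, int]) -> int:
    """Validate the awarded levels against the rubric and sum them."""
    for criterion, level in awarded.items():
        if criterion not in RUBRIC or level not in RUBRIC[criterion]:
            raise ValueError(f"Unknown criterion or level: {criterion}, {level}")
    return sum(awarded.values())

print(total_score({
    "Alignment with objectives": 3,
    "Evidence and analysis": 2,
    "Clarity of presentation": 3,
}))  # 8 out of a possible 9
```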
By thoughtfully working through these steps, educators can create a robust framework for evaluating and improving learning outcomes—benefitting both learners and instructors.
Writing Assessment Reports
Just as designing a new PowerPoint template asks learners to synthesize what they’ve learned, writing an assessment report is about assembling information in a clear, purposeful way. An effective assessment report demonstrates your ability to collect data, analyze it, and then construct a well-supported set of findings and recommendations.
To craft a well-structured assessment report, consider these steps:
- Gather and Organize Data: Start by collecting relevant information from a variety of credible sources. Group and categorize your findings to highlight patterns or trends.
- Structure Your Report: Use a logical framework—introduction, methodology, results, analysis, and conclusion. This makes your report easy to navigate.
- Present Evidence Clearly: Use visuals, bullet points, and concise language to bring out the most important insights. Clarity is key.
- Offer Actionable Recommendations: Go beyond describing results. Suggest practical improvements or next steps based on your analysis. For instance, if you’re assessing a training program, propose changes grounded in the evidence you gathered.
- Review and Revise: Before finalizing, review your report for coherence and accuracy. Peer feedback can offer new perspectives or catch errors you didn’t spot.
Whether you’re reporting on the outcome of a case study or providing feedback on a creative project, the skills you develop at the ‘Creating’ level—synthesizing knowledge, critical thinking, and presenting solutions—will help you present a compelling and insightful assessment report.
Putting Assessment Results to Work
So, you’ve designed thoughtful questions, run through examples, and collected answers—now what? Assessment results shouldn’t just sit in a spreadsheet or be handed back for learners to file away. Instead, they’re a treasure trove of insights waiting to be mined. The real value comes from analyzing these outcomes and using them to refine both your instruction and your learners’ experiences.
Here are some ways you can put assessment results to good use:
- Adjust Your Approach: If lots of learners struggled on a particular section—say, designing a template in PowerPoint—it’s a sign to revisit that topic. Maybe offer another tutorial, run a hands-on workshop, or provide additional resources.
- Personalize Feedback: Use the results to guide individualized feedback. For instance, if someone consistently shines in “creating” tasks, challenge them with more open-ended projects. If another tends to select implausible MCQ distractors, offer targeted advice or one-on-one help.
- Spot Trends and Gaps: Over time, you’ll spot patterns—perhaps learners regularly excel at proposing solutions but falter when integrating multiple tools. Use this knowledge to tweak your curriculum or introduce supplemental materials, like LinkedIn Learning videos or templates from Canva and Visme.
- Celebrate Progress: Don’t forget to highlight improvement! Share successes (with permission), create badges or certificates, and encourage peer-to-peer learning. Nothing motivates like recognizing growth.
- Iterate and Innovate: Treat each set of results as feedback for your own process. Were your questions fair? Did they demand creative thinking, or simply rote responses? Use this intel to revise future assessments for even greater impact.
By making assessment an ongoing, cyclical process—rather than a one-way street—you’ll create a learning environment that’s dynamic and continuously improving.
Book a Call with Our CEO
Lokesh has over 20 years of experience managing custom e-learning, L&D, design, and web development projects. Book a slot on his calendar to explore how we can help you.
Thank you for considering Check N Click as your partner. We are dedicated to delivering outstanding eLearning content, graphic design, web development, & LMS administration services tailored to your unique needs.

