
Better Feedback for Scenario-Based eLearning Session Trailer

I’m presenting at the eLearning Guild’s Learning Solutions Conference again this year on “Better Feedback for Scenario-Based eLearning.”

You can watch a two-minute trailer for my session (if the video isn’t embedded below, watch it on YouTube).

When you create a scenario, you work hard to make it realistic and relevant for your learners. Unfortunately, even otherwise engaging scenarios sometimes include abstract feedback like “Incorrect. Please try again.” Simply saying the choice is right or wrong can make learners lose interest and focus, and it doesn’t help them learn from their mistakes.

You will learn how to show learners the consequences of their decisions rather than telling them they’re right or wrong in scenario-based eLearning. This is the difference between “intrinsic feedback” and “instructional feedback.” We will explore several different options for intrinsic feedback, such as progress meters, character responses, and environmental changes. You’ll learn guidelines for when to use immediate feedback and when to delay the feedback in scenarios. We’ll discuss how to design feedback to meet the needs of both novice and expert learners. You’ll also learn when direct instructional feedback is beneficial for learning.

In this presentation, you’ll learn:

  • multiple methods to show the consequences of decisions in scenarios
  • when to use immediate or delayed feedback
  • how to provide appropriate feedback for novice and expert learners
  • when to use intrinsic feedback (showing consequences) and instructional feedback (direct coaching)
  • how to work with SMEs to get the information you need to provide realistic consequences
  • how to write better feedback for short scenarios and complex branching scenarios

Read More about Feedback

This session draws from several previous blog posts (as well as some additional information from other sources).

If you’re attending the Learning Solutions Conference, I hope to see you there!

Using Time as Scenario Feedback

Nicole is creating a branching scenario in which nutrition counselors practice communication techniques for understanding their clients’ goals. She has written a simulated conversation between a counselor and a client. Her SME, Brian, provided this feedback after reviewing the prototype.

The conversation overall does a good job giving plausible choices for questions and showing realistic responses from the client. I want this to be really exciting for learners, like a game. Let’s add a timer for each decision. That way, they’ll be motivated to answer quickly and keep pushing through the scenario.

What do you think? How should Nicole respond to Brian?

A. That’s a great idea! I think that will enhance the learner experience.

B. I’m not sure. Let me do some research.

C. Timing might not be the best form of feedback for this particular course.

Remember your answer; we’ll come back to this question at the end of the post.


When Time is Effective

Time can be a very effective consequence in some learning situations. Check out the Lifesaver training on what to do in emergency situations for an example of time used effectively as feedback.

In the first scenario with Jake, you help someone in cardiac arrest. Each question has a 5-second timer, and you are scored on both accuracy and speed.

6/6 Right First Time - Avg Speed 1.32s

Later in the scenario, you simulate performing CPR by pressing two keys on your keyboard in the same rhythm as CPR. While you practice, you see a scale from good to bad showing how close you are to the ideal timing. This lets you adjust your rhythm. After you finish the 30 seconds of simulated CPR, you see a percentage score for your accuracy.

Scale showing good and bad for speed

This feedback works in the Lifesaver training because timing really is a critical part of the skill being taught. Speed of response matters in these emergency situations, as does knowing the right rhythm for CPR.

Time can work in other contexts too, like manufacturing, making sandwiches in a chain restaurant, or safety training.

When Time is Counterproductive

If the skill you’re practicing and assessing requires critical thinking and careful consideration, measuring time can be counterproductive. For simulated conversations where you want learners to pause and think about their options, it’s better to not use a timer.

You might be thinking, “But in a real conversation, people need to think quickly. Doesn’t that mean we should use timers?” That’s a question about fluency, which requires more practice over time. If your goal is to get people to that point of fluency, you might add a timer, but not for the initial practice. Teach the skill without a timer first, then provide additional practice opportunities to build fluency and speed in the skill.

Do Timers Improve Motivation?

Does a timer increase motivation for getting the right answer? Maybe, if the learners are already motivated prior to starting and time makes sense in the context of the activity. Many games use time as a way to keep players engaged and excited.

I suspect that, in practice, unnecessary timers encourage people to guess randomly, clicking whatever they can reach fastest. Time pressure may actually decrease learners’ motivation to be truly cognitively engaged with the learning experience. They may be more motivated to click through quickly than to read carefully and understand their decisions.


Timers can also create challenges for accessibility. Learners with visual impairments who use a screen reader and keyboard navigation will generally need more time to answer. Learners with mobility impairments may have trouble manipulating a mouse or keyboard quickly. Depending on your audience, adding timers may prevent some learners from being successful in your eLearning courses, even if they could do the real task (like having a conversation) without a problem.

Revisiting the Communication Scenario Example

Think back to Nicole’s scenario at the beginning of this post. She’s teaching communication skills to nutrition counselors, using a simulated conversation. Her SME, Brian, suggested adding a timer.

What do you think? Does a timer seem helpful in this situation?

Probably not. In this training, it’s more important for learners to think carefully about their choices and responses than to be speedy. Feedback like the client’s facial expression, or a scale showing the client’s motivation to change their eating behavior, would be more beneficial than feedback on how quickly they answered.

Your Examples?

Time can work as feedback in learning scenarios, but it should be used sparingly, and only when it is actually relevant to the skill being practiced or assessed.

Do you have any examples of time used successfully as feedback in a scenario? I’d love to see some more samples. Share them in the comments.


Branching Scenario Prototype in Twine

I built this branching scenario in the open source tool Twine. This scenario is moderately complex, with a total of 17 pages (or passages in Twine terminology) and 8 different endings. The ideal path has 5 decisions to reach the best conclusion.
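
If you haven’t worked in Twine before, a passage is simply a block of text plus the links out of it. Here is a minimal sketch of what one decision point might look like in Twee notation (Twine’s plain-text format); the passage name is an illustrative placeholder, and the links reuse actual choices from the scenario.

:: Ask about course length
Sophie asks Robert to clarify how long the courses actually are. Robert replies with the details.

[[Send Robert a price estimate.]]
[[Ask Robert what level of eLearning he wants.]]
[[Ask Robert some high level questions about goals and budget.|Send Robert some client screening questions.]]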

I generally use Twine as a prototype for review and testing purposes. You can use Twine for the finished product, though, especially if you do some formatting to make it look better. This version is currently pretty rough (just text on a white background), but that’s OK for a prototype.

If you use Twine as a prototyping tool, you can build the finished version in Captivate, Storyline, or another tool of your choice.

Try the scenario out yourself by clicking below (the scenario will open in a new tab).

Click to open the scenario in a new tab.

This is the map of the entire scenario. You can see how many of the choices are reused.

Twine map of the entire scenario

Want to learn how I created this?

Read the previous posts in the series to see my process for creating this scenario.


Can’t get enough? Check out all of my posts on Storytelling and Scenarios.



ID and eLearning Links (12/3/17)

  • This is the link I send people to debunk the blanket claims about “people forget X% after Y time.” The reality is that how much people forget depends on who your audience is, what they’re learning, and how you train them.

    tags: training instructionaldesign research myth

    • The amount a learner will forget varies depending on many things. We as learning professionals will be more effective if we make decisions based on a deep understanding of how to minimize forgetting and enhance remembering.

    • To be specific, when we hear statements like, “People will forget 60% of what they learned within 7 days,” we should ignore such advice and instead reflect on our own superiority and good looks until we are decidedly pleased with ourselves.

    • Many of the experiments reviewed in this report showed clearly that learning methods matter. For example, in the Bahrick 1979 study, the best learning methods produced an average forgetting score of -29% forgetting, whereas the worst learning methods produced forgetting at 47%, a swing of 76% points.

  • Mini-scenarios and branching scenarios provide better assessment than traditional multiple choice, but this resource offers some other options for deeper assessment that can still be scored by a computer.

    tags: assessment scenario e-learning instructionaldesign feedback

Posted from Diigo. The rest of my favorite links are here.


Writing Mistakes and Consequences

In my previous post, I explained how I write the ideal path for a branching scenario first. Once that is complete, I write the mistakes and the consequences for those choices.

First, Draft One Alternate Path to Its Conclusion

I start by writing a single alternate path from beginning to end. This is the easiest way for me to maintain continuity in the narrative. I have the ideal path already created at this point, but as you can see below, some choices are dead ends. Other choices are still marked TBD as placeholders.


In this case, I’ll go back to the very first decision and start writing the mistakes and consequences. When I was drafting the ideal path, I knew the worst choice was to send a price estimate right away. I’ll start there and finish writing that path. This mistake is a big one that will result in an immediate failure and restart.

Sophie provides a fixed-price quote based on her limited information. Robert immediately accepts without negotiating, making Sophie wonder if her price was too low.

One month into the project, Sophie realizes what a terrible mistake she has made. She severely underestimated the scope of the project and the time required. She’s frustrated, and her client is frustrated at how long everything is taking. Sophie works long hours for weeks, so many hours that her effective hourly rate is much lower than usual.

Once the project is finished, Sophie vows to never give a price estimate again unless she has more information.

Draft Another Alternate Path (and Another, and Another…)

Once the first alternate path is written, go back to the beginning of the scenario. Find the first decision point that isn’t fully fleshed out, and start writing there. As before, try to write one complete path from start to finish.

In this example, I go back to the first choice. In my first pass at drafting the ideal path, I left a placeholder for “Some other OK choice TBD.” Now I need to create a third option and follow through on the consequences.

In this case, a partially correct choice would be to ask specific questions about the project. (The best choice is to ask high level questions about the business goals first, rather than getting into the details of the training too early.)

Sophie realizes she doesn’t have enough information to provide an accurate estimate. She asks Robert to clarify how long the courses actually are.

Robert replies with the details.

  • Course 1 is a half day (3.5 hours).
  • Course 2 is one day (about 6 hours).
  • Course 3 is two days (about 12-13 hours).
  • Course 4 is four days (about 26 hours).

What should Sophie do next?

[[Send Robert a price estimate for the whole project.|Send Robert a price estimate.]]
[[Ask Robert what level of eLearning he wants.]]
[[Ask Robert some high level questions about goals and budget.|Send Robert some client screening questions.]]
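
(For anyone new to Twine: in the double-bracket link syntax, the text before the pipe is the choice the learner sees, and the text after it is the name of the destination passage. A link without a pipe uses the same text for both.)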

I continue like this with each choice until every path reaches its conclusion. I use my notes from my conversations with the SME and other stakeholders, plus any other research, to determine realistic consequences.

Good, Bad, and OK

By default, I usually aim for one choice that is the best (the ideal path), one choice that is clearly bad, and one choice that is OK or partially correct. If you’re stuck on what choices or mistakes to provide, try aiming for that pattern at each decision point.

I do vary that, especially later in the scenario. By the time someone has made multiple good choices and is nearing the end of the ideal path, there might not be any terrible choices. It may be Good-OK-OK instead. If someone has made a large mistake, there might not be any good options; maybe the best recovery is an OK choice.

In Twine, I tag my choices as Good, OK, and Bad. In PowerPoint or other tools, I usually color code them in a flow chart as green, yellow/orange, and red. This helps me keep track of the structure. It also helps voice over artists or actors know the right tone if this will be turned into audio or video later.
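
As a rough illustration, tags in Twee notation go in square brackets after the passage name. The passage names below are paraphrased placeholders; only the bracketed tags matter here.

:: Ask high level questions about business goals [Good]
:: Ask specific questions about course length [OK]
:: Send a price estimate right away [Bad]

Recent versions of Twine also let you assign a color to each tag, so the good, OK, and bad branches stand out in the story map.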

Let People Recover from Mistakes

Not every mistake should result in immediate failure and a restart. Just like in real life, sometimes people can recover from their mistakes. Branching scenarios can help people practice that skill of moving past an error.

In the example above, I want the learners to have a chance to realize they’re getting into the details too early and to back up. After asking about the length of the training, they can choose to ask high level questions instead. That gets them back to the ideal path.

Screenshot from Twine showing choices reused

Crossing Paths

I usually have some paths that cross, as shown in the screenshot above. This is one reason I really like Twine for drafting branching scenarios, regardless of what tool or format I use for the final version.

Often, I create several ways to get to the same path or ending. Reusing some choices or endings helps reduce the complexity of the scenario.

After a Failure

You have several choices of what to do next after you reach a failed ending of a scenario.

  • Ask learners to restart and try the scenario again. This is what I did above for the immediate failure after Sophie provided a price estimate based on woefully inadequate information.
  • Let learners back up one step so they can make a better choice.
  • Let learners go back to a milestone or checkpoint so they don’t have to redo the entire scenario. This is especially helpful in longer scenarios with many decision points.

I explain the pros and cons of these restart or retry options in detail in a previous post.
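
In Twee terms, a failure ending that offers the first two of these options might look something like this sketch; the passage names here are placeholders rather than the actual wording from the scenario.

:: The project goes badly [Bad]
One month into the project, Sophie realizes she severely underestimated the scope and the time required.

[[Start the scenario over|Intro]]
[[Go back and rethink that decision|Previous decision]]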

Testing and Revising

Once I have a complete draft of everything, I test the options, reading through each path and revising to fix anything broken. I prefer to do this testing at least a day after I finish the initial draft. It’s easier to see my mistakes when I have time to set it aside for a while and come back later.

I also review my notes. Did I include all of the mistakes that were critical to include? Does the flow make sense? Is there anything too repetitive that should be collapsed into a single path or rewritten?

After that, it’s ready for me to send for review by the SME and other stakeholders. I post the Twine version as an interactive prototype so they can test it themselves. I also export to Word so it’s easier to track changes in wording (especially with multiple reviewers).

Looking for more?

Check out my previous posts on branching scenarios: