Benefits of Scenario-Based Learning

Why are scenarios effective for learning? They provide realistic context and emotional engagement. They can increase motivation and accelerate expertise. Here’s a selection of quotes explaining the benefits.

Accelerating Expertise with Scenario-Based e-Learning – The Watercooler Newsletter: Ruth Clark on how scenario-based e-learning accelerates expertise and when to use it

What is Scenario-Based e-Learning?

  1. The learner assumes the role of an actor responding to a job-realistic situation.
  2. The learning environment is preplanned.
  3. Learning is inductive rather than instructive.
  4. The instruction is guided.
  5. Scenario lessons incorporate instructional resources.
  6. The goal is to accelerate workplace expertise.

As you consider incorporating scenario-based e-Learning into your instructional mix, ask whether the acceleration of expertise will give you a return on investment. For example, interviews with subject matter experts indicated that automotive technicians must complete about 100 work orders to reach a reasonable competency level in any given troubleshooting domain. Comparing delivery alternatives, on-the-job training would require 200+ hours, instructor-led training around 100 hours, and scenario-based e-Learning simulations approximately 33–66 hours.
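To make that comparison concrete, here's a back-of-the-envelope sketch. The hours per method come from the estimates quoted above; the cohort size and loaded hourly cost are invented placeholders, so substitute your own figures.

```python
# Back-of-the-envelope ROI comparison for time-to-competency.
# Hours per method are the estimates quoted above; cohort size and
# hourly cost are hypothetical placeholders.

hours_to_competency = {
    "on-the-job training": 200,        # low end of the "200+ hours" estimate
    "instructor-led training": 100,
    "scenario-based e-learning": 66,   # high end of the 33-66 hour range
}

technicians = 50       # hypothetical cohort size
cost_per_hour = 40     # hypothetical fully loaded cost per training hour, in dollars

for method, hours in hours_to_competency.items():
    total_cost = hours * technicians * cost_per_hour
    print(f"{method}: {hours} h per technician, ${total_cost:,} for the cohort")
```

Even using the conservative high end of the simulation range, that works out to roughly 134 hours saved per technician compared to on-the-job training.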

Finally, many learners find scenario-based e-Learning more motivating than traditional instructional formats.  Solving a work-related problem makes the instruction immediately relevant.

The Benefits of Scenario Based Training: Scenario-based training better reflects real-life decision making

There is no linear path through the situations learners are subjected to. The situations are complex. They often fail, and they learn by reflection, becoming much better at the judgements they make next time, even though next time the environment and the scenarios presented are different.

After completing a few exercises, they build their own view of the patterns that are evident and are able to move into a new scenario with confidence, even if the environment and scenario are radically different.

Pausing to reflect before plunging into the next scenario helps build the patterns in the participants’ minds that are the evidence that they have learnt.

Quizzes based on scenarios, each with a “What would you do next?” question, build quick and fun repetition into the training programme, helping transfer from short-term memory to long-term memory.
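As a rough illustration of how such a quiz can hang together, here is a minimal sketch of a branching “What would you do next?” exercise. The troubleshooting content, node names, and feedback lines are all invented for illustration; real content would come from subject matter experts.

```python
# Minimal sketch of a branching "What would you do next?" scenario quiz.
# Each node holds a situation plus options as (text, next_node, is_good).
# All content here is invented for illustration.

SCENARIO = {
    "start": {
        "situation": "A customer's engine stalls at idle but runs fine at speed.",
        "options": [
            ("Pull diagnostic trouble codes first", "codes", True),
            ("Replace the idle air control valve immediately", "parts", False),
        ],
    },
    "codes": {
        "situation": "The scan shows a lean-mixture code.",
        "options": [
            ("Inspect for vacuum leaks", None, True),
            ("Clear the code and return the car", None, False),
        ],
    },
    "parts": {
        "situation": "The new valve didn't help, and the customer is back.",
        "options": [
            ("Start over and pull diagnostic codes", "codes", True),
        ],
    },
}

def run(node_key="start"):
    """Walk the scenario, asking 'What would you do next?' at each node."""
    while node_key is not None:
        node = SCENARIO[node_key]
        print(node["situation"])
        print("What would you do next?")
        for i, (text, _, _) in enumerate(node["options"], 1):
            print(f"  {i}. {text}")
        choice = int(input("> ")) - 1
        _, node_key, is_good = node["options"][choice]
        print("Good call.\n" if is_good else "That choice usually backfires.\n")

if __name__ == "__main__":
    run()
```

Even this toy structure supports “falling forward”: a poor choice doesn’t end the exercise, it routes the learner to a node where they live with the consequences and recover.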

Scenario-based-learning: PDF explaining theory and how to decide if SBL is the right strategy

Scenario-based learning is based on the principles of situated learning theory (Lave & Wenger, 1991), which argues that learning best takes place in the context in which it is going to be used, and situated cognition, the idea that knowledge is best acquired and more fully understood when situated within its context (Kindley, 2002).

SBL usually works best when applied to tasks requiring decision-making and critical thinking in complex situations. Tasks that are routine to the students will require little critical thinking or decision-making, and may be better assessed using other methods.

Checklist: Is SBL the right option? (Clark, 2009)
* Are the outcomes based on skills development or problem-solving?
* Is it difficult or unsafe to provide real-world experience of the skills?
* Do your students already have some relevant knowledge to aid decision-making?
* Do you have time and resources to design, develop, and test an SBL approach?
* Will the content and skills remain relevant for long enough to justify the development of SBL?

Learning through storytelling | Higher Education Academy: Why storytelling works for learning

Stories are effective tools for learning due to their ability to facilitate the following cognitive processes: i) concretizing, ii) assimilation, and iii) structurizing (Evans and Evans 1989).

Top 7 Benefits You Get From Scenario-Based Training: Infographic on benefits: “falling forward,” accelerating time, critical thinking, shared context, engaging emotions, retention, and triggering memories

Scenarios allow “falling forward”: providing a safe space to fail helps build the capacity to fix mistakes as you would in real life.

Book Review: Performance-Focused Smile Sheets

On a scale from 1 to 5, how useful are your current level 1 evaluations or “smile sheets”?

  1. Completely worthless
  2. Mostly worthless
  3. Not too bad
  4. Mostly useful
  5. Extremely useful

Chances are, your training evaluations aren’t very helpful. How much useful information do you really get from those forms? If you know that one of your courses is averaging a 3.5 and another course is averaging a 4.2, what does that really mean? Do these evaluations tell you anything about employee performance?

Personally, I’ve always been a little disappointed in my training evaluations, but I never really knew how to make them better. In the past, I’ve relied on standard questions used in various organizations that I’ve seen over my career, with mixed results. Will Thalheimer’s book Performance-Focused Smile Sheets changes that by giving guidelines and example questions for effective evaluations.

Raise your hand if most of your evaluation questions use Likert scales. I’ve always used them too, but Thalheimer shows in the book how we can do much better. After all, how much difference is there between “mostly agree” and “strongly agree,” or other vaguely worded scale points? And what counts as an acceptable answer: is “mostly agree” enough, or does only “strongly agree” signal a quality course?
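As a quick, invented illustration of how little an average tells you: the two response distributions below both average exactly 3.5 on the five-point scale above, yet describe very different courses. (The counts are made up for the example.)

```python
# Two invented rating distributions on the 1-5 scale above.
# Both average exactly 3.5, but they describe very different courses.

from collections import Counter
from statistics import mean

course_a = [3] * 25 + [4] * 25             # uniformly lukewarm
course_b = [1] * 10 + [4] * 35 + [5] * 5   # mostly solid, but ten learners
                                           # found it "completely worthless"

for name, scores in (("A", course_a), ("B", course_b)):
    print(f"Course {name}: mean {mean(scores):.1f}, "
          f"counts {sorted(Counter(scores).items())}")
```

The identical means hide the fact that a fifth of Course B’s learners rated it completely worthless, which is exactly the kind of signal that better-designed evaluation questions try to surface.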

The book starts with several chapters of background and research, including how evaluation results should correspond to the “four pillars of training effectiveness.” Every question in your evaluation should lead to some action you can take if the results aren’t acceptable. After all, what’s the point of including questions if the results don’t tell you something useful?

The chapter of sample questions with explanations of why they work and how you might adapt them is highly useful. I will definitely pull out these examples again the next time I write an evaluation. There’s even a chapter on how to present results to stakeholders.

One of the most interesting chapters is the quiz, where you’re encouraged to write in the book. Can you identify what makes particular questions effective or ineffective? I’d love to see him turn this book into an interactive online course using the questions in that quiz.

I highly recommend this book if you’re interested in creating evaluations that truly work for corporate training and elearning. If you’re in higher education, the book may still be useful, but you’d have to adapt the questions since the focus is really on performance change rather than long-term education.

The book is available on Amazon and on SmileSheets.com. If you need a discount for buying multiple copies of the book, use the second link.

When Is Audio Narration Helpful?

In a discussion on eLearning Heroes, Judith Reymond asked about the research on when or whether audio narration is helpful to adult learners.

In e-Learning and the Science of Instruction, Clark and Mayer say that the research generally supports pairing narration with on-screen visuals: adult learners retain more from narration plus visuals than from reading on-screen text. They call this the “modality principle.”

Generally speaking, when you have narration, you shouldn’t also have that same text on the screen. This is called the “redundancy principle.” Clark and Mayer note some exceptions when text should be shown on screen (pp. 87-88, 107-108 in the 1st ed):

  • Complex Text: Complex text like mathematical formulas may need to be both on-screen and in narration to aid memory. (In practical experience, I also do this for text that has to be memorized word for word, such as screening questions for addiction.)
  • Key Words: Key words highlighting steps in a process or technical terms
  • Directions: Directions for practice exercises. “Use onscreen text without narration to present information that needs to be referenced over time, such as directions to complete a practice exercise.”
  • No Graphics: When there are no graphics or limited graphics on the screen
  • Language Difficulties: When the audience has language difficulties. I have used redundant on-screen text for an audience with very low literacy and a high percentage of learners with English as a second language. It might be enough to simply provide a transcript or closed captions in those situations so people who don’t need it can ignore or turn off the text.

In practical terms, I’ve found that if every page has narration and you suddenly have no narration for a practice exercise, some learners think something’s broken on the page. I generally have the narrator say something short to introduce the practice exercise, but leave the directions as on-screen text.

However, it’s also tiring to listen to a voice. I usually don’t provide audio feedback on practice activities, to give people a break. I’ll sometimes use other kinds of interaction or content delivery, such as tabs or “click to reveal” text, as a break from the audio.

In the book, Clark and Mayer say this:

“Does the modality principle mean you should never use printed text? Of course not. We do not intend for you to use our recommendations as unbending rules that must be rigidly applied in all situations. Instead, we encourage you to apply our principles in ways that are consistent with the way that the human mind works—that is, consistent with the cognitive theory of multimedia learning rather than the information delivery theory.”

The principle of avoiding redundant on-screen text is sometimes treated as sacrosanct. I’ve seen some big names in the field practically yell that it’s a firm rule that should never be broken. In real life, it’s not as clear-cut, as even Clark and Mayer acknowledge. Plenty of redundant on-screen text has no business being there; generally, it shouldn’t be there, and you need a real reason to break the redundancy principle. But if you have that reason, be thoughtful and intentional about the on-screen text you provide.

What are your experiences with audio, especially with on-screen text? What have you found works with your audiences?

Do Learning Styles Really Work?

Does tailoring your content for different learning styles really work? Maybe there’s a better use of your time and resources.

View the presentation for more. (If you’re reading this in email or a feed reader and don’t see anything below, you may have to click through to view the presentation.)

Debunker Club Works to Dispel the Corrupted Cone of Learning

A new group called The Debunker Club is working to dispel myths and misinformation in the learning field. From their website:

The Debunker Club is an experiment in professional responsibility. Anyone who’s interested may join as long as they agree to the following:

  1. I would like to see less misinformation in the learning field.
  2. I will invest some of my time in learning and seeking the truth, from sources like peer-reviewed scientific research or translations of that research.
  3. I will politely, but actively, provide feedback to those who transmit misinformation.
  4. At least once a year, I will seek out providers of misinformation and provide them with polite feedback, asking them to stop transmitting their misinformation.
  5. I will be open to counter feedback, listening to understand opposing viewpoints. I will provide counter-evidence and argument when warranted.

This year, coinciding with April Fool’s Day 2015, the Debunker Club is running an experiment. We’re making a concerted effort to contact people who have shared the Cone of Experience (also known as the Cone of Learning or the Pyramid of Learning).

Many iterations of this cone exist. A Google image search for “cone of learning” returns dozens of results, most of which include the fabricated percentages. If you’ve seen a chart claiming that “people remember 10% of what they read, 20% of what they hear, 30% of what they see,” and so on, you’ve seen a variation on this theme.

Image search results for Cone of Learning

The original cone was developed by Edgar Dale and didn’t include any numbers. The later versions are the “corrupted cone,” with fictitious statistics added. Will Thalheimer’s post from 2009 debunking these claims is where I learned it was incorrect. Common sense might give you a hint that these numbers aren’t really based in research, though. Think about it: how many times have you seen research where all the categories break into even 10% segments?

As part of the Debunker Club’s efforts, I discovered a post on Dane’s Education Blog called A Hierarchy of Learning. Although this post cites a great debunking article (Tales of the Undead…Learning Theories: The Learning Pyramid), the blog author says only that he “appreciate[s] what it conveys.”

I left the following comment on his Learning Pyramid post.

Thanks for collecting so many resources on your blog. I can see that you’ve worked really hard to share many links and ideas with your readers.

However, the information above, though it may appear to have scientific support, has been exhaustively researched and found to have no basis in science. In fact, the “Tales of the Undead” link you cite debunks it.

An article from the scientific journal Educational Technology shows no research backing for the information. (Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Mythical Retention Chart and the Corruption of Dale’s Cone of Experience. Educational Technology, Nov/Dec 2014, 54(6), 6-16.)

The information presented is likely to produce more harm than good, promoting poor learning designs and hurting learners.

While we might abstract some beneficial notions from the percentages portrayed in the misleading information — namely that encouraging realistic practice has benefits — there are numerous faulty concepts within the bogus percentages that can do real harm. For example, by having people think that there are benefits to seeing over hearing, or hearing over reading, we are sending completely wrong messages about how learning works.

Most importantly, recent advances in learning science have really come together over the last two decades. The misleading information was first reported in 1914, with no research backing. It’s better to follow more recent findings than information that has no scientific basis. See, for example, the book Make It Stick by Brown, Roediger, and McDaniel. Julie Dirksen’s Design for How People Learn is another great selection.

I’m part of a larger community of folks called the Debunker Club who are attempting to encourage the use of proven, scientifically-based learning factors in the learning field.

I’m going to be posting about this misleading information on my blog. I hope you’ll comment and respond to my post if you wish. I (and the debunker community in general) want to learn how other people feel about the issues and ideas surrounding the original information and our approach to debunking myths and sharing evidence.

If you’re interested in dispelling misinformation and improving the learning field, please join the Debunker Club and participate in the conversation.