Category: Workplace Learning

Should We Create Courses or Just Performance Support?

In my last post, I shared some thoughts about why people need to actually learn and remember things, rather than assuming we can always look them up. This post continues that discussion with the question of whether we should create courses or whether informal learning and performance support are sufficient.


Question 2: Should We Create Courses?

Another argument is that while people do need to learn, they can do it all on the job with performance support and coaching. According to this perspective, informal and nonformal training is good, but formal training and courses are a waste of time.

Alexander Salas has argued against courses as an “academic model.” His LinkedIn post asking to “stop giving me courses” has generated almost 100 comments to date.

What Is a “Course”?

Some of our disagreement is due to differing definitions of “course,” which Alexander sees as a purely academic tool, divorced from practice and feedback. I think courses can and should include practice and feedback.

Alexander defined a course as “an academic tool to achieve educational objectives.” If you define course as something that can only be used in academia, obviously it doesn’t fit with workplace training. I’m not sure that’s a useful definition though. If courses are only academic, what do you call formal workplace training?

In that same conversation, Mirjam Neelen explained, “For me, a course means nothing but ‘a formally designed learning experience’ and can include many different instructional AND learning methods. A course can include on the job learning, coaching, performance support tools, in other words, the whole shebang.”

Mirjam’s examples might be a bit too broad, but I agree with the first part of a “formally designed learning experience.”

My definition: A course is a formally designed learning experience with a defined start and end point (either in time or content).

I want to differentiate a particular course from a whole curriculum or longer program, and I think ongoing performance support and coaching should be excluded from the definition: they have no defined end point, and coaching and on-the-job learning aren't formally designed either.

Five Moments of Need

I find it helpful to refer to the Five Moments of Need for these types of discussions. Conrad Gottfredson and Bob Mosher have identified five different types of situations when learning needs to occur. Here’s how they define the five moments:

  1. When people are learning how to do something for the first time (New);
  2. When people are expanding the breadth and depth of what they have learned (More);
  3. When they need to act upon what they have learned, which includes planning what they will do, remembering what they may have forgotten, or adapting their performance to a unique situation (Apply);
  4. When problems arise, or things break or don’t work the way they were intended (Solve); and,
  5. When people need to learn a new way of doing something, which requires them to change skills that are deeply ingrained in their performance practices (Change).

— Conrad Gottfredson and Bob Mosher in Are You Meeting All Five Moments of Learning Need?

When it’s a New skill, formal training is usually the fastest way to get people up to speed. It may not be the only way, but it gets people to the desired level of competency faster. If your work mostly deals with Apply, Solve, and Change, courses might not be the best approach.

Mosher and Gottfredson argue for performance support throughout the learning process. They're right to criticize the field for focusing solely on single-event training, which is really most appropriate for New needs (and sometimes More). However, a performance support approach doesn't mean we should never create courses or provide formal training. It means formal training isn't our only solution.

Faster Expertise

Ruth Clark has written about accelerating expertise with scenario-based elearning (to be clear: I consider that a course). In one example she describes, automotive technicians needed to complete 100 work orders before reaching competence.

    • If they do on-the-job training (OJT), that takes them 200 hours.
    • If they do instructor-led training (ILT), they can cut the time in half to 100 hours.
    • If they do scenario-based elearning, it only takes 33-66 hours to reach competence.
Chart: hours for automotive technicians to reach competence with OJT, ILT, and scenario-based elearning (based on data from Ruth Clark's book Scenario-Based eLearning and her article Accelerating Expertise with Scenario-Based eLearning)
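To put those figures in per-work-order terms, here's the implied time per work order (my own back-of-the-envelope arithmetic from the numbers above, not a calculation from the book):

```latex
\[
\text{OJT: } \frac{200\ \text{h}}{100} = 2\ \text{h each} \qquad
\text{ILT: } \frac{100\ \text{h}}{100} = 1\ \text{h each} \qquad
\text{Scenario-based: } \frac{33\text{--}66\ \text{h}}{100} \approx 20\text{--}40\ \text{min each}
\]
```

In other words, scenario-based practice effectively compresses each work order's worth of experience into 20-40 minutes.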

Can technicians get there with OJT and no courses? Sure, but you waste money and time doing so. The best decision for the business and the learners is to create a scenario-based elearning course. In this case, ILT might be a viable solution too, since it cuts the time to expertise in half. Regardless of the method or technology, formal training means becoming competent at least twice as fast as just learning as you go.

For a new skill, learning how to do something for the first time, you need formal training to establish the foundational skills. Learning a new skill on the job means more errors, greater frustration, and a longer time to competence. Without training, people may also develop faulty mental models of how things work, and unlearning those faulty models is harder than learning correctly from formal training in the first place.

Practice with Feedback

One of the criticisms of courses raised in this discussion was that how people really learn is through practice with feedback. That is clearly true; practicing a skill while getting feedback to adjust and improve your performance is critical. I argue that good courses should (and do!) include opportunities for practice.

In both academic and workplace training courses, we can spend time on practice, not just information sharing. When I taught K-12 music and band, we probably spent four times as much time singing or playing as we did talking about theory. We spent most of our course time doing the thing rather than talking about the thing. That's my background, and that's still how I try to approach workplace training.

For training network engineers, I've used paper cutouts, stickers with icons, and digital graphics so learners could practice making network diagrams to solve a problem in a case study. For training WIC counselors, one way I provide practice opportunities is with branching scenarios that simulate conversations. For bulldozer safety training, I gave learners a simulated dashboard with a warning light and asked them to decide what to do next. For food safety training, I gave learners a picture of an employee and asked them to identify the violations and how to meet the standards.
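As a side note for anyone building simulated conversations like the WIC counselor example: under the hood, a branching scenario is just a graph of dialogue nodes and choices. Here's a minimal sketch in TypeScript; the names and fields are hypothetical, not from any particular authoring tool.

```typescript
// Minimal branching-scenario structure (hypothetical, for illustration only).
// Each node shows the client's dialogue; each choice links to a follow-up node.
interface Choice {
  text: string;        // what the counselor says
  nextNodeId: string;  // where this choice leads
}

interface ScenarioNode {
  id: string;
  clientSays: string;  // the simulated client's dialogue
  choices: Choice[];   // an empty array marks an ending
}

const scenario: ScenarioNode[] = [
  {
    id: "start",
    clientSays: "I know I should eat better, but I don't have time to cook.",
    choices: [
      { text: "You really need to make time for your health.", nextNodeId: "defensive" },
      { text: "What does a typical weeknight look like for you?", nextNodeId: "opens-up" },
    ],
  },
  // ...additional nodes for each branch
];
```

A scenario engine then just renders the current node and follows nextNodeId for whichever choice the learner picks; the client's response at the next node is the intrinsic feedback.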

Every one of those practice examples above was part of a course. Any definition of course that excludes practice isn’t a viable definition.

Use Both Courses and Performance Support

The solution here isn’t to only use courses and forget about everything else. The question shouldn’t be should we use courses or performance support; this doesn’t have to be either/or. The answer is to use both courses and performance support, depending on the learner and organization needs.

Your Thoughts?

When do you use a course as a solution versus performance support? How do you determine which solution (or combination) is the best path? Let me know in the comments.

Do People Need to Learn, or Can They Look It All Up?

I have been part of several discussions recently that questioned the value of creating courses and delivering formal training. There's a perception among some people (including some L&D folks) that as long as you have Google and a good network of resources, you can look up anything you need. The other, related idea is that everything can be learned on the job with performance support, without formal training. In this post, I'll examine the first question.


Question 1: Do People Need to Bother Learning?

The first argument asks if people need to bother learning anything at all, or if they can just look it up when they need it. Do you really need to remember if you have a mobile phone and a search engine always available?

For example, Bruce Graham started a lively conversation in the Articulate Heroes community by describing someone he met at a conference. She said she takes all the elearning in their organization, regardless of quality, but doesn’t bother to remember much because she knows she can always look it up later. Bruce explains that Henry Ford approached building cars the same way; he found ways to assemble a group of experts and made them available any time he had a question.

He did not need to learn, just have access to knowledge.

If this is how people are REALLY now using online learning, and using our product(s), do all our clever animations, graphics, interactions and so on actually matter any more?

Let’s just give out facts, because millennials know how to access it, and will go back when they need it.

Why do they need to bother learning?

Sometimes You Can Look It Up

I think plenty of things can just be looked up at the time of need. I don’t need to memorize the recipes for most of the dishes I cook; I can just read the recipe to get the exact amounts and steps. For those sorts of tasks, we should probably be creating job aids (recipes and hints for work tasks) rather than courses. At a minimum, we should be creating training plus job aids, or training that helps people learn how to use performance support.

I recently wrote a course where one of the main goals is for people to know where to find and how to use the resources. We don’t care if they can remember all 10 points and 50+ subpoints of this policy. We care that they’re aware that the policy exists and that they can navigate the website to look up the policy when they need it. Therefore, the content delivery is very light. The practice activities are questions like “look up in Table 1 what you need for this safety precaution” and “use this self-assessment to determine what components of the standard you’re currently meeting or not.”

Deeper, Internalized Expertise

Some tasks require a deeper expertise though. A musician can’t stop in the middle of a song to look up a fingering. A salesperson can’t ask a customer to “hold that thought” while he fires up the elearning on objection handling. A doctor can’t ask a patient to wait while she pulls up the example audio of what a heart murmur sounds like for comparison. A line manager can’t walk out in the middle of a meeting to review the online course about delegation. Those skills require internalizing knowledge deeply enough that you can use them at the time of need. You can’t have everything be “just in time.”

Finding Information Isn’t Learning

Steve Flowers argued that searching is fine for finding information, but that’s not the same as training.

We have unprecedented access to good and bad information. To perfectly valid facts and information that can help us get things done. To perfectly misleading and wrong information that can lead us down the wrong path.

This is the core problem with the way many view training and learning. This conflation of movement of information with the efficiency of a training solution is flat wrong. It’s not about storing information in our heads. It’s about being able to adapt and adapt quickly to whatever challenges the task you’ve trained for presents. This rarely hinges on our ability to recall information. Doesn’t mean information isn’t important. But that’s only one ingredient.

Cake != Flour. It’s more than that.

Information != Knowledge != Behavior != Task Success != Results

Work is more complicated than, “Let me Google that” for many types of things that we do. Adept search skills are great. Helpful. But that domain expertise does not transfer to all other domains equally.

I think Steve makes a really important point here. Training is more than just sharing information. It’s also providing people opportunities to practice skills and get feedback to improve performance.

Which Tasks Need to Be Trained?

Sometimes, looking things up (like a recipe or a table of standards) is enough. Sometimes, it’s not. So how do we figure out which tasks are skills that need to be trained and which ones just need a job aid or a searchable resource?

Julie Dirksen shared an idea in her Learning Solutions Conference presentation that I think helps make that distinction.

Think of a task or topic where you might create training or performance support. Is it reasonable to think that someone can be proficient without practice? If people can be proficient without practice, you don’t need training. If people need practice to be proficient, that’s a skill where training might be helpful.

Performance support might also be helpful, especially after the initial training when people are practicing on the job. If it’s something you need to practice, just searching for information won’t be enough though.

Should We Create Courses?

In my next post, I’ll expand more about whether or not we should create courses. (Hint: I think we should, at least sometimes.)

What I Learned at LSCon18

Last month, I attended the Learning Solutions 2018 Conference in Orlando. Once again, it was a great experience. I had fun meeting people in person whom I have known online for years, like Judy Katz, Tracy Parish, Cammy Bean, and Clark Quinn, plus seeing people again from last year.

Now that I’ve had a few weeks to process and reflect, I want to summarize some of what I learned. I did a similar post last year, and it helped me reinforce and remember what I learned. This is my own “spaced repetition” to help me use these ideas. These comments won’t always be the most important thing each speaker said, but one thing I took away from the session and think I can apply in my own work.

Diane Elkins: Microlearning Morning Buzz

One of the things I appreciated about Diane's discussion was the balanced approach. This wasn't the "microlearning will solve all of our problems!" hyperbole I see from many sources. We talked about how microlearning is sometimes a solution, whether on its own or in combination with other forms of training.

Diane also shared a really great idea for training where you have to hit a certain minimum time to meet a legal or regulatory requirement. Instead of a big content dump (which is sometimes padded just to fill the required time), why not deliver the minimum content plus a lot of practice and maybe reflection?

As a side note, Diane’s hand puppet demonstrations of pointless conversations with SMEs are hilarious.

Kai Kight: Composing Your World

As a musician and former music educator, it was really fun to hear a session start with violin and to watch the reactions of the audience.

One of the ideas from his keynote was to not get so wrapped up in the notes that you forget who you’re playing for. That applies to our work (and many fields); we have to always keep thinking about the audience and what they need.

Kevin Thorn: Comics for Learning

This was a session I attended specifically because it’s outside of what I normally do for work. I’m not quite sure how I’m going to apply this yet, but thinking about the various visual styles for comics gives me some new ideas.

Kevin shared a ton of research, examples, and resources. One that I need to dig into more is the Visual Language Lab.

Ann Rollins and Myra Roldan: Low-Cost, High-Impact AR Experiences

This was another session where I have no experience with the topic, but was curious to see some possibilities. My biggest takeaway is that several tools for simple AR are pretty affordable and easy to get started with. Simple things like showing a video or a little information to explain features are very doable. We tested Zappar in the session to create a quick sample, but Layar also looks promising.

Julie Dirksen: Strategies for Supporting Complex Skill Development

I took 5 pages of notes from this session, so this could be several blog posts on its own.

How do you know if something is a skill or not? Is it reasonable to think that someone can be proficient without practice? If not, it’s a skill.

One idea I’m going to start using immediately is for self-paced elearning where I ask learners to type a longer answer. I have been using a model answer to compare as a way to help learners evaluate their own work, which is a good start. Julie talked about giving learners a checklist to guide their self-evaluation even more. I can implement that right now.

Tracy Parish: Free eLearning Design Tools

This was a discussion about free tools and how people use them, based largely on Tracy’s immense list of free tools.

Platon: Powerful Portraits

This was an engaging keynote because he has so many great stories about famous (and not famous) people.

One idea he shared was that if you can get people to see themselves in the story you put forward, maybe you can build bridges to connect people.

Cammy Bean: Architecting for Results

The big idea from this presentation was to think about broader systems for learning. Instead of content in a single event, it’s a journey over time. It’s a mix of what you do before the training, during the training, and after the training, but we often focus just on the middle portion.

The mix is going to be a little different for every program. This model is one way to think about the different pieces.

  1. Engage
  2. Diagnose
  3. Learn/Understand
  4. Apply
  5. Assess
  6. Reinforce

The simplified version is Prepare – Learn – Practice – Reflect.

Connie Malamed: Design Critique Party

The best thing I took away from this session was actually the protocol for requesting and giving critiques.

The protocol for the designer requesting critique:

  1. State your objective.
  2. Walk people through the experience.
  3. Say what you'd like to get feedback on.
  4. Become an impartial observer.

The protocol for giving critique:

If your objective is ____________, then _________ [critique].

Joe Fournier: Novel Writing Tricks

One idea from this session was about how the idea of a theme might apply to learning. The theme ties to the objective. It’s the emotional or value shift in a story. In learning, the theme might be how reporting employee theft is better for everyone.

Panel: Evolution of Instructional Design

This panel included Connie Malamed, Diane Elkins, Kevin Thorn, and Clark Quinn.

Diane Elkins pointed out that in classroom training, we often ask questions that only a few people will answer, and we don’t track them. If it’s OK to do that in classroom training, why not do it in elearning too? It’s OK to ask learners to type longer answers even though some of them will skip it. Let’s not punish the people who are willing to do the work and learn more just because we can’t track it or not everyone will do it.

David Kelly moderated the panel, and he pointed out how our field has a lot of “shiny object syndrome.” We’re often looking for “one tool to rule them all” that will fix every problem in every situation. That just doesn’t exist.

David Kelly: Shifting from Microlearning to Micromoments

There is no definition of microlearning. There are lots of opinions, some of which are labeled as definitions.

Maybe we should be thinking about micro as in microscope: something that narrows the scope of focus to a tiny part of the whole.

Bethany Vogel & Cara Halter: cMOOCs can be Effective

They used Intrepid Learning as the platform, which may be worth exploring more.

In their model, a MOOC is a time-bound online program that contains highly contextualized spaced, micro, and social learning. “Massive” and “Open” aren’t really part of their model, so I admit I’m not sure why they’re calling it a MOOC (other than that’s what their clients are asking for). I think you could do this same model with moderated social, spaced learning and call it “blended” or a “journey.” The experience was good, even if I might quibble with the label.

Photos

You can get a feel for the conference here.

 

Using Time as Scenario Feedback

Nicole is creating a branching scenario for nutrition counselors to practice communication techniques for better understanding their clients' goals. She has written a simulated conversation between a counselor and a client. Her SME, Brian, provided this feedback after reviewing the prototype.

The conversation overall does a good job giving plausible choices for questions and showing realistic responses from the client. I want this to be really exciting for learners, like a game. Let’s add a timer for each decision. That way, they’ll be motivated to answer quickly and keep pushing through the scenario.

What do you think? How should Nicole respond to Brian?

A. That’s a great idea! I think that will enhance the learner experience.

B. I’m not sure. Let me do some research.

C. Timing might not be the best form of feedback for this particular course.

Remember your answer; we’ll come back to this question at the end of the post.


When Time is Effective

Time can be a very effective consequence in some learning situations. Check out the Lifesaver training on what to do in emergency situations for an example of time used effectively as feedback.

In the first scenario with Jake, you help someone in cardiac arrest. Each question has a 5-second timer, and you are scored on both accuracy and speed.

6/6 Right First Time - Avg Speed 1.32s

Later in the scenario, you simulate performing CPR by pressing two keys on your keyboard in the same rhythm as CPR. While you practice, you see a scale from good to bad showing how close you are to the ideal timing. This lets you adjust your rhythm. After you finish the 30 seconds of simulated CPR, you see a percentage score for your accuracy.

Scale showing good and bad for speed
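Lifesaver hasn't published its scoring logic, but the underlying mechanic is straightforward to sketch. This hypothetical TypeScript version scores keypress intervals against the commonly recommended CPR rate of 100-120 compressions per minute (roughly one compression every 500-600 ms); the function names and scoring formula are my assumptions, not Lifesaver's actual code.

```typescript
// Hypothetical rhythm-scoring sketch, not Lifesaver's actual implementation.
// CPR guidance recommends 100-120 compressions per minute,
// which works out to one compression roughly every 500-600 ms.
const IDEAL_INTERVAL_MS = 545; // near the midpoint of the recommended range

// Score the intervals between keypresses: 1.0 = perfect rhythm, 0 = far off.
function rhythmScore(intervalsMs: number[]): number {
  if (intervalsMs.length === 0) return 0;
  const total = intervalsMs.reduce((sum, interval) => {
    const deviation = Math.abs(interval - IDEAL_INTERVAL_MS) / IDEAL_INTERVAL_MS;
    return sum + Math.max(0, 1 - deviation); // clamp so wild intervals score 0
  }, 0);
  return total / intervalsMs.length;
}

// Example: intervals (ms) recorded between simulated compressions
const intervals = [530, 560, 610, 500, 545];
console.log(`Accuracy: ${(rhythmScore(intervals) * 100).toFixed(0)}%`);
```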

This feedback works in the Lifesaver training because timing really is a critical part of the skill being taught. Speed of response matters in these emergency situations, as does knowing the right rhythm for CPR.

Time can work for other skills too, like manufacturing, making sandwiches in a chain restaurant, or safety training.

When Time is Counterproductive

If the skill you’re practicing and assessing requires critical thinking and careful consideration, measuring time can be counterproductive. For simulated conversations where you want learners to pause and think about their options, it’s better to not use a timer.

You might be thinking, “But in a real conversation, people need to think quickly. Doesn’t that mean we should use timers?” That’s a question about fluency, which requires more practice over time. If your goal is to get people to that point of fluency, you might add a timer, but not for the initial practice. Teach the skill without a timer first, then provide additional practice opportunities to build fluency and speed in the skill.
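One way to implement that progression is to make the timer a property of the practice mode rather than of the activity itself, so the same questions can run untimed for initial practice and timed for later fluency practice. A minimal sketch, with hypothetical names:

```typescript
// Hypothetical sketch: the same question bank serves both modes,
// but only fluency practice adds time pressure.
interface PracticeMode {
  name: "initial" | "fluency";
  timeLimitMs: number | null; // null = untimed
}

const initialPractice: PracticeMode = { name: "initial", timeLimitMs: null };
const fluencyPractice: PracticeMode = { name: "fluency", timeLimitMs: 10_000 };

function presentQuestion(mode: PracticeMode, onTimeout: () => void): void {
  if (mode.timeLimitMs !== null) {
    // Start a countdown only once learners are building speed on an established skill.
    setTimeout(onTimeout, mode.timeLimitMs);
  }
  // ...render the question and choices here
}
```

The point of the design is that time pressure is layered on only after the skill is established, matching the teach-first, build-speed-later sequence described above.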

Do Timers Improve Motivation?

Does a timer increase motivation for getting the right answer? Maybe, if the learners are already motivated prior to starting and time makes sense in the context of the activity. Many games use time as a way to keep players engaged and excited.

I suspect that, in practice, unnecessary timers encourage people to guess randomly, clicking whatever they can click fastest. Time pressure may actually decrease learners' motivation to be truly cognitively engaged with the learning experience. They may be more motivated to click through quickly than to read carefully and understand their decisions.

Accessibility

Timers can create challenges for accessibility. Learners with visual impairments who use a screen reader and keyboard navigation will generally need more time to answer. Learners with mobility impairments may have trouble manipulating a mouse or keyboard quickly. Depending on your audience, adding timers may prevent some learners from succeeding in your elearning courses, even if they could do the real task (like having a conversation) without a problem.

Revisiting the Communication Scenario Example

Think back to Nicole’s scenario at the beginning of this post. She’s teaching communication skills to nutrition counselors, using a simulated conversation. Her SME, Brian, suggested adding a timer.

What do you think? Does a timer seem helpful in this situation?

Probably not. In this training, it's more important for learners to think carefully about their choices and responses than to answer quickly. Feedback like the client's expression, or a scale showing the client's motivation to change their eating behavior, would be more beneficial than feedback on speed.

Your Examples?

Time can work as feedback in learning scenarios, but it should be used sparingly, and only when it is actually relevant to the skill being practiced or assessed.

Do you have any examples of time used successfully as feedback in a scenario? I’d love to see some more samples. Share them in the comments.

 

Book Review: Practice and Feedback for Deeper Learning

Patti Shank’s Practice and Feedback for Deeper Learning is a summary of tactics you can use to create memorable, relevant practice opportunities and provide constructive, beneficial feedback for learners. Everything in the book is backed by research and written to be immediately usable by instructional designers and trainers.

Cover: Practice and Feedback for Deeper Learning

This is the second installment in Patti's "Make It Learnable" series, which is shaping up to be fundamental reading in the field of instructional design. The first book is Write and Organize for Deeper Learning; you can read my review of that book. As with that book, this one gives you a shortcut to what really works based on evidence, without making you wade through complex (and often contradictory) research yourself. Specifically, it's based on training research, not research on K-12 or higher education learners.

Have you ever wondered…?

  • How do we create practice activities that will help transfer skills to the workplace?
  • How can we create practice activities that are more memorable?
  • How can we create more effective feedback than just “correct” and “incorrect”?
  • Do novice and experienced learners benefit from the same strategies?
  • How do we make sure learners are practicing the right skills and behaviors?
  • How can we help learners deal with errors and mistakes?
  • If we’re training a complex task, should we divide the task into small parts or train a simple version of the whole task?
  • Is it better to give feedback right away or to delay it?
  • What kinds of realism are important to training practice? Is it necessary to use lots of multimedia to make training look exactly like the work environment?
  • Is it better to set goals for specific performance levels or goals for making progress in learning?

All of these questions are addressed in this book through 5 overall strategies divided into 26 tactics.

Go buy Practice and Feedback for Deeper Learning now. Read it, and then pick something relevant to apply to your own work. After all, the best way to improve your own learning design is to practice using these tactics yourself.