Do Learning Styles Really Work?

Does tailoring your content for different learning styles really work? Maybe there’s a better use of your time and resources.

View the presentation for more.


Debunker Club Works to Dispel the Corrupted Cone of Learning

A new group called The Debunker Club is working to dispel myths and misinformation in the learning field. From their website:

The Debunker Club is an experiment in professional responsibility. Anyone who’s interested may join as long as they agree to the following:

  1. I would like to see less misinformation in the learning field.
  2. I will invest some of my time in learning and seeking the truth, from sources like peer-reviewed scientific research or translations of that research.
  3. I will politely, but actively, provide feedback to those who transmit misinformation.
  4. At least once a year, I will seek out providers of misinformation and provide them with polite feedback, asking them to stop transmitting their misinformation.
  5. I will be open to counter feedback, listening to understand opposing viewpoints. I will provide counter-evidence and argument when warranted.

This year, coinciding with April Fools’ Day 2015, the Debunker Club is running an experiment. We’re making a concerted effort to contact people who have shared the Cone of Experience (also known as the Cone of Learning or the Pyramid of Learning).

Many iterations of this cone exist. A Google image search for “cone of learning” returns dozens of results, most of which include fabricated numbers. If you’ve seen an image claiming that “people remember 10% of what they read, 20% of what they hear, 30% of what they see,” and so on, you’ve seen a variation on this theme.

Image search results for Cone of Learning

The original cone was developed by Edgar Dale and didn’t include any numbers. The later versions are the “corrupted cone,” with fictitious statistics added. Will Thalheimer’s 2009 post debunking these claims is where I learned the numbers were bogus. Common sense might give you a hint that these numbers aren’t really based on research, though. Think about it: how many times have you seen research where all the categories break into even 10% increments?

As part of the Debunker Club’s efforts, I discovered a post on Dane’s Education Blog called A Hierarchy of Learning. Although the post cites a great debunking article (Tales of the Undead…Learning Theories: The Learning Pyramid), the blog author says only that he “appreciate[s] what it conveys.”

I left the following comment on his Learning Pyramid post.

Thanks for collecting so many resources on your blog. I can see that you’ve worked really hard to share many links and ideas with your readers.

However, the information above, though it may appear to have scientific support, has been exhaustively researched and found to have no basis in science. In fact, the “Tales of the Undead” link you cite debunks it.

An article in the scientific journal Educational Technology found no research backing for the information. (Subramony, D., Molenda, M., Betrus, A., & Thalheimer, W. (2014). The Mythical Retention Chart and the Corruption of Dale’s Cone of Experience. Educational Technology, 54(6), 6-16.)

The information presented is likely to produce more harm than good, promoting poor learning designs and hurting learners.

While we might abstract some beneficial notions from the percentages portrayed in the misleading information (namely, that encouraging realistic practice has benefits), there are numerous faulty concepts within the bogus percentages that can do real harm. For example, by leading people to think that there are benefits to seeing over hearing, or hearing over reading, we send completely wrong messages about how learning works.

Most importantly, learning science has advanced considerably over the last two decades. The misleading information was first reported in 1914, with no research backing then or since. It’s better to follow recent, evidence-based findings than information that has no scientific basis. See, for example, the book Make It Stick by Brown, Roediger, and McDaniel. Julie Dirksen’s Design for How People Learn is another great selection.

I’m part of a larger community of folks called the Debunker Club who are attempting to encourage the use of proven, scientifically based practices in the learning field.

I’m going to be posting about this misleading information on my blog, and I hope you’ll comment there if you wish. I (and the debunker community in general) want to learn how other people feel about these issues and about our approach to debunking myths and sharing evidence.

If you’re interested in dispelling misinformation and improving the learning field, please join the Debunker Club and participate in the conversation.

My History of Live Blogged Notes

When I attend webinars or participate in online courses and conferences, I usually live blog my notes. That helps me remember what I attended and what I learned, and it lets me share that knowledge with others. In a recent discussion about how I learned about instructional design without getting a master’s degree, someone asked me what courses and webinars I’ve attended. Because I have done so much live blogging, I was able to provide proof of my ongoing professional development efforts. These posts go back to 2007, so some of the content and references are dated. Generally, newer posts are at the top of each category.


Storytelling and Scenario-Based Learning

Synchronous Learning

Attention and Motivation

Trends and Future Predictions

Games and Simulations

LMSs and Other Tools

Learning Communities

Other Topics

Image credit: Matthew Bowden [Attribution], via Wikimedia Commons

ID and e-Learning Links (2/2/14)

  • Gavin Henrick on Moodle Repositories, including tables comparing features

    tags: moodle

  • Example of a branching scenario activity with a coach and a meter showing progress on every screen, built in ZebraZapps

    tags: scenarios e-learning zebrazapps branching

  • Research on feedback’s effect on performance. Feedback is generally helpful but can be detrimental, especially for more complex tasks. Goal setting can help mitigate the risks of feedback interventions.

    tags: feedback research learning

  • Research on the effects of feedback interventions. Feedback is not always beneficial for learning; in some cases, it can actually depress performance.

    tags: feedback instructionaldesign learning research

    • The MCPL literature suggests that for an FI to directly improve learning, rather than motivate learning, it has to help the recipient to reject erroneous hypotheses. Whereas correcting errors is a feature of some types of FI messages, most types of FI messages do not contain such information and therefore should not improve learning—a claim consistent with CAI research.

      Moreover, even in learning situations where performance seems to benefit from FIs, learning through FIs may be inferior to learning through discovery (learning based on feedback from the task, rather than on feedback from an external agent). Task feedback may force the participant to learn task rules and recognize errors (e.g., Frese & Zapf, 1994), whereas FI may lead the participant to learn how to use the FI as a crutch, while shortcutting the need for task learning (cf. J. R. Anderson, 1987).

    • In the MCPL literature, several reviewers doubt whether FIs have any learning value (Balzer et al., 1989; Brehmer, 1980) and suggest alternatives to FI for increasing learning, such as providing the learner with more task information (Balzer et al., 1989). Another alternative to an FI is designing work or learning environments that encourage trial and error, thus maximizing learning from task feedback without a direct intervention (Frese & Zapf, 1994).
  • Female voice-over and audio editing for e-learning. Demos are on the website. She has done dialogue for more conversational courses in the past, although that demo isn’t on her public website.

    tags: voiceover audio

  • Calculator for pricing custom e-learning based on a number of factors (graphics and multimedia, interactivity, instructional design)

    tags: e-learning pricing

  • This isn’t really about SCORM, but a question about pricing e-learning courses as perpetual licenses rather than annual per-user fees (see the worked example after this list)

    tags: e-learning pricing

    • Most perpetual license deals I’ve seen in the eLearning space are usually priced at 3.5x the annual user price plus another 10-15% of the contract value for course maintenance and support.
  • Example of why experts often can’t teach well (or write good courses without an ID), based on research on NICU nurses who knew how to recognize infections but were such experts that their knowledge had become automatic and intuitive.

    tags: sme training research

    • When you’re a domain expert in your field, it’s difficult to step back and remember what it was like to be a beginner. Once we have knowledge, it’s very hard to remember what life was like without it.

      Instead of placing the burden of training on a subject-matter expert, it’s often more effective to establish a collaboration between subject-matter experts and trainers who are experts in breaking down information, recognizing the critical elements, and putting it back together in a way that’s digestible for people who aren’t experts.
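
To make that perpetual-license rule of thumb concrete, here’s a purely hypothetical worked example (the $10,000 figure is invented, and I’m assuming “contract value” refers to the perpetual license price): a course licensed at $10,000 per year would cost about 3.5 × $10,000 = $35,000 as a perpetual license, plus roughly $3,500 to $5,250 per year (10-15% of $35,000) for ongoing maintenance and support.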

Posted from Diigo. The rest of my favorite links are here.

Research in Gamification of Learning and Instruction

Last week, I posted a rebuttal to Ruth Clark’s claim that “Games Don’t Teach.” In that post, I shared several links to research about the effectiveness of games for learning. If you are interested in a more in-depth review of research, Karl Kapp’s new book The Gamification of Learning and Instruction has an entire chapter titled “Research Says…Games are Effective for Learning.” This chapter focuses on two areas of the research: meta-analysis studies and research on specific elements of games.


The meta-analysis section has a useful table providing a quick summary of the major findings of each meta-analysis reviewed. Here are a few points from that research:

  • “Game-based approach produced significant knowledge-level increases over the conventional case-based teaching methods.” (Wolfe, 1997)
  • “An instructional game will only be effective if it is designed to meet specific instructional objectives and used as it was intended.” (Hays, 2005)

In the elements of games section, Karl summarizes several individual studies and their findings in the following areas:

  • Reward structures
  • Player motivation (both intrinsic and extrinsic)
  • Avatars
  • Player perspective

Gamification in learning is often viewed superficially as just adding extrinsic motivators like badges and leaderboards. In this book, Karl recommends going beyond that shallow understanding to look at the ways games can be effective and to use those elements to enhance learning.

If you’re interested in more information about the book, check out the other posts in the blog book tour.

References (as cited in The Gamification of Learning and Instruction):

Hays, R. T. (2005). The effectiveness of instructional games: A literature review and discussion (Technical Report 2005-004). Naval Air Warfare Center Training Systems Division.

Wolfe, J. (1997). The effectiveness of business games in strategic management course work. Simulation & Gaming, 28(4), 360-376.