Found 36 bookmarks
(PDF) Storifying Instructional Videos on Online Credibility Evaluation: Examining Engagement and Learning

This study examined whether adding story elements to an instructional video affects motivation, emotional engagement, and learning. The researchers found no difference between a well-produced instructional video and a storified instructional video.

However, the storified video feels very artificial to me. This isn't a story about a relevant character the learners can identify with who uses the concepts in realistic situations (or even slightly exaggerated ones). This is about a fake detective agency. I'd be cautious about assuming this research applies to realistic stories as well.

·researchgate.net·
Masters' Q&A - Clark Quinn
Clark Quinn answers 5 questions on simulations for training leaders and how they help provide practice opportunities. I appreciate the clarification on what he means by a simulation and the differentiation between "simulation" and "branching scenario," as those terms are often muddy.
Scenarios can be implemented in actual simulations (where the world is actively modeled, and the consequences are calculated), or in branching scenarios where the relationships are hard-coded in the consequences attached to a decision.
Scenarios give us contextualized practice, which research shows leads to better retention and transfer. With the right choices, the scenario is engaging and provides meaningful practice, which leads to acquiring new abilities.
<p><strong>DAN: What makes learning through a simulation experience unique and meaningful?</strong></p> <p><strong>CLARK</strong>: As above, research says that contextualized practice (with feedback) is the best way to develop new abilities. They need to have a ’story’ setting: a context, then something happens that precipitates the need for a decision, and then the decision has consequences. That’s just a better-written multiple-choice question (please!), but if we drive the outcomes from a branching or simulation basis, this can lead to new decisions (they travel in packs!)</p>
·topfbusinesslearning.com·
Two Questions That Should Be Influencing Every L&D Team's Strategy
"Does learning science play a role in our work, and, just how much of an impact does it have on our profession?"
Does learning science play a role in our work, and, just how <em>much</em> of an impact does it have on our profession?
We must be able to see the gaps in our instructional design, which learning science helps to <a href="https://blog.learnlets.com/2020/08/the-case-for-learning-science/">support</a>. It also gives us a basis to infer how to use new technologies. If we want to avoid doing slide presentations, we have to know what cognitive (and emotional) advantages these technologies have so we can leverage them for success.
However, I’ve argued strongly that what’s most critical is the ability to make better decisions. The decisions we make determine our success. We make better decisions <em>about</em> learning when we know how learning works. It may not be all the time, but they will be the most impactful decisions when leveraging evidence-based approaches to our work. If we don’t have the foundation for learning-grounded decisions, the important ones will not be made on a solid basis.
Regardless of its latest terminology, we need to be able to see past the hype and be able to evaluate the claims.
·linkedin.com·
Best from the Brightest: Key Ideas and Insights for L&P Professionals - TiER1 Performance
48 learning and performance leaders share their favorite content shared in 2021. Use this list to find both new sources to read and new people to follow. Many of the participants also shared trends to watch in 2022, other recommended content, and additional people to follow. This list is heavy on evidence-informed design.
·tier1performance.com·
Learning Objectives: GOAL!?! – 3-Star learning experiences
Summary of research on the value of telling learners the objectives at the beginning of training. The research supports giving learners specific "focusing objectives" to help them recognize what's important. However, that doesn't mean those objectives need to be the same formal learning objectives we use as IDs. In fact, using objectives as multiple choice questions to show people what they don't know yet may be effective.
As instructors and designers, we need to keep in mind that there can be other reasons to use objectives <em>and </em>we need to clearly distinguish between objectives that we use as instructional/learning designers versus the ones we might use for learners.
·3starlearningexperiences.wordpress.com·
Three Answer Options Are All You Need on Multiple-Choice Tests!
While we are used to providing 4 options in multiple choice questions, using 3 is just as effective. Writing good distractors is the hardest part of writing multiple choice questions; if you only have to write 2 distractors instead of 3, you can create questions faster. While it's not mentioned in this post, reducing the number of options also immensely reduces the complexity of branching scenarios.
So here’s the main finding: <u>no significant differences were found in terms of item difficulty</u>. There were also <u>no differences found in terms of test reliability</u>. Thus, Baghaei and Amrahi (2011) concluded that three answer options are all you need. If the test characteristics are essentially the same, there doesn’t seem to be any reason to spend our time developing additional answer options.
Rodriguez (2005) argues that shifting to three answer options also increases the amount of content that can be tested. Because students don’t have to spend as much time reading four or five answer options, there will be more time during the test for students to read additional questions on different course content. Instead of spending your time on identifying more answer options, spend your time developing additional test questions.
·blog.cengage.com·
How Much Do People Forget? – Work-Learning Research
This is the link I send people to debunk the blanket claims about "people forget X% after Y time." The reality is that how much people forget depends on who your audience is, what they're learning, and how you train them.
The amount a learner will forget varies depending on many things. We as learning professionals will be more effective if we make decisions based on a deep understanding of how to minimize forgetting and enhance remembering.
To be specific, when we hear statements like, “People will forget 60% of what they learned within 7 days,” we should ignore such advice and instead reflect on our own superiority and good looks until we are decidedly pleased with ourselves.
Many of the experiments reviewed in this report showed clearly that learning methods matter. For example, in the Bahrick 1979 study, the best learning methods produced an average forgetting score of -29% forgetting, whereas the worst learning methods produced forgetting at 47%, a swing of 76% points.
·worklearning.com·
Secrets of Star Training Consultants | Training Magazine
Preliminary findings from Saul Carliner and John Murray's research and interviews with "star consultants" in the field of learning.
<p>Participants also indicated the types of assignment they feel are inappropriate for them. Most of the assignments refused could be characterized as “conventional.” Several participants specifically mentioned that they distance themselves from training about products and software to focus on more strategic projects.</p> <p>One participant avoids “order-taker projects.” </p>
·trainingmag.com·
Don't fall for these adult learning myths
"How to be a learning mythbuster" from Cathy Moore. Part of this is the broader problem that most people are lousy at understanding research and verifying sources. This isn't exclusive to the learning profession. We should be better about avoiding the myths in our own field though.
We work in organizations that believe harmful myths. We’re pressured to work as if the myths are true, and we can’t or don’t take the time we need to keep our knowledge up to date and combat the myths.
·blog.cathy-moore.com·
The Effects of Feedback Interventions on Performance: A Historical Review, a Meta-Analysis, and a Preliminary Feedback Intervention Theory (Kluger & DeNisi, 1996) | Reading for Pleasure
Research on the effects of feedback interventions. Feedback is not always beneficial for learning; in some cases, it can actually depress performance.
<p>The MCPL literature suggests that for an FI to directly improve learning, rather than motivate learning, it has to help the recipient to <em>reject erroneous hypotheses.</em> Whereas correcting errors is a feature of some types of FI messages, most types of FI messages do not contain such information and therefore should not improve learning—a claim consistent with CAI research.</p> <p>Moreover, even in learning situations where performance seems to benefit from FIs, learning through <em>FIs may be inferior to learning through discovery</em> (learning based on feedback from the task, rather than on feedback from an external agent). Task feedback may force the participant to learn task rules and recognize errors (e.g., Frese &amp; Zapf, 1994), whereas FI may lead the participant to learn how to use the FI as a crutch, while shortcutting the need for task learning (cf. J. R. Anderson, 1987). </p>
In the MCPL literature, several reviewers doubt whether FIs have any learning value (Balzer et al., 1989; Brehmer, 1980) and suggest alternatives to FI for increasing learning, such as providing the learner with more task information (Balzer et al., 1989). Another alternative to an FI is designing work or learning environments that encourage trial and error, thus maximizing learning from task feedback without a direct intervention (Frese &amp; Zapf, 1994).
·dixieching.wordpress.com·
Optimal Video Length for Student Engagement | edX
In edX courses, about 6 minutes is the maximum length students will watch. In traditional online graduate courses for credit, the length could be longer, but this is a good reminder to keep things short.
The optimal video length is 6 minutes or shorter -- students watched most of the way through these short videos. In fact, the average engagement time of any video maxes out at 6 minutes, regardless of its length. And engagement times decrease as videos lengthen: For instance, on average students spent around 3 minutes on videos that are longer than 12 minutes, which means that they engaged with less than a quarter of the content.
·blog.edx.org·
Will at Work Learning: Case Question -- Concept Mapping, Question Answering, Multiple Sessions
Research on the effectiveness of concept mapping, answering retrieval questions, and reading in multiple sessions. I like the presentation of this in a scenario where you are asked to predict the results of research rather than simply summarizing the study.
·willatworklearning.com·
elearn Magazine: Why Is the Research on Learning Styles Still Being Dismissed by Some Learning Leaders and Practitioners?
Comments from a number of experts dismissing learning styles, plus discussion on why we still talk about learning styles even though the research doesn't support it
·elearnmag.acm.org·
Animated vs. Static Learning Agents - My M.Ed. Capstone Research | onehundredfortywords
Judy Unrein researched animated and static learning agents and found no difference between them. Learning agents have value, but this research suggests no extra value from the more expensive and time-consuming animation.
·onehundredfortywords.com·
The impact of instructional elements in computer-based instruction_July2007.pdf

Study examining what happens when you remove common elements of instruction. Practice with feedback was critical; information, objectives, examples, and review made little difference.

"This study investigated the effects of several elements of instruction (objectives, information, practice, examples and review) when they were combined in a systematic manner." "Results indicated participants who used one of the four versions of the computer program that included practice performed significantly better on the posttest and had consistently more positive attitudes than those who did not receive practice."

·florencemartin.net·
Cognitive Load Theory: Failure? « EdTechDev
Explanation of cognitive load theory and the problems with it, both conceptual and methodological. Lots of sources to dig into deeper if you want more research on this issue.
Numerous contradictions of cognitive load theory’s predictions have been found, but with germane cognitive load, they can still be explained away. de Jong does not use this term (unfalsifiable) but instead states that germane cognitive load is a <em>post-hoc</em> explanation with no theoretical basis: “there seems to be no grounds for asserting that processes that lead to (correct) schema acquisition will impose a higher cognitive load than learning processes that do not lead to (correct) schemas” (2009).
<span style="text-decoration: underline;">Poor external validity of lab-based studies</span>. Moreno doesn’t touch on something in the de Jong article – the fact that most cognitive load (and multimedia learning) studies are conducted in labs that “includes participants who have no specific interest in learning the domain involved and who are also given a very short study time” (de Jong, 2009), often only a few minutes. Quite a number of findings from these studies have not held up as strongly when tested in classrooms or real-world scenarios, or have even reversed (<a href="http://www.txwes.edu/professionaldevelopment/materials/Multimedia%20Instructions.pdf">such as the modality effect</a>, but see <a href="http://dx.doi.org/10.1016/j.learninstruc.2007.09.010">this refutation</a> and this <a href="http://cat.inist.fr/?aModele=afficheN&amp;cpsidt=5135499">other example of a reverse effect</a>).
·edtechdev.wordpress.com·
Open Access Educational Technology journals – George Veletsianos
Looking for research on e-learning, instructional design, educational technology, or related topics? Check out these open access journals. Great to have a filtered list for this rather than having to dig through some of the larger directories.
·veletsianos.com·
Will at Work Learning: New Research Report on Using Culturally, Linguistically, and Situationally Relevant Scenarios
Research on how to support learning with scenarios that are relevant to the specific situation. Even though this is explicitly about workplace training, the major recommendations could be adapted for instructional design in education contexts too.
Utilize decision-making scenarios. Consider using them not just in a minor role—for example at the end of a section—but integrated into the main narrative of your learning design.
Determine the most important points you want to get across AND the most important situations in which these points are critical. Then, provide extra repetitions spaced over time on these key points and situations.
·willatworklearning.com·
Whatever You Do, Don’t Drop Practice | Tom Werner

Summary of research which compared courses with the same content but with specific elements of Gagne's instructional events removed. The strongest correlation with student performance and satisfaction was with practice with feedback. (This is an old post, but it's moved since I originally bookmarked it.)

The only instructional element that really matters is practice with feedback.
·brandon-hall.com·
How to get an Instructional Design education without paying tuition | effectivedesign.org

A reading list for instructional designers, especially those of us doing the "informal masters" on our own rather than enrolling. More than just instructional design, this list includes project management, psychology of learning, and other topics.

Related link: http://www.dctrcurry.com/2008/02/immediately-accessible-instructional.html

·dctrcurry.blogspot.com·