Found 79 bookmarks
The politeness effect: Pedagogical agents and learning outcomes
Using polite language in elearning improves learning outcomes.
The polite version yielded better learning outcomes, and the effect was amplified in learners who expressed a preference for indirect feedback, who had less computer experience, and who lacked engineering backgrounds. These results confirm the hypothesis that learners tend to respond to pedagogical agents as social actors, and suggest that research should focus less on the media in which agents are realized, and place more emphasis on the agent's social intelligence.
·sciencedirect.com·
Brain Science: Enable Your Brain to Remember Almost Everything | Learning Solutions Magazine
Use memory boosters to reduce how much people forget after training.
So how often should information be boostered? We recommend that you send boosters out in three phases. You can keep this in mind by remembering 2+2+2. Send out boosters after two days, two weeks, and two months.
This first set of boosters should be “recognition boosters.” The strategy here is just to get people to try to recognize the right answer from a list of options.
The second phase of boosters should be sent about two weeks after the training and at this time you should send out “generative boosters.” In a generative booster, the learner does not just recognize the right answer from a list. Instead, they have to think about the topic and then create an answer out of their head.
The third phase of boosters should be sent about two months after the training, and at this time you should send out “integrative boosters.” An integrative booster again prompts the learner to retrieve the information, but this question specifically asks them to provide concrete examples of how they have made use of this information in their job.
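A minimal sketch of what the 2+2+2 schedule might look like as a reminder script, assuming boosters are sent relative to the training completion date; the phase names and offsets come from the article, while the function and variable names are just illustrative.
from datetime import date, timedelta

# 2+2+2: recognition boosters after 2 days, generative after 2 weeks,
# integrative after 2 months (approximated here as 60 days).
BOOSTER_PHASES = [
    ("recognition", timedelta(days=2)),
    ("generative", timedelta(weeks=2)),
    ("integrative", timedelta(days=60)),
]

def booster_schedule(training_completed: date) -> list[tuple[str, date]]:
    """Return (phase, send date) pairs for a given training completion date."""
    return [(phase, training_completed + offset) for phase, offset in BOOSTER_PHASES]

# Example: training completed March 1, 2024
for phase, send_on in booster_schedule(date(2024, 3, 1)):
    print(f"Send {phase} booster on {send_on.isoformat()}")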
·learningsolutionsmag.com·
Learning Technology Mystery Series Presents “The Case of the Disengaged Learner” with Cara North - The Training, Learning, and Development Community
Cara North's recorded presentation on engagement in learning. Engagement can be cognitive, behavioral, or emotional. Additional resources at go.osu.edu/disengaged
·tldc.us·
Understanding Attention and eLearning: A Primer on the Science of Eye-Tracking - ArcheMedX
I asked in Julie Dirksen's Facebook group if there was any eye tracking research specific to elearning. I've read research related to general web reading and usability, but I wondered whether attention differs when people are deliberately and consciously reading to learn. Brian McGowan helpfully pulled together this list of resources as a starting point for research.
·archemedx.com·
What Do You Know: About Brain Science and Adult Learning
When people claim they are designing learning based on "neuroscience" or "brain science," be skeptical. Sometimes it's real cognitive psychology research mislabeled as neuroscience. Sometimes it's fake research.
Cognitive science has to do with the mind and mental processes, such as thinking, learning, and problem solving at the human (or other organism) level. Neuroscience has to do with the biology of the nervous system, including how the brain works, at the anatomical level, such as neurons.
Bottom line: When you hear claims about “neuro” or “brain” related to training, you should ask: Is it cognitive science or is it made up?
·td.org·
Accelerating Expertise with Scenario-Based e-Learning - The Watercooler Newsletter : The Watercooler Newsletter
Ruth Clark on how scenario-based elearning accelerates expertise and when to use it
What is Scenario-Based e-Learning?
A. The learner assumes the role of an actor responding to a job-realistic situation.
B. The learning environment is preplanned.
C. Learning is inductive rather than instructive.
D. The instruction is guided.
E. Scenario lessons incorporate instructional resources.
F. The goal is to accelerate workplace expertise.
As you consider incorporating scenario-based e-Learning into your instructional mix, consider whether the acceleration of expertise will give you a return on investment. For example, interviews with subject matter experts indicated that automotive technicians must complete about 100 work orders to reach a reasonable competency level in any given troubleshooting domain. Comparing delivery alternatives, OJT would require around 200+ hours, instructor-led training would require around 100 hours, and scenario-based e-Learning simulations would require approximately 33–66 hours.
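A minimal sketch of the back-of-the-envelope ROI comparison this suggests, using the hour estimates quoted above; the hourly cost figure is a hypothetical placeholder, not from the article.
# Hours-to-competency estimates from the excerpt above; cost per hour is hypothetical.
HOURS_TO_COMPETENCY = {
    "on-the-job training": 200,        # "around 200+ hours"
    "instructor-led training": 100,    # "around 100 hours"
    "scenario-based e-learning": 66,   # upper end of the 33-66 hour estimate
}
COST_PER_HOUR = 50  # placeholder loaded cost of a technician's time, in dollars

baseline = HOURS_TO_COMPETENCY["on-the-job training"]
for method, hours in HOURS_TO_COMPETENCY.items():
    saved_hours = baseline - hours
    print(f"{method}: {hours} hours; saves ${saved_hours * COST_PER_HOUR:,} per technician vs. OJT")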
Finally, many learners find scenario-based e-Learning more motivating than traditional instructional formats. Solving a work-related problem makes the instruction immediately relevant.
·watercoolernewsletter.com·
Secrets of Star Training Consultants | Training Magazine
Preliminary findings from Saul Carliner and John Murray's research and interviews with "star consultants" in the field of learning
Participants also indicated the types of assignment they feel are inappropriate for them. Most of the assignments refused could be characterized as “conventional.” Several participants specifically mentioned that they distance themselves from training about products and software to focus on more strategic projects.
One participant avoids “order-taker projects.”
·trainingmag.com·
The Top 20 Most Popular LMS Software Solutions powered by Capterra
Capterra's analysis of top LMSs by customers, users, and social media popularity. Many people only review 2-3 LMSs before making a decision. This list gives people some additional choices to review while still being a manageable list. The explanation of their research is linked below the infographic.
·capterra.com·
eLearning Guild Research: Gender Issues in Pay, or What You Don't Know Does Hurt You by Patti Shank : Learning Solutions Magazine
Patti Shank on the gender gap in e-learning pay (almost 10% lower on average). Educate yourself and do a better job negotiating your own salary, at least as one way to address the issue.
·learningsolutionsmag.com·
Optimal Video Length for Student Engagement | edX
In edX courses, about 6 minutes is the maximum length students will watch. In traditional online graduate courses for credit, the length could be longer, but this is a good reminder to keep things short.
The optimal video length is 6 minutes or shorter -- students watched most of the way through these short videos. In fact, the average engagement time of any video maxes out at 6 minutes, regardless of its length. And engagement times decrease as videos lengthen: For instance, on average students spent around 3 minutes on videos that are longer than 12 minutes, which means that they engaged with less than a quarter of the content.
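A minimal sketch of how you might apply the 6-minute guideline when reviewing a course outline; the video list and threshold constant here are purely illustrative, not part of the edX data.
MAX_ENGAGEMENT_MINUTES = 6  # engagement ceiling reported in the edX analysis

course_videos = [  # hypothetical course outline: (title, length in minutes)
    ("Welcome and objectives", 3.5),
    ("Core concept walkthrough", 14.0),
    ("Worked example", 5.0),
]

for title, minutes in course_videos:
    if minutes > MAX_ENGAGEMENT_MINUTES:
        print(f"Consider splitting '{title}' ({minutes:g} min) into shorter segments.")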
·blog.edx.org·
Will at Work Learning: Case Question -- Concept Mapping, Question Answering, Multiple Sessions
Research on the effectiveness of concept mapping, answering retrieval questions, and reading in multiple sessions. I like the presentation of this in a scenario where you are asked to predict the results of research rather than simply summarizing the study.
·willatworklearning.com·
Games Teach! | Kapp Notes
Karl Kapp responds to Ruth Clark's claim that "games don't teach" and Richard Clark's claim that no research supports gaming with a review of the research and what it actually does and doesn't tell us.
Instructional games seem to foster higher-order thinking such as planning and reasoning more than factual or verbal knowledge.
Specifically, learning from simulation games was maximized when trainees actively rather than passively learned work-related competencies during game play, trainees could choose to play as many times as desired, and simulation games were embedded in an instructional program rather than serving as stand-alone instruction.
Challenge, interactivity and continual feedback can be applied to a classroom exercise, a paper and pencil activity, or an online activity. The design is universal while the delivery vehicle can change. It is not technology that makes a game a game—it’s the design, the inclusion of a challenge and interactivity that make a game a game.
·karlkapp.com·
Reset? - Games In Learning | EPPIC - Pursuing Performance
Guy Wallace makes some ad hominem attacks against me for my criticism of Ruth Clark's claim that "games don't teach" (although he doesn't mention me by name or link to me, it's pretty clear that he is talking about my post). Once you get past the part where he says that Clark has made so many contributions to the field that it's not fair to attack her, especially if you're someone like me who isn't a "star," there are some valid points. He's correct that "popularity is not evidence" and that games can be more expensive than other solutions that might be just as effective.
·eppic.biz·
Why Games Don't Teach
Ruth Clark claims that "games don't teach," an obviously false statement. She has some legitimate points about matching the game design to the learning outcomes, but her claim that no research supports using games for anything other than "drill and practice" type activities is clearly incorrect. She makes this claim without addressing any work by Squire, Aldrich, etc., so it appears she didn't do a literature review prior to writing.

She cites one study with two games that were less effective at helping learners remember, and she believes that discounts the dozens of other studies on the topic. First, maybe those games were poorly designed. Second, if you're just measuring "transfer and retention" rather than application, I wouldn't be surprised if a game didn't do as well. Games are often better at moving from recall to application--but of course, she didn't measure application.

The goal of the research was to compare learning efficiency and effectiveness from a narrative game to a slide presentation of the content. Students who played the Crystal Island game learned less and rated the lesson more difficult than students who viewed a slide presentation without any game narrative or hands on activities. Results were similar with the Cache 17 game. The authors conclude that their findings “show that the two well-designed narrative discovery games…were less effective than corresponding slideshows in promoting learning outcomes based on transfer and retention of the games’ academic content” (p. 246).
Often the features of a game are at counter-purposes to the learning objectives. For example, many games incorporate an onscreen clock requiring the learner to achieve the goal in seconds or minutes. For learning outcomes that are based on understanding and critical thinking, games with time goals that reinforce fast responses are a poor match.
Despite the uncontested popularity of commercial games and a lot of hype in the training community, the reality is that there is scarce credible evidence on how and when to best use games to improve instructional outcomes and motivation. At this stage, I recommend games to implement drill and practice exercises for tasks that require immediate and accurate responses.
·astd.org·
Animated vs. Static Learning Agents - My M.Ed. Capstone Research | onehundredfortywords
Judy Unrein compared animated and static learning agents and found no difference in learning between them. Learning agents have value, but this research suggests no extra value from more expensive and time-consuming animation.
·onehundredfortywords.com·
The Human Factor: How Gender Differences Matter in Software Training by Mary Arnold : Learning Solutions Magazine
If your software training includes time to explore or "tinker," men and women will have different rates of success. A strategic approach may be better than going through individual features. This research focused on adding new features with an audience that was already familiar with the software; I'm not sure the same training technique would work with beginners learning an application.
Tinkering with the spreadsheets seems to be a reasonable approach to working with a new problem, in line with generating and testing alternative strategies to find a solution. In other words, learning. Women who tinkered with the spreadsheets seemed to be doing just that, and, for them, tinkering predicted more effective problem solving. Counter-intuitively, though, when men tinkered with the spreadsheet, they were less effective in correcting the errors. The opposite results seem attributable to the fact that women paused before trying something else, long enough to process the information.
In the final experiment, researchers provided a different kind of tutorial — one that emphasized a strategic, rather than a feature-by-feature approach to the problem.
Women who participated in this condition were almost as likely to use the new features as the men in the same study, and were able to solve more problems more quickly than women who didn’t use the new features. Men in this condition were not significantly helped or hindered, which means that it’s possible to prevent a bias against women without introducing a bias against men.
·learningsolutionsmag.com·
The impact of instructional elements in computer-based instruction_July2007.pdf
Study examining what happens when you remove common elements of instruction. Practice with feedback was critical; information, objectives, examples, and review made little difference.

"This study investigated the effects of several elements of instruction (objectives, information, practice, examples and review) when they were combined in a systematic manner." "Results indicated participants who used one of the four versions of the computer program that included practice performed significantly better on the posttest and had consistently more positive attitudes than those who did not receive practice."

·florencemartin.net·
People like virtual instructors that look, act like them
Learners like avatars that match their own gender and ethnicity, but they also like avatars that give feedback the way they prefer: comparing against others or comparing against their own past scores. However, learning didn't always improve just because learners liked the avatar better.
Although they may seem horribly fake, past research has suggested that we react to them in the same ways we react to a real person: studies have suggested that we tend to be more comfortable when the virtual personality shares our gender and ethnic background, just as we are when we work with living humans. Now, a new study on virtual training instructors extends that to show that people work best with virtual systems that measure progress the same way that they do.
·arstechnica.com·
How Much Narration in eLearning? Our Lessons Learned by Don Bair & Mike Dickinson : Learning Solutions Magazine
Two IDs look at the use of audio narration--how much, quality of speakers, quality of equipment. Includes guidelines based on their survey of employees. I wish they had more info about the survey they conducted, though (e.g., how many responses they received, how many total employees at the company).
Here are the guidelines we have adopted as a result of this study:
1. [How much?] We will use audio only when instructionally necessary.
2. [Control] We will make sure students have the ability to turn the sound on and off, and that they know how to do so.
3. [Who?] We will continue to use in-house talent, but other than credits at the end, we will not identify the narrator unless his or her name or title is pertinent for the instruction, e.g., having the Compliance Officer introduce a compliance course. This will prevent having to re-narrate when someone changes position or leaves the company. We may audition to get more suitable voices.
4. [Quality] We only need a slightly higher-quality microphone along with a pop filter to raise our technical quality to the practical limit. We also identified a storage room that will double as our sound studio with the use of inexpensive draperies. This location should improve our ability to splice in updates without sounding noticeably different from the original.
5. We will continue to have learners evaluate the use and quality of our narration and make adjustments accordingly.
Only 12% said they prefer professional voice talent. A full 85% said the voice only needs to sound good enough to get the point across without having to strain to understand it. Nearly 60% of our employees said “no preference” as long as the voice isn’t irritating to listen to. 40% prefer that the narrator be someone they recognize (i.e., a well-known manager, process owner, or SME). A surprising 9% said the narration could be computer-generated as long as it didn’t sound too robot-like.
We wanted to know the preferences of our employees so we conducted a survey. They almost unanimously said that 1) they do not want the entire course to be narrated, 2) they do not want text on the screen read to them word for word, and 3) about two-thirds of the employees want to be able to turn the narration on or off.
·learningsolutionsmag.com·
Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial - Kerfoot - 2006 - Medical Education - Wiley Online Library
Research summary on spaced education for medical students. The e-learning included emailed scenarios and questions. The summary and conclusion talk about medical knowledge, but since this is about scenarios it seems like there might be some decision-making skills being reinforced here too.
Conclusion: Spaced education consisting of clinical scenarios and questions distributed weekly via e-mail can significantly improve students' retention of medical knowledge.
·onlinelibrary.wiley.com·