The Future of Workplace Learning

Using Learner Data to Refine Your Learning Design

August 30, 2020

Using Qstream, we were able to refine the learning design of our compliance course based on real-time user data. This was more than a new feature; it was a paradigm shift in learning design. In this episode, discover what we found and how it changed our design.

When we designed the learning experiences of our Qstream pilots at Providence, we knew the benefits could be viewed from several perspectives. From a business perspective, it slashed implementation costs by cutting the training time in half. From a learner perspective, workers told us it was fun to take and that they enjoyed learning in short bursts, which let them quickly get on with their day.

But what I want to share with you here is a third perspective: an instructional design perspective. The learning design team found something we hadn't considered before: we could use real-time user data to refine our learning design. This was more than a new feature; it was a paradigm shift for us.

Let me explain how important this shift was. As an experienced learning professional, I've developed dozens of courses based on the industry framework called ADDIE, which stands for Analyze, Design, Develop, Implement, and Evaluate. ADDIE is a linear development process in which we first

  1. Analyze why a course is being requested and what needs it must meet, then
  2. Design it so it meets the needs of the learners and the business, then
  3. Develop the course using whatever authoring tools we've chosen, then
  4. Implement, or launch, the course, and finally
  5. Evaluate its effectiveness to determine whether it met the objectives.

In this paradigm, we build it, launch it, and see how well it performs. Think of it as being a rocket engineer who has designed and built a rocket and is watching it launch. There you are, on the ground, watching it fly away toward its destination in space. That's the way courses have been launched for years: build them, launch them on the learning platform, and then collect some stats to see whether they performed as expected.

However, designing learning in Qstream allows us to collect real-time data on each user's performance. In the rocket analogy, you're no longer the engineer standing passively on the ground; you're the flight commander in mission control, actively scanning real-time data and making course adjustments as needed.

Case in point: immediately after we launched our Qstream course, we began viewing granular data not only on which questions were most often missed, but also on which incorrect answers were being selected. For us, this meant we could quickly identify that question 12 was causing problems. This particular question was about responding to a verbally aggressive visitor. It stated, “You are trying to help a visitor who suddenly becomes verbally aggressive. To protect yourself, you should stand at least BLANK feet away.” The choices were 3, 4, 6, or 12 feet. As we looked at the data one week after the course was launched, we found that 95% of users chose either 4 or 6 feet, and those results were split almost evenly. In an ADDIE framework, we would have done the Evaluation step months, not a week, after the course was launched, and we wouldn't have explored that level of detail in the first place.
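To make that kind of check concrete, here is a minimal sketch of tallying which answer each learner picked for a question. The response records and labels are made up for illustration; this is not Qstream's actual export format or API.

```python
from collections import Counter

# Hypothetical response records: (question_id, selected_answer).
# The values below are illustrative, not real pilot data.
responses = [
    (12, "4 feet"), (12, "6 feet"), (12, "4 feet"), (12, "6 feet"),
    (12, "3 feet"), (12, "6 feet"), (12, "4 feet"), (12, "4 feet"),
]

def answer_distribution(responses, question_id):
    """Return each answer option's share of responses for one question."""
    picks = [answer for qid, answer in responses if qid == question_id]
    counts = Counter(picks)
    total = len(picks)
    return {answer: count / total for answer, count in counts.items()}

# Print the options from most- to least-chosen to spot a problem question.
dist = answer_distribution(responses, 12)
for answer, share in sorted(dist.items(), key=lambda kv: -kv[1]):
    print(f"{answer}: {share:.0%}")
```

A distribution split almost evenly between two wrong-looking options, as we saw with 4 and 6 feet, is the signal that the question itself, not the learners, may be the problem.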

One person taking the course, named Penni, knew the correct thing to do for question 12. She chose 4 feet and subsequently got the question wrong. On February 25, Penni posted a comment in Qstream for that question stating, “I believe it's supposed to be 4 to 6 feet.” If that were the case, it would be a poorly worded question, because in her mind the answer could be either four or six feet, and she didn't know which one to choose. Her wrong answer was a false negative: she got it wrong even though she knew the content. As learning professionals, we had the opportunity to change the question in real time so the updated answer choices were 3, 6, 8, and 12 feet. That way, if someone recalled 4-6 feet as the correct answer, there would be only ONE choice that fit within that range. In addition, a learning professional on our team posted a reply to Penni the following day stating, “It is my understanding that some training recommends 6 feet instead of 4 feet. Maybe we can clarify this for the next revision.”

In addition, we were able to measure training impact with visual heat maps of proficiency. These allow learning professionals and frontline managers to pinpoint knowledge gaps in a specific area with real-time analytics. The visualizations clearly indicate when the learning has been completed and highlight areas where learners are struggling. Armed with this information, we can develop additional learning experiences to meet those needs. For example, we were able to identify that no additional learning should be developed for emergency codes, because proficiency went from 99% to 100%. However, content on Chemicals should be periodically reinforced, since proficiency jumped from 73% to 91%, a gain of 18 percentage points.
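The reasoning behind those two decisions can be sketched as a simple rule: a topic that started near ceiling needs nothing more, while a large proficiency gain suggests the content was unfamiliar and is worth reinforcing before it fades. The data structure and threshold below are assumptions for illustration, not Qstream output.

```python
# Hypothetical (baseline %, final %) proficiency per topic; the numbers
# echo the examples in the text, but the structure is assumed.
proficiency = {
    "Emergency codes": (99, 100),
    "Chemicals": (73, 91),
}

def reinforcement_plan(proficiency, gain_threshold=10):
    """Flag topics whose large proficiency gains suggest the material
    was new to learners and should be periodically reinforced."""
    plan = {}
    for topic, (before, after) in proficiency.items():
        gain = after - before
        if gain >= gain_threshold:
            plan[topic] = "reinforce periodically"
        else:
            plan[topic] = "no additional learning needed"
    return plan

for topic, action in reinforcement_plan(proficiency).items():
    print(f"{topic}: {action}")
```

The 10-point threshold is arbitrary; in practice a team would set it from its own baseline data and tolerance for knowledge decay.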

Using Qstream allowed us to correct the trajectory of the course mid-flight. We have now shifted into a new paradigm in which we can refine our learning design based on real-time user data. This approach helps us better meet the needs of our workforce as we design for the future of workplace learning.