Virtual employee training has been a game changer - you don’t have to tell us twice. But as companies move away from in-person learning in favor of faster, cheaper and more effective methods, measuring successful outcomes becomes increasingly difficult. Without face-to-face confirmation that each employee has grasped the course material, managers are forced to rely on metrics such as course completion. We’ve already covered why completion rates won’t cut it anymore. So what should be measured? Let’s take a look at which numbers will tell you whether a course is creating effective behavior change, and how to design your course to better track those numbers.
A 95% completion rate on your newest virtual training course looks like a smash success on paper. But before touting this success to your entire L&D team, you may want to examine the numbers a little more closely. Depending on your course design, a completion rate may be telling you little more than how many employees have clicked through the course material. It misses three key metrics of effective learning: learner attention, comprehension and retention.
Attention depends on both the material presented and the current mindset of the learner. Making material concise, engaging and diverse can help capture attention, but only if a learner is already in the proper setting to absorb new information. If employees tune into virtual training drained from a long work week, attention won’t be easy to grab. Measuring learner attention can help you determine which material is most engaging, while also evaluating if the course as a whole is concise enough to accommodate realistic attention spans.
Comprehension is often recognized as the core of good course design. In fact, most learning designers rate content comprehension as their top priority. If learners can’t thoroughly understand the material in front of them, they can’t be expected to change their actions or performance going forward. Tracking learner comprehension throughout a course can help inform future content writing, material redesign and overall course success.
Finally, retention is the golden ticket for producing results. Workplace conflict can’t drop and productivity can’t skyrocket if your course material goes in one ear and out the other. Content retention is often the arch-nemesis of course designers. After all, the Forgetting Curve shows that most people retain less than 20% of information one week after learning it. Luckily, digital learning allows for creative course delivery to help combat this curve. Tracking your learners’ information retention will ensure your course is designed to optimize content spacing and help you identify information slipping through the memory sieve.
Without measuring these three elements, your course metrics may show equal “success” between employees who carefully review and internalize material and those who skim through, or worse, find the material too confusing to recall. So how can you ensure you have the proper data to measure these three key components?
Measuring attention, comprehension and retention begins with your course design. Too often, the design process focuses only on content quality. Beautifully written content may well help learners, but without active learning methods built in, you won’t have the metrics to know whether it does.
Active learning methods shift the responsibility of a learner from observation to participation. By engaging with content beyond watching a video or reading an article, learners tend to have more successful learning outcomes while providing your L&D team with helpful metrics. These methods can vary widely in format: daily quizzes, journal-style check-ins and learning games are just a few. While any type of active learning method can help enhance the course experience for your learners, not all methods will leave you with the metrics you need.
To measure learner attention, focus on including short-term memory checks. After each content block of roughly 5-15 minutes, present learners with a brief, simple quiz. To assess attention rather than comprehension, make sure the quiz contains only questions that were explicitly answered in the prior content block. A high accuracy rate on these quizzes will tell you how many learners are attentively progressing through the course. A low accuracy rate on certain sections will show you which content blocks are too dense, while a low accuracy rate across the board may indicate a distracted learning audience.
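The logic above can be sketched in a few lines of code. This is a minimal illustration, not part of any particular LMS or analytics tool: the function name, the sample accuracy data and the 60% threshold are all hypothetical assumptions chosen for demonstration.

```python
# Hypothetical sketch: interpreting per-block quiz accuracy rates
# to distinguish dense content blocks from a distracted audience.
# Threshold and data are illustrative assumptions, not benchmarks.

def diagnose_attention(block_accuracies, low_threshold=0.6):
    """Given quiz accuracy rates per content block (0.0-1.0),
    flag blocks that may be too dense, or a broadly distracted audience."""
    low_blocks = [name for name, acc in block_accuracies.items()
                  if acc < low_threshold]
    if len(low_blocks) == len(block_accuracies):
        return "Low accuracy across the board: audience may be distracted."
    if low_blocks:
        return "These blocks may be too dense: " + ", ".join(low_blocks)
    return "Learners appear to be attentively progressing."

# Hypothetical accuracy rates from the quiz after each 5-15 minute block
accuracies = {"Block 1": 0.92, "Block 2": 0.48, "Block 3": 0.88}
print(diagnose_attention(accuracies))
```

However you implement it, the key design choice is the same: compare accuracy block by block before drawing a conclusion about the audience as a whole.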
Measuring comprehension may rely on the most familiar active learning methods. Design a few course activities to test learners’ ability to apply new content to unfamiliar scenarios. These can take the form of brief “essay-style” questions, creative roleplay situations or even a quick game. Go beyond testing mere memorization, and don’t be afraid to get creative. Always include the option for learners to answer “I don’t know” or “I don’t understand,” as these responses will help inform which content blocks need to be reworked.
Finally, measuring material retention requires several spaced memorization check-ins. To combat the daily percentage drop shown on the Forgetting Curve, try implementing a cumulative memory quiz after each day’s content. If your content isn’t broken into daily segments, consider a separate memory check unit used periodically in the weeks following the initial learning. If you see low material retention, you may need to re-space your content, allowing learners more time between each block.
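To make the spacing idea concrete, the classic exponential model of the Forgetting Curve (retention ≈ e^(-t/S)) can be used to sketch when check-ins are due. This is a simplified illustration: the stability constant below is a hypothetical value picked to match the “less than 20% after one week” figure, and it assumes each check-in fully resets the curve, which real learners won’t do.

```python
# Illustrative sketch of the exponential forgetting curve, R = e^(-t/S),
# and a naive spaced check-in schedule. The stability constant (4.35 days)
# is a hypothetical value chosen so retention falls to ~20% after 7 days.
import math

def retention(days_elapsed, stability=4.35):
    """Estimated fraction of material retained after `days_elapsed` days."""
    return math.exp(-days_elapsed / stability)

def checkin_schedule(min_retention=0.5, horizon_days=28, stability=4.35):
    """Schedule a memory check each time estimated retention would fall
    below `min_retention`, assuming each check resets the curve."""
    interval = -stability * math.log(min_retention)  # days until threshold
    schedule, day = [], 0.0
    while day + interval <= horizon_days:
        day += interval
        schedule.append(round(day, 1))
    return schedule

print(f"Retention after 7 days: {retention(7):.0%}")  # roughly 20%
print("Check-in days:", checkin_schedule())
```

With these illustrative numbers, a check-in lands roughly every three days. A model this simple won’t predict any individual learner, but it shows why a single end-of-course quiz arrives far too late to fight the curve.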
By including these active learning activities in your course, you’ll save your L&D team the headache of figuring out why a high completion rate isn’t leading to the changes you want. Don’t be surprised if you also start receiving outstanding feedback on the course experience. Your dream metrics may just come with the added bonus of making learning fun.
Interested in designing an SMS-based course focused on active learning and actionable metrics? Start building your first Arist course today.