Posted on 17th April, 2018 by Kathryn Nixon
LEO Learning’s Director of Strategic Design, Andrew Joly, recently held a webinar with the Learning Skills Group. It was packed with real, relatable stories about how some L&D professionals are currently measuring the business impact of learning. More importantly, the session revealed the gap between where participants are and where they want to be in terms of measurement.
LEO Learning’s research into measuring the business impact of learning has delivered some fascinating insights into the expectations and challenges the L&D sector is facing when it comes to measurement.
This year’s findings show a 71% increase in respondents who said they feel executive pressure to measure their learning programmes, as well as a sharp rise in the number of L&D professionals who believe it’s possible to measure the impact of their learning.
At the same time, a significant number said they didn’t know where to start on their measurement journey. This trend is echoed by Towards Maturity’s Unlocking Potential report, which shows a significant skills gap around data collection and analysis, even in top-performing organisations.
Measuring the business impact of learning: what’s happening ‘on the ground’
This tension between pressure from the board, enthusiasm to get started and having the strategy, data and tools to do so was brought into sharp focus during the webinar discussion.
Through a series of questions, we invited attendees to share their current measurement efforts, where they want to be and how they’re currently accessing measurement data.
The result was a lively, thought-provoking discussion that provided a fascinating ‘on the ground’ snapshot of the learning measurement landscape in L&D.
How are you currently measuring the business impact of learning in your organisation?
It was clear from our discussion that many attendees are currently utilising more traditional, qualitative forms of learning measurement. These include ‘happy sheets’ or online surveys that ask learners for feedback on the learning.
“Level 1 happy sheets, so got a long way to go too.”
“Other than asking delegates, we don’t do anything else… Assessing the impact of learning has been stated as being the elephant in the room.”
In an indicator of the influence of new approaches, a small number of respondents reported using more advanced methods of collection.
“We’re looking at analytics, but at the minute I don’t think we know what they’re telling us.”
“Developing logic models and then building a list of measures around that which can be collected.”
The Kirkpatrick model of learning evaluation cropped up almost immediately in the discussion. This model, which dates back to the 1950s, still appears to be a reference point for many.
LEO Learning’s Chain of Evidence model follows aspects of the Kirkpatrick model, in the sense that it focuses on key areas of the effect continuum, from learning engagement to business change.
The Chain of Evidence approach recognises that a holistic view of learning evaluation should include capturing learner engagement as well as knowledge and skill uptake.
At LEO Learning, we understand that measuring changes in behaviour and impact on business objectives is now the key focus and aspiration for many L&D leaders. This is why our approach also looks at the shift in performance behaviour in individuals and teams, as well as looking at the impact on business performance (in many different forms).
How do you want to measure the business impact of learning and deliver results to the board?
As our survey results suggest, there is real enthusiasm to move forward with developing a strong picture of the business impact of learning. And this was also evident in responses to our question on what attendees want to be delivering to their senior stakeholders in the near future.
It’s clear that the one thing these L&D professionals want is to demonstrate a strong relationship between their learning solutions, employee performance and business results.
“Clear evidence that L&D is making a positive impact on performance.”
“How it [learning] has improved performance across the business, i.e. growth, transformation, profitability and productivity.”
For many, the goal was to be able to measure behaviour and performance changes, productivity and operational efficiencies and relate them back to business results and KPIs.
What sources of data do you leverage – apart from your LMS?
The majority noted that surveys were often used to gather feedback from learners and other stakeholders such as line managers. Other methods, such as learner interviews and focus groups, were also mentioned.
The frustration with these data sources was that once the data was collected, actually analysing it and using it to demonstrate business impact was difficult.
“…we get surveys back but it’s very hard to measure impact of learning in terms of performance or business impact.”
xAPI and Learning Record Stores: powerful partners
Our partners at Watershed have demonstrated that when you utilise the power of xAPI you can leverage almost any type of data and channel it into a single place: an LRS.
Once you have this data collected, you can begin to build a powerful picture of your learning performance: past, present and future.
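To make this concrete, here is a minimal sketch of what an xAPI statement looks like before it is sent to an LRS. The actor, course URL and activity name below are hypothetical examples, not taken from the webinar; the verb ID is from ADL's published xAPI verb vocabulary.

```python
import json

def make_statement(actor_email, verb_id, verb_display, activity_id, activity_name):
    """Build a minimal xAPI statement: who (actor) did what (verb) to what (object)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Hypothetical learner completing a hypothetical course
statement = make_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed",  # standard ADL verb
    "completed",
    "https://example.com/courses/onboarding-101",
    "Onboarding 101",
)

# A real client would POST this JSON to the LRS's /statements endpoint,
# including an "X-Experience-API-Version" header and authentication.
print(json.dumps(statement, indent=2))
```

Because every statement shares this actor–verb–object shape, an LRS can aggregate activity from many different systems into one queryable store.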
This type of measurement approach is the aspiration for many organisations. The key for many L&D professionals remains to just ‘get started’ on this journey.
Whether that means reviewing your data sources, leveraging standards such as SCORM or xAPI, or designing an MVP measurement strategy, the sooner you get going, the sooner you will achieve the goal of demonstrating real business impact.