Making sense of learning data
Posted on 10th November, 2017 by Ben Miller
While the interconnection of learning technologies has led to organisations storing a vast amount of learning data, this wealth of information is only truly valuable if it is used judiciously.
Rather than attempting to use all of the available data, L&D teams face the challenge of determining which data is actually relevant to measuring the business impact of learning, and which types of data they should focus on harvesting and protecting.
Managing data to measure learning impact
In LEO’s new insight, ‘Understanding data about learning’, we look at the principles that can be applied to find and process the right data. It is a must-read for anyone looking to clarify their data strategy and enhance the ways in which they measure the business impact of learning.
Managing data is often tricky without the right guidance. Organisational data tends to be complicated, with substantial effort and perceptiveness required to gather, clean and prepare information in order to extract meaningful insights.
Although learning analytics products will often allow you to import data from multiple sources, efforts to build an accurate picture of learning across an organisation are frequently hindered by data being scattered across silos. These silos are often disconnected and out of date, leading to different versions of the same data existing across systems.
To help overcome these problems, the LEO insight, written with our partners at xAPI experts Rustici Software, suggests that L&D leaders:
- Create a master data source for user profiles, acting as an authoritative data point for user profile data
- Be proactive in exploring and following data management practices to avoid risks and pitfalls
- Consider creating an event store as a single source of consistent data relating to learning activities across the organisation
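As a minimal illustration of the first recommendation, consolidating user profiles from disconnected silos into a single master record might look like the following sketch in Python. All system names, fields and values here are hypothetical, not taken from the insight; the merge simply prefers the most recently updated value for each field:

```python
from datetime import datetime

# Hypothetical profile records exported from two disconnected silos.
# Each carries an 'updated' timestamp so the merge can prefer fresher values.
hr_system = {
    "email": "jane@example.org",
    "name": "J. Smith",
    "department": "Sales",
    "updated": datetime(2017, 9, 1),
}
lms = {
    "email": "jane@example.org",
    "name": "Jane Smith",
    "updated": datetime(2017, 10, 15),
}

def merge_profiles(*records):
    """Build a master profile: for each field, keep the value from the
    most recently updated record that supplies it."""
    master = {}
    for record in sorted(records, key=lambda r: r["updated"]):
        for field, value in record.items():
            if field != "updated":
                master[field] = value  # later (fresher) records overwrite
    return master

master_profile = merge_profiles(hr_system, lms)
# The LMS record is newer, so its spelling of the name wins, while the
# department, present only in the HR record, is still carried across.
```

Once a master record like this exists, every other system can treat it as the authoritative source for user profile data rather than maintaining its own copy.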
Learning technology standards
One of the problems L&D teams repeatedly face is dealing with duplicate data across different systems. Learning technology standards can make our lives much easier in this respect, as they allow data to be communicated seamlessly between different systems, with one system designated as the master.
To help you make sense of the many standards across the industry, we’ve outlined six key learning standards and specifications. These can play a pivotal role in providing access to learning activity from multiple sources and maintaining the validity of data.
The learning standards and specifications include:
- SCORM, the content packaging standard that allows a single elearning module to run in any SCORM-conformant LMS
- The more robust xAPI, which is useful when workplace learning experiences occur in multiple places
- Open Badges, which enable digital credentials to be awarded to learners and can be transferred when they move, for example from a college LMS to an employer’s LMS
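To give a feel for what xAPI data looks like in practice, here is a minimal xAPI statement constructed in Python. Per the specification, a statement records an actor (who), a verb (did what) and an object (to what); the verb ID below comes from the standard ADL verb vocabulary, while the learner and course activity URI are made up for illustration:

```python
import json

# A minimal xAPI statement: actor, verb, object.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Jane Smith",
        "mbox": "mailto:jane@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/courses/data-literacy-101",
        "definition": {"name": {"en-US": "Data Literacy 101"}},
    },
}

# Statements are sent to a Learning Record Store (LRS) as JSON over HTTP;
# serialising the statement is the first step.
payload = json.dumps(statement)
```

Because any system that emits statements in this shape identifies verbs and activities by URI, a Learning Record Store can aggregate activity from multiple sources without the format translation that disconnected silos normally require.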
The merits of cmi5, IMS Caliper and Learning Tools Interoperability (LTI) are also discussed.
Technology standards, as the insight shows, are essential in the quest for consistent, good-quality data to work with as part of a strategy for measuring learning impact.
Expert guidance on learning analytics
The insight concludes by asking learning standards and data expert Tim Martin of Rustici Software for his advice on how to assemble your data well and create a record store.
Tim is known for offering plain-spoken answers to questions about SCORM and xAPI, and his tips range from how to tell if two systems are truly ‘talking’ to each other (interoperability) to the best way to use xAPI.
Want to know more? Get in touch on email@example.com.