Digital learning and analytics – how to make organizational training profitable

In a webinar hosted by Liferay and Arcusys last August, Janne Hietala gave a presentation on the changes brought about by the digitalization of learning and on harnessing analytics to serve corporate training. The presentation highlighted in particular companies' constantly growing need to adapt to changing training needs, as well as the tools and actions available to tackle these challenges.

The very term personnel training may make employees think of monotonous, never-ending slide shows. Management, on the other hand, may see it merely as an unavoidable budget item, because the true effectiveness of learning is difficult to measure and verify.

The usual trade-off is that the more efficient and meaningful the learning sought, the more expensive the training is to organize. Yet unless learning can be measured reliably, finding the optimal balance between investment and effectiveness rests on estimates alone.

This problem can be solved by learning analytics and digital learning environments that collect usage data.

How is data collected?

Digital learning environments allow learning data to be collected extensively and in great detail, and with analytics tools this data can be turned into meaningful insights. By collecting and processing data on how learners use the learning environment and materials, an organization can more easily find the most efficient and useful ways to develop both its training and its learners.
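
To make this concrete, learning events are often captured as small, structured records, for example in the style of the xAPI standard with its actor, verb and object. The sketch below is a simplified, hypothetical illustration of what such a record and its collection might look like; the field names and verbs are invented here, and the exact format depends on the learning platform.

```python
# Hypothetical sketch of a single learning event record, loosely inspired
# by the actor-verb-object structure used in learning data standards.
from datetime import datetime, timezone

def make_learning_event(learner_id, verb, material_id, score=None, duration_s=None):
    """Build one usage-data record for later analysis."""
    return {
        "learner": learner_id,        # who did something
        "verb": verb,                 # e.g. "viewed", "completed", "scored"
        "material": material_id,      # which course module or material
        "score": score,               # assessment result, if any
        "duration_s": duration_s,     # time spent, if tracked
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Events accumulate into a stream that analytics tools can later process.
events = [
    make_learning_event("learner-42", "viewed", "module-3", duration_s=540),
    make_learning_event("learner-42", "scored", "quiz-3", score=0.85),
]
```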

Nowadays, data is collected on practically everything we do in digital environments. This data is typically used to target and personalize advertisements and marketing, through social media among other channels.

Consequently, it is sensible to be aware of what personal information you give out and why. In this case, however, those concerned about their privacy can breathe a sigh of relief: learning analytics relies solely on internal data collected to develop the training, and that data is stored securely, out of the reach of outsiders.

Big, big data

Most of us first encountered data collection and processing at school, in physics or chemistry class, for instance.

First, the experiment and its goals were defined; then measurements were collected while carrying it out. At the end, the results of the whole group were recorded in a chart and used to create bar graphs and tables, which made it possible to draw conclusions about what had happened during the experiment.

This principle has not changed over time, but the methods and technology have.

But what kind of table can you create in a digitalized world where, instead of a single experiment, there are, say, thousands of measurement points constantly producing results across hundreds of different measured items?

In terms of sheer size alone, the result is such an enormous amount of data that traditional approaches to processing it are inefficient at best.

This kind of large and constantly growing body of information, whose meaning and value are difficult to perceive directly, is called big data, and more robust means are needed to process and manage it. Analytics tools provide a starting point for tackling this challenge.
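
One basic idea behind handling continuously growing data is to keep only running statistics per measurement point instead of storing and re-reading every raw value. The sketch below is a minimal, hypothetical example of that idea; the sensor names and values are invented.

```python
# Minimal sketch: incremental (streaming) aggregation per measurement point,
# so raw values never need to be held in memory all at once.
from collections import defaultdict

class RunningStats:
    """Running count, sum and mean for one measurement point."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def add(self, value):
        self.count += 1
        self.total += value

    @property
    def mean(self):
        return self.total / self.count if self.count else 0.0

stats = defaultdict(RunningStats)

def handle_measurement(point_id, value):
    """Update the aggregate for this measurement point as data streams in."""
    stats[point_id].add(value)

# Simulated stream from many measurement points.
for point, value in [("sensor-17", 3.2), ("sensor-17", 3.4), ("sensor-980", 7.1)]:
    handle_measurement(point, value)

print(stats["sensor-17"].mean)  # average of the values seen so far for sensor-17
```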

Data processing and analytics

In line with basic data processing principles, analytics parses raw data into a processable format by looking for recurring structures in it. These structures can be turned into models, and by applying the models, meaningful and comprehensible summaries can be formed from the data.

With traditional processing methods, however, you must define very precisely what you want to find in the data. This requires the person using the tool to already have a strong idea of what is essential in the data and what is not.

Analytics tools can operate more independently: by processing the data, they learn to form groupings and to find recurring patterns that the human eye would not notice.
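
As an illustration of this kind of independent pattern finding, the sketch below groups learners by their usage behaviour with k-means clustering from scikit-learn. The features and data are invented for the example; it is a minimal sketch of the principle, not a description of any particular product.

```python
# Hypothetical sketch: an unsupervised algorithm finds recurring usage
# patterns (clusters of similar learners) without predefined rules.
import numpy as np
from sklearn.cluster import KMeans

# Invented features per learner: [logins per week, minutes per session, quiz average]
usage = np.array([
    [5, 35, 0.90],
    [1, 10, 0.55],
    [4, 40, 0.85],
    [2, 12, 0.60],
])

# Ask for two groups; in practice the number of clusters is itself analysed.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(usage)
print(model.labels_)  # e.g. active high performers vs. occasional users
```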

Learning analytics

The digitalization of learning makes it possible to collect more extensive and versatile learning data. If the digital learning environment is capable of producing big data about learning, analytics tools get a chance to shine.

A report by Tata Interactive Systems gives a comprehensive overview of the different levels of learning analytics and of how the rationale for, and approaches to, using analytics have changed over time.

As a starting point, Tata presents descriptive and diagnostic analytics, with which learning data can be turned into a meaningful picture of learning and competence development for both the learner and the training organizer.
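
At the descriptive level this can be as simple as summarising collected events into per-module figures. The sketch below is a minimal, hypothetical example built on the kind of event records shown earlier; the verbs and field names are assumptions of this illustration.

```python
# Minimal descriptive-analytics sketch: summarise events into per-module figures.
from collections import defaultdict

def summarise(events):
    """Return completion counts and average score per material/module."""
    summary = defaultdict(lambda: {"completions": 0, "scores": []})
    for e in events:
        if e["verb"] == "completed":
            summary[e["material"]]["completions"] += 1
        if e.get("score") is not None:
            summary[e["material"]]["scores"].append(e["score"])
    return {
        material: {
            "completions": data["completions"],
            "avg_score": sum(data["scores"]) / len(data["scores"]) if data["scores"] else None,
        }
        for material, data in summary.items()
    }
```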

According to Tata, next-generation analytics is based on deriving correlations from the data. Predictive analytics can, for instance, pinpoint connections between learners' performance and their use of the environment and materials. If analytics reveals a correlation between certain materials and good performance, it becomes significantly easier to develop the learning environment and materials further.
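
A minimal way to look for such a connection is a plain correlation between how much a material was used and how learners performed. The figures below are invented for the illustration; real predictive models go further (regression, classification), but the idea of linking usage to outcomes is the same.

```python
# Hypothetical sketch: correlate time spent on a material with quiz scores.
import numpy as np

minutes_on_material = np.array([10, 25, 40, 5, 55, 30])
quiz_scores         = np.array([0.55, 0.70, 0.85, 0.50, 0.90, 0.75])

r = np.corrcoef(minutes_on_material, quiz_scores)[0, 1]
print(f"correlation: {r:.2f}")  # a value near 1 suggests the material supports good results
```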

This, in turn, makes it easier to collect the data that matters, turning training development into a self-reinforcing cycle.

According to Tata, the most efficient form of analytics is prescriptive analytics: in addition to predicting the most likely future scenarios, it can offer the best options for preparing for and handling them.
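
As a rough illustration of the difference, a prescriptive step takes a prediction and turns it into a recommended action. The rule and thresholds below are invented for this sketch and stand in for whatever decision logic an organization would actually use.

```python
# Hypothetical prescriptive step: turn a predicted outcome into a recommended action.
def recommend_action(predicted_pass_probability, modules_behind):
    """Map a prediction to a concrete suggestion for the learner or trainer."""
    if predicted_pass_probability < 0.5:
        return "Schedule a tutoring session and assign the refresher module."
    if modules_behind > 2:
        return "Send a reminder with a suggested catch-up plan."
    return "No intervention needed; continue the current learning path."

print(recommend_action(predicted_pass_probability=0.42, modules_behind=1))
```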

Analytics responding to the company's needs

In the corporate world, personnel training may still be regarded as an unavoidable expense, needed simply to ensure that staff have the required know-how. Thanks to learning analytics, however, digital learning environments should no longer be seen merely as a way to save on training costs.

With the aid of next-generation correlation analytics, a digital learning environment can also create personal learning profiles from the data individual learners generate. These profiles help learners better understand their own strengths and adapt their learning style accordingly.
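
A learning profile can be thought of as a per-learner aggregate of this kind of data. The sketch below is a simplified, hypothetical example of building one from event records; the convention of encoding the material type in the material id is an assumption made for the illustration.

```python
# Hypothetical sketch: a per-learner profile combining strengths and usage preferences.
from collections import defaultdict

def build_profile(learner_id, events):
    """Summarise one learner's scores per material and time spent per material type."""
    scores = defaultdict(list)
    time_by_type = defaultdict(float)
    for e in events:
        if e["learner"] != learner_id:
            continue
        if e.get("score") is not None:
            scores[e["material"]].append(e["score"])
        if e.get("duration_s"):
            # invented convention: material ids carry a type prefix, e.g. "video-intro-1"
            time_by_type[e["material"].split("-")[0]] += e["duration_s"]
    return {
        "strengths": {m: sum(s) / len(s) for m, s in scores.items()},
        "preferred_formats": dict(time_by_type),
    }
```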

At the same time, training needs, and changes in them, become increasingly visible to the training organizer, making it easier to provide targeted, meaningful training that is more effective in every respect.

In his conclusions, Hietala outlined the steps for deploying analytics in an organization:

  1. Defining the key performance indicators
  2. Planning a training program based on these needs
  3. Developing a test concept for the learning analytics solution in order to measure effectiveness
  4. Deployment and expansion plan

So, to start with, meaningful key performance indicators must be defined for the organization, to serve as a basis for measuring operational efficiency. A training program carried out in a digital environment, and the testing of analytics against the data the program produces, provide a starting point for measuring the effectiveness of learning. Once the chosen key performance indicators start to yield meaningful results, the system can be deployed throughout the organization.
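
As an illustration of the first and third steps, key performance indicators can be written down as explicit, computable definitions so that the analytics pilot has something concrete to measure against. The indicators, targets and values below are invented examples, not recommendations from the webinar.

```python
# Hypothetical sketch: KPIs as explicit, computable definitions for the analytics pilot.
KPIS = {
    "course_completion_rate": {
        "description": "Share of enrolled employees who complete the training",
        "target": 0.80,
        "higher_is_better": True,
    },
    "time_to_competence_days": {
        "description": "Average days from enrolment to passing the final assessment",
        "target": 30,
        "higher_is_better": False,
    },
}

def evaluate(kpi_name, measured_value):
    """Compare a measured value against its target during the test phase."""
    kpi = KPIS[kpi_name]
    met = (measured_value >= kpi["target"]) if kpi["higher_is_better"] else (measured_value <= kpi["target"])
    return {"kpi": kpi_name, "measured": measured_value, "target": kpi["target"], "met": met}

print(evaluate("course_completion_rate", 0.72))
```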

These steps form the foundation for any organization to make the most of its training, in cooperation with a partner that offers training solutions and consulting.

Juho Haapiainen
Junior Instructional Designer
