
One for Good Measure


I had the pleasure of getting an advance peek at the new Accord LMS reports. It is clear the Accord LMS team has put a lot of thought into its reporting capabilities, which leverage the Telerik Report Engine, and has big plans for them. The types of reports that can be produced, the level of detail captured, and the output options demonstrate a very capable and flexible reporting platform for learning data.

I have a passion for reporting. I spend a great deal of time generating reports, many of which are high stakes and legally mandated. Reporting capabilities and the application of good reporting practices are critical to the success of any training program.

In fact, reporting and metrics are such an important component of my design philosophy that I speak at conferences about putting these elements at the center of a training design, with all other elements assuming the role of “supporting cast”.

Most Training Designs Put Content, Not Results, at the Center
Typical instructional design starts with a subject matter or body of content that is wrestled with for a period of time and then embellished with some interactions and activities. The final assessment is often the last element to be considered and designed, and it simply tests fact recall from the course content. If you are lucky, some scenario-based questions are thrown into the mix.

My problem with this approach is that it is content-centric. So much time is spent defining the content and managing that process that assessment, measurement, and reporting strategies never get adequate attention. This is exactly backwards.

Find the Bullseye and Aim for It
My approach starts with assessment. The very first thing I define with stakeholders is exactly what results are expected of the e-learning solution and how we will measure them (assessments). Additional factors and controls that need to be accounted for are also discussed. This process defines the bullseye for the rest of the design. I start with the end in mind: “What, exactly, do executives and stakeholders need to track as proof of results?”

The results on which you report will help shape the specific actions applied in the workplace to produce those results. Tomes have been written on action-based learning, but my favorite resource is Cathy Moore’s blog. She makes it simple: define the key business objective, identify what learners must do in the workplace to fulfill the objective, create activities for learners to demonstrate the skills, and provide only the content really needed to build that skill. This design focuses on the key objectives that matter to stakeholders and the actions needed to achieve those objectives. The process starts with a stated business goal and a defined result output (documented via reports). Training design and development is then directed to achieve that output, and non-aligned elements can be dropped.

It is important to distinguish a few points here:


  1. Content is supporting cast, not the star of the show. I explained this to a team of executives in a highly regulated industry by reminding them that their college professors probably never knew how much or how little of a textbook they actually read for a class, but the professors certainly knew how students performed on assessments and during classroom activities. I then showed them how every strategy for tracking whether learners reviewed all of their assigned content could be circumvented, and that in most cases the tracking data made for poor analysis anyway.
  2. When you capture data from learning, you measure skills an employee has the potential to apply on the job. Many factors influence how well a skill does or does not translate to greater efficiency in the workplace. Consider these supporting factors carefully in the design, and plan to create measurement instruments that track their influence.
  3. No test is 100% fair. No measure is 100% accurate or unbiased. Neither are sales forecasts, temperature readings, or almost any other metric used in business or life. Give up the quest for perfection. Do everything within your control to get to “metrics the organization can believe in and bank on”. Much business planning is done on imperfect measures and forecasts. Discuss issues honestly and openly with stakeholders up front so they can work with you to strategize on how to produce the best data possible; requesting their input and agreement up front reduces questioning of the data’s validity or usefulness when it is finally presented.
  4. Take multiple measures from multiple sources when possible, and create instruments to unveil or measure the impact of biases. For example, you may have “hard data” from scenario-based tests and get back metrics about performance on specific questions. I learned long ago while doing marketing statistics that “liars can figure and figures can lie”. Survey data and focus groups can often provide insights about this data that the numbers alone never could, such as revelations that managers were suggesting and supporting actions in the field that did not reflect best practices or the correct answers on assessments. In general, employees are going to listen to their direct managers over “the training”, even when the managers aren’t right.
  5. Don’t expect learning data alone to be sufficient: business results come from operational measurement tools (business analytics, sales results, quality measures, etc.). These results are the truest reflection of how skills applied in the workplace produce results. Training data can provide information on how training contributed to those results (it can be as simple as a survey asking employees to rank the factors that drive results or to forecast what percentage of the change in business results was driven by training). To get a total picture of the results and what drives them, it is often necessary to pull data from multiple sources for reporting and analysis, as in the sketch after this list. Many training departments handle training data separately from other business data and miss the opportunity to collaborate. If you want a “seat at the table”, a good strategy is to work as a partner and integrate with the other parts of the business.
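
To make that fifth point concrete, here is a minimal sketch of joining exported training records with an operational metric. All of the data, column names, and the comparison itself are hypothetical; this is not an Accord LMS API, just an illustration of what multi-source analysis can look like in Python with pandas.

```python
# A minimal sketch of point 5: joining training data with a business
# metric so training and operations can be analyzed together.
# All data and column names here are invented for illustration.
import pandas as pd

# Completion records as they might appear in an LMS export.
training = pd.DataFrame({
    "employee_id": [101, 102, 103, 104],
    "completed_course": [True, True, False, False],
})

# Quarterly results pulled from an operational system.
sales = pd.DataFrame({
    "employee_id": [101, 102, 103, 104],
    "q2_sales": [48_000, 52_000, 39_000, 41_000],
})

merged = training.merge(sales, on="employee_id")

# Compare average results for trained vs. untrained employees.
# This is a descriptive comparison, not proof of causation.
by_group = merged.groupby("completed_course")["q2_sales"].mean()
print(by_group)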

The Accord LMS reporting capabilities include impressive features: survey reporting, question-level details, a wealth of utilization reports, the ability to parse data so you can report on different groups at many levels, and the ability to export data to formats that work with spreadsheets and other applications for deeper analysis. The key to leveraging these features for maximum impact is to start with the stakeholders and understand what really needs to be tracked; not just the compliance “have to” reports, but the information that really tells executives whether employees are capable of executing the strategy of the organization. Also, work with stakeholders up front on how to manage imperfect measures and external factors, so that there is buy-in and agreement early; this minimizes derailing questions about the validity of results during the presentation phase.
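As one example of the deeper analysis an export enables, the sketch below computes question-level pass rates by learner group. The rows and column names are invented for illustration and do not reflect the actual Accord LMS export format; in practice the DataFrame would come from an exported file loaded with pd.read_csv.

```python
# A minimal sketch of analyzing exported assessment data:
# question-level pass rates broken out by learner group.
# Data and column names are hypothetical stand-ins for an export.
import pandas as pd

results = pd.DataFrame({
    "group":       ["Sales", "Sales", "Support", "Support", "Sales", "Support"],
    "question_id": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "correct":     [1, 0, 1, 1, 1, 0],
})

# Share of correct answers per group and question; low cells flag
# topics where a particular group may need reinforcement.
pass_rates = results.pivot_table(index="group", columns="question_id",
                                 values="correct", aggfunc="mean")
print(pass_rates.round(2))
```

Even a simple breakdown like this moves the conversation from “who finished the course” to “which groups can demonstrate which skills”, which is the information executives actually need.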


With the reporting capabilities built into our intelligently designed learning management system, you will be able to track the key elements of training that contribute to achieving organizational goals and clearly communicate those results to stakeholders.


Categories: eLearning
