July 20, 2018

Instructional Design Walkthrough: Addressing Performance Issues with Training

In this post I would like to illustrate my instructional design process, based on one of the training solutions I have designed in the past (some details have been adjusted to preserve anonymity and confidentiality). The purpose of this walkthrough is to outline, in broad strokes, the instructional design process from the initial request to the final product, and to highlight the main deliverables of each phase.

As you will notice, my process is based on ADDIE. Although it is common these days to dismiss ADDIE as "outdated", "non-agile" and "not a framework", I hope this post shows that it is not as bad as it seems, and that a big part of a project's success depends on you as the instructional designer, no matter which way of working you use.

Initial Request

Company A's Quality Department discovered that Customer Service Agents (CSAs) were not correctly handling customer contacts related to issues with discount codes, which led to monetary losses. Following the discovery, the Quality Manager (QM) requested the development of a 2-hour e-learning module to be rolled out to all CSAs globally in 5 languages and incorporated into the new hire curriculum.

Analysis

As the Instructional Designer in charge of this project, I held a discovery meeting with the QM, followed by a meeting with a subject matter expert (SME) who not only knew the subject itself, but could also provide trustworthy insight into the daily work, motivations and challenges of the CSAs.

During these meetings I focused on the answers to the following questions:
  • What is the business goal of this training, and how will we measure success?
  • What does the data tell us about CSAs' performance in each region?
  • What documentation, if any, is already available to the CSAs?
  • What are the typical or recurring mistakes?
  • Why do these mistakes happen?
  • What do we want the CSAs to do instead?
Based on the discovered information, I:
  • Reduced the target audience to the regions where performance issues were confirmed by data
  • Identified the real causes of the performance issues, such as a lack of practice in investigating promotional issues, or good intentions paired with wrong assumptions about the consequences of the actions taken. In other words, the CSAs genuinely believed they were doing the right thing.
  • Formulated the training goal and performance-oriented learning objectives
  • Reduced the projected training time by narrowing the scope to the performance issues that could actually be addressed through training
The changes to the project scope and learning objectives were signed off by the QM and the SME. The reduction of training time and target audience was received particularly positively, as it cut unnecessary project costs.

Design and Development

After completing the analysis, I:
  • Created a course blueprint outlining the identified performance issues with matching instructional activities and, where necessary, sources of theoretical information required to complete these activities.
  • Iteratively designed an e-learning module, beginning with a low-fidelity prototype in PowerPoint.
  • Guided the SME through reviews of the prototype and adjusted it based on the SME's feedback.
  • Developed the final module in Storyline, featuring software simulations and a branching conversation scenario.
The module featured:
  • A short introductory scenario showing the learners the consequences of their choices
  • Worked examples of troubleshooting cases
  • Several troubleshooting simulations in which the learners had to use the simulated software environment and the available information sources to correctly identify the customer's issue and take appropriate action. Upon completion of each scenario, the learners received detailed feedback on their actions.
  • A branching scenario built around a conversation with a customer, in which the learner needed to present the troubleshooting process positively and correctly set the customer's expectations regarding the outcome of the investigation (a conceptual sketch of such a scenario follows this list).
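
To make the structure of that last activity more concrete: a branching conversation scenario is essentially a small graph of dialogue nodes, where each learner choice leads to a different node and carries its own feedback. The minimal Python sketch below is purely illustrative; the actual module was built in Storyline, and every name, customer line and feedback text here is hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Choice:
        text: str           # what the learner says to the customer
        next_node: str      # id of the node this choice leads to
        feedback: str = ""  # coaching shown after the choice is made

    @dataclass
    class Node:
        node_id: str
        customer_line: str  # what the customer says at this point
        choices: list[Choice] = field(default_factory=list)  # empty = end node

    # A tiny two-step excerpt: the learner must set realistic expectations.
    SCENARIO = {
        "start": Node("start", "So when will I get my discount back?", [
            Choice("You'll definitely have it within the hour.",
                   "overpromise",
                   "Avoid promising outcomes the investigation may not support."),
            Choice("I've reported it to our promotions team; they'll confirm "
                   "the result by email within two business days.",
                   "good_end",
                   "Correct: an honest timeline, reported via the right channel."),
        ]),
        "overpromise": Node("overpromise", "Great, I'll wait right here!"),
        "good_end": Node("good_end", "OK, I'll watch for the email. Thanks!"),
    }

    def play(node_id: str = "start") -> None:
        """Walk the scenario in the console, showing feedback after each choice."""
        node = SCENARIO[node_id]
        print(f"Customer: {node.customer_line}")
        if not node.choices:
            return  # reached an end node
        for i, choice in enumerate(node.choices, start=1):
            print(f"  {i}. {choice.text}")
        picked = node.choices[int(input("Your reply (number): ")) - 1]
        if picked.feedback:
            print(f"[Feedback] {picked.feedback}")
        play(picked.next_node)

    if __name__ == "__main__":
        play()

Modeling the scenario as explicit nodes and choices like this makes it easy to verify that every path ends in meaningful feedback before committing the design to an authoring tool.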
The finished course was reviewed and tested by the QM, by two SMEs who had not participated in the development process, and by the CSA trainers. It was not possible to test the module on the CSAs themselves due to constraints on their availability.

Implementation and Evaluation

Following the rollout of the training, it was evaluated on the following levels:
  • Reaction of the learners
  • Changes in performance
  • Business impact
As the instructional designer, I designed the reaction evaluation survey and gathered the response data via the LMS. I collaborated with the Quality Department on gathering and analysing the data for the performance change and the business impact. The change in performance was measured as a decrease in wrong solutions and an increase in reports of issues via the correct channels. The business impact was measured in terms of reduced financial losses related to mishandled promotional issues.
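
For readers curious how a performance change like this can be checked for statistical significance, one common approach is a two-proportion z-test on the rate of mishandled contacts before and after the rollout. The Python sketch below uses statsmodels and made-up counts purely for illustration; it is not the actual analysis, which was carried out together with the Quality Department.

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical counts, purely for illustration: mishandled promotional
    # contacts out of all promotional contacts, before and after the rollout.
    wrong_before, total_before = 180, 1200   # 15% error rate
    wrong_after, total_after = 96, 1200      #  8% error rate

    # One-sided test: H1 is that the "before" error rate is larger.
    stat, p_value = proportions_ztest(
        count=[wrong_before, wrong_after],
        nobs=[total_before, total_after],
        alternative="larger",
    )

    print(f"z = {stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("The decrease in mishandled contacts is statistically significant.")
    else:
        print("The observed decrease could plausibly be due to chance.")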

Overall, the training led to statistically significant changes in performance as well as a considerable positive business impact. On the level of learner reaction, 90% of respondents stated that the training had provided them with valuable practical knowledge, and 96% stated that they would be applying the new skills in their daily work. The learners' comments highlighted the benefit and importance of the simulations and scenarios.

Due to the success of the training, and following a consultation with the Customer Support trainers and leads, the module was included in the standard new hire training to ensure that the performance issues were not only remedied but also prevented.
