July 20, 2018

Instructional Design Walkthrough: Addressing Performance Issues with Training

In this post I would like to illustrate my instructional design process, based on one of the training solutions I've designed in the past. The post is based on real events, but has been adjusted to preserve anonymity. The purpose of this walkthrough is to outline, in broad strokes, the instructional design process from the initial request to the final product, and to highlight the main deliverables of each phase.

As you will notice, my process is based on ADDIE. Although it is very common these days to dismiss ADDIE as "outdated", "non-agile" and "not a framework", I hope that with this post I can show that it is not really ADDIE (or SAM or any other model) but the instructional designer who is the key to the success (or failure) of a training project.

Initial Request

Company A's Quality Department discovered that Customer Service Agents (CSAs) were not correctly handling customer contacts related to issues with discount codes, which led to monetary losses. Following the discovery, the Quality Manager (QM) requested the development of a 2-hour e-learning module to be rolled out to all CSAs globally in 5 different languages, as well as incorporated into the new hire curriculum.

Analysis

As the Instructional Designer in charge of this project, I held a discovery meeting with the QM, followed by a meeting with a subject matter expert (SME), who not only had knowledge of the subject itself, but could also provide trustworthy insight into the daily work, motivations and challenges of the CSAs.

During these meetings I focused on the answers to the following questions:
  • What is the business goal of this training and how will we measure the success?
  • What does the data tell us about CSAs' performance in each region?
  • What documentation, if any, is already available to the CSAs?
  • What are the typical or recurring mistakes?
  • Why do these mistakes happen?
  • What do we want the CSAs to do instead?
Based on the discovered information, I:
  • Reduced the target audience to the regions where performance issues were confirmed by data
  • Identified the real causes of the performance issues, such as lack of practice in the investigation of promotional issues, or the fact that mistakes were rooted in good intentions but wrong assumptions about the consequences of the actions taken. In other words, CSAs genuinely assumed that they were doing something good.
  • Formulated the training goal and performance-oriented learning objectives
  • Reduced the projected training time by limiting the scope to the performance issues that could actually be addressed through training
The changes to the project scope and learning objectives were signed off by the QM and the SME. The reduction of training time and target audience was received particularly positively, as it cut unnecessary project costs.

Design and Development

After completing the analysis, I:
  • Created a course blueprint outlining the identified performance issues with matching instructional activities and, where necessary, sources of theoretical information required to complete these activities.
  • Iteratively designed the e-learning module, beginning with a low-fidelity prototype in PowerPoint.
  • Guided the SME through reviews of the prototype and adjusted it based on the SME's feedback.
  • Developed the final module in Storyline, featuring software simulations and a branching conversation scenario.
The module featured:
  • A short introductory scenario to introduce the learners to the consequences of their choices
  • Worked examples of troubleshooting cases
  • Several troubleshooting simulations where the learners were required to use the simulated software environment and available information sources to correctly identify the customer's issue and take appropriate action. Upon completion of each scenario the learners received detailed feedback on their actions.
  • A branching scenario based on a conversation with a customer, where the learner needed to positively present the troubleshooting process and correctly set the customer's expectations regarding the outcome of the investigation.
The finished course was reviewed and tested by the QM, two SMEs who did not participate in the development process, and the CSA trainers. It was not possible to test the module with the CSAs themselves due to constraints on their availability.

Implementation and Evaluation

Following the rollout of the training, it was evaluated on the following levels:
  • Reaction of the learners
  • Changes in performance
  • Business impact
As the instructional designer, I designed the reaction evaluation survey and gathered the response data via the LMS. I collaborated with the Quality Department on gathering and analysing data for the performance change and business impact. The change in performance was measured as the decrease in wrong solutions and the increase in reports of issues via the correct channels. The business impact was measured in terms of reduced financial losses related to mishandled promotional issues.

Overall, the training led to statistically significant changes in performance as well as a considerable positive business impact. On the level of learner reaction, 90% of respondents stated that the training had provided them with valuable practical knowledge and 96% stated that they would be applying the new skills in their daily work. The comments from the learners highlighted the benefit and importance of the simulations and scenarios.

Due to the success of the training, and following consultation with Customer Support trainers and leads, it was included in the standard new hire training to ensure that the performance issues were not only remediated, but also prevented.

February 11, 2018

Pilling a Cat: How Training Actually Works

Whether you are an instructional designer or a customer of external training providers, you may succumb to the idea that the success of a training depends on:
  • Length (microlearning!)
  • Format (engaging videos!)
  • Digital delivery (online learning is the best!)
With all the buzz in the media, it's hard to resist this opinion. The idea of micro-videos seems more enthralling than pie charts. So here's a simple way to see if "learning nuggets" and "attention-grabbing videos" produce any return on investment:
  • Watch this short YouTube video about giving a pill to a cat.
  • Get a real cat and try giving it a pill (or a delicious snack they are not in the mood to eat).
Assuming you're new to the task, I'm quite sure that the outcome will depend not on your skills, but on the cooperation of the cat. In other words, it's not your mastery, but the ease of the problem that will define the outcome. As soon as you face a cat that deviates from the example in the video, you will most likely be at a loss (quite possibly a loss of blood, too). So much for the learning nuggets. 

The format of the "nugget" doesn't matter, whether it is a video, a drawing, or a drag-and-drop activity to arrange the steps in the right order. None of these will lead to improved performance post-training.

The reason is simple. The video contains the basic information: hold the cat's head, aim the pill at the back of its tongue, etc. There is nothing wrong with this information. It is useful and worth knowing before you approach a cat, as it can save you some time and the trouble of experimenting. But the information alone is not enough.

Witnessing a demonstration of an ideal process does not necessarily prompt deep-level processing. In fact, it may lead to a false sense of competency. It's like looking at abstract art and claiming that anyone can do it. On the other hand, engaging with a real cat in the real world gives you a reality check and stirs up a lot of questions, for example:
  • How hard can I hold the cat's head without causing damage?
  • Can a cat bite its tongue if I try to close its mouth?
  • If the cat is making noises, which can I ignore and which are the signs that I'm hurting the cat?
  • What do I do if the cat has mastered tongue-wriggling and pill-spitting quicker than I have mastered cat-pilling?
In the ideal situation, after grasping the basic idea of what we're supposed to do, we would venture forth and try to pill different cats at a gradually increasing level of difficulty. We would then reflect on our experience and seek ways to improve the outcome next time. This, and not the format of the presentation, is what would lead to true engagement with the subject matter and the acquisition of mastery.

Of course, one may ask: how would it be possible to achieve all of this in an e-learning module? I would say that this is the wrong question to ask, since it focuses on the format. Don't put the format before the goal. Look past isolated events and their formats. Consider performance improvement as a process spanning time and a variety of contexts. For example, are the newly trained "cat pillers" assigned to cat pilling, or do they do inventory? Do mentors observe their work, encourage reflection and provide feedback? Or do they schedule a perfunctory monthly meeting to listen to the learner's self-report of their mastery? Do the cat pillers have access to supplementary tools to aid their performance? Can they use these tools?

In short, there is nothing inherently wrong with using videos or providing information. What's wrong is stopping there. Whether we design or buy a training program, it must not stop at the dissemination of information. To achieve performance improvement, a full-scale training program would need to include:
  • Application of knowledge in novel contexts 
  • Realistic challenge 
  • Gradual increase of difficulty
  • Reflection and feedback
  • Continuation of the development post-training
  • Tools and processes that support performance post-training

January 8, 2018

Story-Based or Scenario-Driven?

Having a shared terminology is important as we use words to describe our reality, communicate ideas and achieve understanding. However, since many people step into the field of Learning and Development by following very different paths, not everyone in this sector uses a stable common language. Even the word "e-learning" can conjure up different images in the minds of different audiences. Add to this the need to communicate with non-L&D stakeholders who aren't highly interested in the semantics and the constant noise produced by marketing-oriented publications touting "story-driven action-packed gamified microlearning scenario-based videos" and you have a full picture of our messy reality.

The issue that I see particularly often is with the use of the words "story", "scenario" and "case study". Recently I had to review offers from e-learning providers who, naturally, boasted of developing "practical scenario-based modules", which upon closer inspection turned out to be the dreaded infodumps in disguise. While I do not aspire to lay the foundation for a new universal terminology, in this blog post I would like to reflect on these misused terms and take a look at what they mean and how we can tell them apart.

Story

Let's start with the easy one. We all know what a story is - a narrative with protagonists and antagonists, a beginning, a climax, etc. Stories can be told in different ways and employ different techniques to raise the audience's interest. However, a story has a definite structure that is independent of the audience's actions, thoughts and desires. It follows its predefined path from start to end.

Stories can be educational, enlightening, and inspiring, but when it comes to training in the sense of improving performance and skills, stories are not enough. For example, I can tell you a story about how I designed a training. While you might get some ideas from it, if you're not an experienced instructional designer, this story will not really teach you how to become one, and it will not have a lasting impact on your performance. In essence, a story can serve as a frame within which a training is structured, but we still need activities, practice and feedback to achieve the training goals.

Case Study

Firstly, to add more complexity to the subject, a case study as a learning method can be confused with a case study as a research method. Secondly, I often see novice instructional designers who entered the field as SMEs writing stories and then christening them "case studies". For example, a novice designer may write a story about a patient who was misdiagnosed in a hospital, what happened as a result, and what should have been done instead. This is not a case study in the slightest.

A case study presents the learner with a realistic challenge or question and contains supporting case materials, documents and data to be analysed - the actual content will depend on the instructional purpose. It can be, and usually is, based on a story, whether real or realistically imagined, but the story is used to provide context and realism for the task. The solutions are sought by the learners and later discussed with a mentor or in a group setting. Case studies are best used for challenges that don't have very specific solutions and where analytical thinking, argumentation and the evaluation of different perspectives are important. For instance, using the previous example of the patient: a case study would give learners the patient's history and then ask them to come up with a diagnosis and justify it with evidence from the case materials.

Scenario

A scenario is often the most elusive concept to describe (especially since it can seem synonymous with a story), so in this case I will borrow the definition from Ruth C. Clark (2013, p. 5):

"Scenario-based e-learning is a pre-planned inductive learning environment designed to accelerate expertise in which the learner assumes the role of an actor responding to a work-realistic assignment or challenge, which in turn responds to reflect the learner's choices."

As we can see from the definition, what makes a scenario different from a case study or a story are these factors:
  • The learner has an active role
  • The learner solves a realistic work challenge
  • The environment responds to the learner's actions
In contrast, a story or parable does not include the learner as an actor; they simply observe the events that unfold. A case study, while asking the learner to work on a realistic task, does not allow them to see the results of their proposed solutions. The results can be hypothesized or imagined, but never really experienced. A scenario, however, presents the learners with choices, challenges and realistic consequences or responses.

I would note here that in my experience, scenarios are very often associated, sometimes almost exclusively, with "branching" and "dialogues". However, "branching" is a purely technical term that usually makes sense when a scenario is developed in slide-based (or screen-based) software, and dialogues are just one example of a work-related challenge. Alternative scenarios could be making a perfect cup of coffee or carrying out a medical procedure.
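To make the "branching" mechanics concrete, here is a minimal illustrative sketch in JavaScript - purely for explanation, not how Storyline authors scenarios - of a branching scenario as a plain state graph: each node has a prompt and choices, and each choice points to the next node. All node names and text are invented for the example.

```javascript
// Illustrative sketch: a branching scenario as a simple state graph.
// Each choice a learner makes selects the next node, so consequences
// are experienced rather than merely described.
var scenario = {
  start: {
    prompt: "The customer says the discount code did not work. What do you do?",
    choices: [
      { text: "Apologise and promise a refund immediately", next: "overpromise" },
      { text: "Ask for the code and investigate", next: "investigate" }
    ]
  },
  overpromise: {
    prompt: "The refund was not justified; the company loses money.",
    choices: [] // terminal node: the consequence of the learner's choice
  },
  investigate: {
    prompt: "You find the code expired yesterday. The customer accepts the explanation.",
    choices: []
  }
};

// Walk the scenario by applying a sequence of choice indices.
function play(scenario, choiceIndices) {
  var node = scenario.start;
  choiceIndices.forEach(function (i) {
    node = scenario[node.choices[i].next];
  });
  return node.prompt;
}
```

Whether the graph is drawn as slides, a dialogue, or a coffee-making simulation is presentation; the responsive structure is what makes it a scenario.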

Now What?

Having said all that, I have to admit that for people like me, who appreciate the power of radical clarity, it is often natural to engage in petty discussions about whether a "true" scenario should be branching or not, or whether a short video is micro-, nano- or µ-learning. Such discussions, particularly on social media where argumentation should be tastefully omitted for the sake of brevity and witticism, are very enjoyable but, like many pleasant things in life, rather unhealthy. The practical purpose of terminology is not to rigidly label every concept in our reality (or die trying), but to facilitate common understanding and the ability to look beyond attractive labels and see the true nature of "scenario-driven interactive experiences", as not all of these are created equal.


References:

Clark, R. C. (2013). Scenario-Based e-Learning: Evidence-Based Guidelines for Online Workforce Learning. Pfeiffer.

August 31, 2017

My E-Learning Design Process: Taking out the Trash

It will sound strange, but it's true: the most fascinating part of my life in Germany is recycling. To be more precise, the sorting of trash. Wait, don't go yet, this will actually be about e-learning! To give you an idea of the importance of this question, here's a photo of a document I received in the mail some time ago. If you're curious, it contained the news about the new color of our "bio-trash" bins. Serious business.

So, do you receive important news about your trash very often?

Of course, when I saw the title ("Keep It or Toss It") of this week's ELH Challenge, I could not resist. I had to come up with an interaction dedicated to the complex intricacies of the trash sorting. You can see my submission here.

In this post I'd like to talk about the making of this interaction, focusing on two points:
  • The thought process behind the making of this interaction and some instructional design.
  • A fast and efficient way of making a drag-and-drop interaction without the "Freeform" option.
This is a reflective post and not a tutorial. I often enjoy reading reflective posts by designers and developers, as it helps me understand their thoughts and approaches to the task at hand. So, I hope you will enjoy it too, particularly if you're at the beginning of your e-learning development journey. For your convenience, I've summed up some "lessons learned" after each part.

Instructional Design

Yes, there's actually some thought and not only humor in this small piece. As you will notice, it doesn't have any theory or "help resources" included to support the learner. This is neither due to a lack of theory (there are plenty of schemes and manuals), nor an accidental omission.

In fact, I first thought about adding an explanation of which trash goes where. But my intention was not to test people's memory. Instead, as is my usual approach when designing training, I wanted users to learn by making assumptions and testing them. In real life, you probably wouldn't read a manual about taking out the trash. Instead, you'd separate it however you feel is logical and be done with it. Thus, the only difference from life in this case would be the opportunity to get feedback on your assumptions.

Speaking of which, my original intention was to include the most confusing trash items. In fact, I selected 8 items at first, but decided to cut that number in half, considering that ELH Challenges are usually short.

In short: 

  • Create life-like contexts and tasks
  • Devise activities based on popular misconceptions

Design

It took me slightly less than 3.5 hours to create the "course" from nothing to finished.


This may seem like a huge amount of time to spend on something as simple as 6 trash cans, a draggable object and some text. That's absolutely true, but only if you already have a solid idea or a prototype to work from. Getting to that prototype is what's complicated and requires time. The biggest chunk of time (around 2 hours) was spent on ideation - coming up with an idea, scouting for available assets, choosing fonts and colors, and deciding on the final look. The rest was spent on creating assets, slides and interactions, as well as writing feedback, publishing, bug-zapping, and, most importantly, admiring the end result.

When I work on ELH Challenges, I do some formstorming and play with different ideas on paper before choosing one and developing it further. I have to say, unless a brilliant idea suddenly dawns upon me from the start, the more I engage in formstorming, the better the result. So, the time spent on it should not be seen as a waste. This may be an obvious statement, but if you're locked in a "rapid e-learning development" environment, it's a hard opinion to stick to.

Formstorming is something I've learned in the Fundamentals of Graphic Design MOOC. As Lupton and Phillips (2015, p.13) define it: "Formstorming is an act of visual thinking - a tool for designers to unlock and deepen solutions to basic design problems. [...] Formstorming moves the maker through automatic, easily conceived notions, toward recognizable yet nuanced concepts, to surprising results that compel us with their originality." There are different ways to do it, but the approach I use most often is to create as many iterations of a subject as possible. If you're interested, this is a great example of 100 iterations of a letter A.

In this particular case, however, everything was defined by the trash cans. I made these directly in Storyline, so the rest of the module had to match in form and style. Still, even with this quite specific goal in mind, there were some questions to mull over:

  • How do I make sure that the user identifies the material of an object correctly? Since the assets are not photographic, it might not be obvious whether a bottle is made of glass or plastic.
  • Where to place the drag object?
  • Where to put initial instructions?
  • Where and how will the feedback appear?
  • What about a progress indicator?
  • Should I add sound effects?
  • Should I limit the number of mistakes allowed?
  • Fonts?
  • Colors?
And probably some more. I went through approximately 10 different slide design variations before settling on the final version. For example, I tried adding a wall behind the trash cans and writing the object descriptions in a graffiti-like font (it didn't work well with a "cutesy" flat design), or adding a progress tracker.

In short:

  • Formstorming is not a waste of time, because...
  • The more design questions you answer before starting with the actual development, the faster you'll develop.
  • You can formstorm in Storyline, but I recommend starting on paper first.
  • Next time someone asks you, "But how hard can this be?!", you can show them this post.
  

Development

As I often say, once you know what you're doing, development is easy. In this case I followed my own advice and finalised one activity slide before copying it several times and making small adjustments.

I didn't use any special tricks to create the activity. It is made from scratch, but you can achieve the same effect with the "Freeform" option. I prefer my own triggers, unless I'm really pressed for time (mostly because I feel more in control). The triggers are very simple:


"What's that with the layer 'Object'?", you might ask. Excellent question. The layer "Object" actually shows the description of the trash item:

Object "name" and description on a separate layer

The purpose here is to automatically get rid of this text when the feedback layers appear (instead of hiding it on each layer's timeline). I sometimes do this when I need to either hide a lot of objects at once, or show something only once. While this is not hugely beneficial for an interaction with just two layers, if you have to do this for, let's say, 10 layers, you begin to see the benefit.

The only other point I would highlight here is that it might be tempting to treat each trash can as a separate object and set up the triggers for each individually. In this case, however, as you most likely noticed from the triggers, I used two hotspots instead: a small one for the correct can and a big one, spanning the slide, for all the others:

Green (correct) hotspot is placed over the red (wrong) one.

This way it was easier to create additional slides by duplication, as I didn't need to re-do the triggers at all. Instead, I simply moved the "Correct" hotspot to the right bin. An optional touch was to hide the incorrect bins on the feedback layers, but this was also easily adjusted.
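Under the hood, the two-hotspot trick is just a layered hit test: check the small "correct" rectangle first, and treat any other drop on the slide as wrong. Here is a minimal JavaScript sketch of that logic (Storyline handles this through triggers, not code; the coordinates and function names are invented for illustration):

```javascript
// Point-in-rectangle hit test.
function inRect(point, rect) {
  return point.x >= rect.x && point.x <= rect.x + rect.w &&
         point.y >= rect.y && point.y <= rect.y + rect.h;
}

// The small "correct" hotspot sits on top of a slide-wide "wrong" one,
// so only two checks are ever needed, no matter how many bins there are.
function evaluateDrop(point, correctHotspot, slideHotspot) {
  if (inRect(point, correctHotspot)) return "correct";
  if (inRect(point, slideHotspot)) return "incorrect";
  return "no drop"; // dropped outside the slide entirely
}
```

Duplicating a slide then only means moving the "correct" rectangle over a different bin; the evaluation logic never changes.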

In short:

  • Consider moving objects from the base slide to a layer, if you want to consistently hide them when other layers appear.
  • Avoid extraneous work whenever possible (do you really need to have 6 drop targets where 2 are enough?).
  • My advice from this post is actually good. :) 

References

Lupton, E. and Phillips, J. C. (2015). Graphic Design: The New Basics. New York: Princeton Architectural Press.



Liked this post? Hated it? Want to hire me or get in touch? Let me know in the comments below or ping me on LinkedIn. I also do freelance projects.


August 29, 2017

Freebie: PowerPoint Chat Bubbles

Because one can never have too many chat bubbles. Click here to download the PowerPoint file, which includes all of the following speech bubbles:


The file was created by me and you can, naturally, use it for any personal or commercial purpose. All objects are editable: you can scale them, change fill and outline colors, and save them as pictures. For text, I recommend adding separate text boxes and grouping them with the bubbles.

The fonts seen in the image are not included in the file. In case you're interested, the fonts are (from left to right and top to bottom):

August 27, 2017

Storyline Tips for Speed, Precision, and Your Sanity

In this post I want to share some tips that have made my development process more efficient and precise. There are many tips about saving time and effort in Storyline; most likely, you've already heard about the benefits of the format painter and slide masters. Therefore, I tried to focus on things that are rarely mentioned or were not obvious to me at the start.

Number your objects

We all know that naming layers and objects on the timeline is a good idea, so that you know which one of the 10 "Rectangles" is your custom-made "Submit" button. Sometimes, though, giving your objects numbers instead of names may be more efficient.

For example, let's say you are working on an e-learning filled with complex terminology like "agranulocytosis" or "propylthiouracil" or some other Lovecraftian-sounding language. And let's say you need to provide additional explanation for three of these complicated concepts. You create three buttons for the learner to click to see the matching layers with additional content. Or maybe you create a drag and drop interaction where you need to match the complex terminology with its description.

It may be easier to match "Drag 1" to "Drop 1", rather than trying to remember if "Shampoo" is "Step 3" or "Step 4".


In this case, instead of using the actual terminology from the course, I use numbers. This helps me save time not only on typing or copy/pasting, but also on reading and comprehending what is what. I find it much easier to match "Layer 1" to "Button 1" than "Layer Triiodothyronine" to a similar clickable object. Lastly, it gives a much cleaner trigger list for double-checking that everything matches.

Ahhh. A treat for the Sheldon Cooper within you.


Number your variables

The same can be true for variables. I prefer to give them numbers, with a one-word description if necessary. For example: 3_33_L1Seen. This tells me that this variable tracks whether or not the user has seen Layer 1 on slide 3.33. The reason I number my variables by scenes and slides is that if I come back to the project after a break or an interruption, I can easily remember which variables are related to the slide I'm working on (since I can always see the slide number, but I may not remember exactly what the variable name was: was it L1Seen or Seen1L?).

Since I know that there are 6 clickable objects on the slide, I can quickly check if they are all accounted for by looking at the numbers. Using slide numbers in variables helps you find them in the list, if you're using a lot of them.

In addition, it makes the "sanity check" quicker. When I need to double-check if everything is accounted for, I don't need to read the names of the variables and correlate them in my mind with the named objects. I compare numbers with numbers instead. Faster and less straining on my cognitive resources.
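A side benefit of numbered variables is that they can be processed mechanically. Storyline exposes variables to JavaScript triggers via GetPlayer() and GetVar(); in the sketch below a mock player object stands in for the real one, and the helper function is hypothetical, but it shows how the numbered scheme lets a single loop check all layers on a slide:

```javascript
// Mock of Storyline's player object, for illustration only; in a real
// "Execute JavaScript" trigger you would call GetPlayer() instead.
var player = {
  vars: {
    "3_33_L1Seen": true, "3_33_L2Seen": true, "3_33_L3Seen": true,
    "3_33_L4Seen": true, "3_33_L5Seen": false, "3_33_L6Seen": true
  },
  GetVar: function (name) { return this.vars[name]; }
};

// With numbered names like "3_33_L1Seen" ... "3_33_L6Seen",
// one loop replaces six separate checks.
function allLayersSeen(player, slidePrefix, count) {
  for (var i = 1; i <= count; i++) {
    if (!player.GetVar(slidePrefix + "_L" + i + "Seen")) return false;
  }
  return true;
}
```

With descriptive names like "IntroSeen" and "SummaryDone", no such loop is possible; the numbering is what makes the variables enumerable.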

Conveyor-belt approach to changes

Let's say you need to change 3 things on 5 slides (and you can't do this with master layouts). What I've noticed many people tend to do (and what I did myself before) is to go slide by slide and make all the adjustments on each. While this may give you the satisfaction of having each slide done before moving on to the next, I find this strategy inefficient. Think about it: you have to keep in mind at least three things. You also need to make sure you don't forget any of them, even if someone distracts you. Then you actually need to do them, which also requires attention and cognitive resources. It is actually hard, particularly if you're working in a busy open office.

When I "conveyor-belt" my tasks, I can focus on one task at a time and don't need to mentally switch between them. This way I am more likely to get into the "flow", which gives me a faster pace. So, I go through all the slides, make one change, and tick it off the list. Then I go through the slides again, making another change, and tick that one off too. It sounds like it would take more effort, but I find this much easier purely from the cognitive load perspective.

Create one thing, test it, then copy it

This can be said about pretty much anything: buttons, exercises, typical slides, quizzes - anything that you may want to reuse. I used to create all of the elements on a slide at once. For example, if I had a menu with 4 buttons, I would make all four of them and then test how they all worked. These days I create one button and 3 placeholders. Then I make sure that I like it and that it works as intended (because if it doesn't, I'm about to create a lot more work for myself). Then I either copy/paste the finished object or use "Format Painter" on the placeholders.

Copy/pasting works particularly well for copying something to another slide. When you copy an object to another slide, it inherits:
  • Position on the slide
  • Position on the timeline
  • Formatting
  • Triggers 
  • Animations


It's not immediately obvious, but you can also copy/paste layers with their triggers, including into other slides.

Precise position

When my placeholders are on the same slide, copy/pasting will preserve the formatting and triggers, but not the position. But let's say my placeholders were perfectly arranged and now I want the copies of the actual objects to appear in exactly the same place. Let's also assume that my finished object has triggers I want to keep, so using the Format Painter is not that efficient.

In this case, instead of copying and then manually dragging the elements across the slide, I use the Size and Position dialogue. To open it, select any object and press Shift+Ctrl+Enter. I note the coordinates of the placeholders and then enter the same numbers for the position of the copies. I can then delete the placeholders by selecting them on the timeline. 

No more eyeballing, mouse-wrangling or nudging. Quick and easy.


Precise crop

This option is not obvious at all, and it took me a while to notice it. If you need to crop an image to the pixel (and you want to use Storyline rather than an external tool), don't waste time and patience on the Crop button. Instead, select the image, press Shift+Ctrl+Enter to open the Size and Position dialogue again, and enter the crop values there, under the Size tab. Pixel-perfect crop with a synchronous preview.

Badum-tshh! A word of warning: this does not work well with flipped/mirrored images, so crop them before you flip them.

"Set as Default…"

Want to make sure that all of the text boxes you create from this point onward have an 8% transparent background fill, #333333 text color and 30-point margins on all sides? Create one element and format it as you wish. Then right-click it and select "Set as default..." in the menu. From now on, all text boxes that you create will inherit these settings.

The "default" settings are separate for shapes, text boxes and buttons.

"Change Shape"

Let's say you made a square shape and added all sorts of fancy stuff to it, but now you need it to have rounded corners. Instead of deleting the old object and recreating everything from scratch, you can simply use the Change Shape option in Storyline, which works exactly like the same function in PowerPoint. The only thing you may need to adjust is the shape size.

Select a shape, click Format in the Drawing Tools and choose Change Shape.

Note that this may not always work well with objects that have multiple states. For example, if you change a square button into a rounded square and adjust the size of the corners, the corner sizes will not be inherited by all states. Change wisely.



Liked some ideas? Hated some ideas? Found a typo? Let me know in the comments or reach out to me through my fancy (hey, there is "fancy" in "infancy"!) new site.

August 7, 2017

The world doesn't need a hero. All it needs is an instructional designer.

"Can an SME be an instructional designer?", "Do you need an education, or can you just learn Storyline?", "Who needs those instructional designers anyway?" I've been seeing a lot of these questions recently, so here are my unsolicited two cents...

To answer these, I would look at the definition of instructional design. With some variations, it will say something like: "Instructional design is a systematic and systemic application of scientific knowledge to improve learning processes and outcomes". If you have ever taken a course on instructional design, you will most likely remember either this, or spending a lot of time discussing what is "systematic" and what is "systemic".

So, there are two questions to ask when choosing an instructional designer (or deciding to become one). Does the candidate (or you) have the scientific knowledge? Can they (or you) apply it in a way that

  • makes sense,
  • takes into account many interconnected factors, and
  • leads to measurable results (i.e. more than the mere existence of a training program)?

The obvious, but for some reason elusive, point is: you need both a solid theoretical grounding and a knack for its application. It may sound bizarre, but in my experience it is hard to be practical without a good knowledge of theory. Without it, it is hard to take a critical view of reality. For example, would those "10 ways to make your e-learning better" actually make it better? And why? Education alone won't get you anywhere, but practice alone will get you someplace random.

As for the usefulness of instructional designers... It is tempting to create solutions which replace either the knowledge or the skill of applying it. For example, I've often seen such interesting internal documents as "pedagogical styleguides", whose purpose is to document a company's standards of training development. While this can be a nice idea from the point of view of process documentation, more often than not these standards do not make sense, particularly when they deal with something that cannot actually be standardised. For example: "Company X uses exclusively the constructivist approach" or "We use only andragogical principles".

Apart from sounding funny to an informed reader, these standards are impractical. Pedagogical theories are not "right" or "wrong"; they are concerned with different aspects of learning. One cannot "choose" one theory and abandon all the others. Similarly impractical are suggestions to "include a knowledge check every 10 slides" or "not include more than 5 learning objectives". What is the point? What is the purpose?

In other words, professional instructional designers are not outdated, but extremely important. And no amount of guidelines or instructions to have a "20 questions quiz for each module" will replace them.