The main goal of instructional design is to help people bridge the gap between their current knowledge and skills and the knowledge and skills they need.
In this article, we will see how three of the most widely used instructional design processes (ADDIE, SAM, and Action Mapping) tackle a company's customer service problem.
Let’s get started.
Dattech, your client, is a small internet service provider.
The company provides better internet at more affordable prices than competitors, but many clients are leaving because of poor customer service.
Dattech hires you to create a solution for the problem.
ADDIE (Analysis, Design, Development, Implementation, and Evaluation) is the instructional design model that most people use. It covers the entire process of an instructional design project.
Originally, instructional designers followed the steps of ADDIE sequentially: analysis led to design, design led to development, and so on.
However, modern instructional designers usually work across the stages concurrently and make adjustments throughout the project.
Let’s apply ADDIE to our customer service problem.
Analysis determines whether training is necessary and provides information about the project, the people who need help and training, and the organization itself.
Ideally, you begin with a training needs assessment. This shows whether training is the best solution for the performance problem. If so, you continue analyzing by conducting interviews, observing employees, and giving out questionnaires.
You also speak to focus groups that include the people you are trying to help and their supervisors.
After talking with your client, the manager of the customer service department, and some of the customer service staff, you gather information about the employees' performance and the training they currently receive.
All of this information indicates that the current training isn't preparing the employees.
Now that you have identified the problem, you can start creating a solution.
During the design phase, you take the information from your analysis and use it to create a solution for the project. You then present this solution to the stakeholders.
You include all of the relevant data to show why you chose that solution. Once the stakeholders approve your plan, you work with subject matter experts (SMEs) to design the instructional material.
The content that you produce during design depends on what you want the final learning intervention to be.
If you plan to develop eLearning activities, then the output from this phase might include a script or production-ready storyboard.
If you think a face-to-face intervention like a training seminar is the best solution, then the output of design may include content for a facilitator guide and participant workbook.
You decide that the best solution is a learning experience that replicates a phone call with the client. In the learning experience, the learner will have to identify the internet service issue based on what the client says.
You set your learning objectives:
By the end of the training, employees will be able to recognize the potential causes of internet service problems.
By the end of the training, employees will be able to suggest solutions for internet service problems.
You create a proposal for your eLearning experience. In the experience, the learner will see an image of a wifi setup. A client avatar on the screen complains about a wifi issue.
The learner clicks on the device they think is the source of the client's issue (laptop, router, modem, etc.). Every time the learner clicks on a different device, the client avatar gives them feedback about whether they are close to identifying the problem.
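The interaction logic behind this proposal is simple enough to prototype before committing to an authoring tool. Here is a minimal sketch in Python; the device list, the faulty device, and the "warmer/colder" feedback rule are illustrative assumptions, not details from the actual course:

```python
# Minimal sketch of the click-and-feedback interaction from the proposal.
# Device names, the faulty device, and the hint rule are illustrative.

DEVICES = ["laptop", "wifi extender", "router", "modem"]
FAULTY_DEVICE = "router"  # the device causing the client's issue

def avatar_feedback(clicked_device: str) -> str:
    """Return the client avatar's response to the learner's guess."""
    if clicked_device == FAULTY_DEVICE:
        return "Yes! Resetting that fixed my connection."
    if clicked_device not in DEVICES:
        return "Hmm, I don't have one of those."
    # Simple 'warmer/colder' hint based on position in the wifi setup.
    distance = abs(DEVICES.index(clicked_device) - DEVICES.index(FAULTY_DEVICE))
    if distance == 1:
        return "You're close, try again."
    return "No, my connection is still down."

print(avatar_feedback("laptop"))
print(avatar_feedback("modem"))
print(avatar_feedback("router"))
```

A throwaway prototype like this lets you test the feedback wording with your SMEs before building the real interactions in an authoring tool.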
You show this proposal to your stakeholders and include all of the data and information that explains why this eLearning experience is the best solution.
After a short review, they approve the project.
You write out scripts and create a storyboard for the eLearning experience. You work closely with your SMEs, senior employees in the customer service department, to make sure that everything you put into the storyboard is accurate.
You finish the storyboard and get the stakeholders to sign off on it. Once they do, you start developing the final product.
You develop the storyboard into the final product in an authoring tool like Articulate Storyline or Adobe Captivate.
During implementation, you get your eLearning experience in front of your audience.
In some cases, the instructional designer uploads the eLearning experience to an LMS (Learning Management System), enrolls the learners, and notifies them that the training is available and required.
Your main goal is to ensure that the rollout goes smoothly with no technical issues.
You upload your eLearning experience onto the company’s LMS and go through the entire learning experience yourself several times, making sure that every action, interaction, and piece of dialogue works perfectly.
Next, you enroll the employees and send them an email saying that the training is ready and required.
To determine how effective your eLearning experience is, you can use the most common evaluation framework, Kirkpatrick's four-level model of evaluation, which asks: Did the employees enjoy the training (Reaction)? Did they learn the material (Learning)? Are they applying it on the job (Behavior)? And did the business outcome improve (Results)?
Modern instructional designers incorporate evaluation at every stage of the instructional design process and adjust based on audience response and feedback from the stakeholders.
The final evaluation determines whether the instructional content is accomplishing its goals.
You observe the employees for a few days. They report high levels of satisfaction with the learning experience. Customer satisfaction surveys show that clients are much more satisfied with their service.
Three months later you check in again. Customer retention has dramatically improved.
Companies are often less interested in analysis and evaluation and more focused on design and development, which seem more tangible.
However, measurement is key to an effective instructional strategy. Push for analysis and evaluation if you can, but be prepared for these elements to be ignored.
If you want to learn more about ADDIE, you can check out this article.
But for now, let’s move on to SAM.
The Successive Approximation Model (SAM) focuses on rapid prototyping and testing multiple solutions for learning interventions.
SAM is less vulnerable to delays in development than ADDIE.
With ADDIE, if your analysis misses any important information, then you might have to go back and redo parts again.
SAM develops and tests multiple prototypes, using the information from each prototype to develop the final learning intervention.
During this initial analysis, you gather information about the project by asking the client pointed, specific questions.
This analysis is usually much less in-depth than in the ADDIE model. The goal is to get the minimal amount of information you need to start suggesting solutions.
The idea is that there are often hidden barriers and requirements that don’t appear until you start proposing solutions.
For example, a client might not mention that they have a strict budget until your solution goes over it.
During your initial conversation with the client, you learn that Dattech already provides training to its customer service employees, but it doesn’t seem to help.
You ask a few more questions and prepare for the design meeting.
In SAM, the design phase begins in a meeting called a “Savvy Start” where you meet with all the major stakeholders, like recent learners, subject matter experts, and the client.
Your goal is to suggest as many solutions as possible, then hear from all the stakeholders why they won’t work.
Every rejected idea gives you more information you can use to find a workable solution.
During the Savvy Start, someone mentions that Dattech has been recording client calls for months but no one has done anything with the data.
You suggest turning those conversations into an interactive quiz to give employees more practice responding to client calls.
The stakeholders accept this idea so you move on to the development stage.
In SAM, development is a three-stage process. At each stage, you try to come up with a unique solution to the same problem. Then, using the data from all three prototypes, you create the final product.
You pull out the most common questions from the recorded calls and make them into a multiple-choice quiz.
More experienced employees do quite well but newer employees score poorly. You realize that the current training material doesn't provide opportunities to practice analyzing client problems.
You get started on the second prototype.
You ask two experienced customer service employees to role-play as clients. They ask newer employees questions from the quiz you made in prototype one.
If the new employees make a mistake, the experienced employees correct them and let them try again.
The new employees struggle at first, but then they start to answer the questions quickly and accurately. The feedback from this prototype is positive. The employees appreciated getting an opportunity to practice.
You get to work on prototype three.
Ideally, you want to create an entirely new solution at this stage. But you don’t have to if you have lots of positive data from the first two prototypes.
You decide to modify the original training so that the employees have to actively practice using the information right after they learn it.
You have some new employees go through the original training, then you have them take the audio quiz from prototype one. Finally, they role-play client calls with an experienced employee. Now, their answers are accurate and timely.
You take the results back to your stakeholders. The people who went through the training felt like the multiple-choice quiz didn’t add anything to the experience. You decide to scrap it.
Once you have the data from all three prototypes you combine them to create the final product.
You take the instructional material from the original training and divide it into sections.
After each section, the learner is put in a branching scenario where they are on a call with a virtual client and they have to choose the correct answers to the client’s questions.
If they make a mistake at any point, a digital manager pops up to correct them and give advice.
Because you were testing and iterating at every step of the process, you can walk away assured that this solution will solve the customer service problem.
For larger projects, there would be another stage of development to ensure that your solution exactly meets the needs of the stakeholders.
However, that isn't necessary for our example.
Let’s move on to Action Mapping.
Action Mapping has a different focus. If you recall from the ADDIE example, clients often are less interested in the analysis or evaluation stage of development because they can’t see how it helps accomplish the goal.
Action Mapping fixes that problem by putting the business goal front and center and then determining the concrete actions that individuals can take to ensure that goal is reached.
When you first meet with the client, you want to get as much information about the project as you can, then schedule a kickoff meeting with the client and stakeholders.
During your initial conversation with the client, she tells you that they are having a performance problem with their customer service staff.
She wants you to create an online course to train her staff on how to respond to customer questions.
You avoid committing to course design because you don’t know whether that will solve the problem and request a kickoff meeting with the client and the key stakeholders.
In the kickoff meeting, you want the client and at least two stakeholders to participate.
One stakeholder should be someone who is feeling the pain from this performance issue, like the department manager.
The other stakeholder is the subject matter expert, someone who is intimately acquainted with the role of the people who need help.
The purpose of the meeting is to learn about the problem, establish the business goal, and map out actions that will help people accomplish that goal.
You meet with the client (Dattech's CEO), the manager of the customer service department, and the SME, in this case a customer service representative with years of experience.
The CEO explains that the company provides better internet service than its competition, but when customers have technical service issues, 40% of them switch providers.
This compares poorly to the industry average of about 20% customer loss for technical issues.
The manager explains that none of the clients’ issues are particularly complicated. The problem is that the wait times are so long.
Sometimes clients are left on hold for 30 minutes or more.
Once you have an understanding of the problem you want to work with the client and stakeholders to establish a goal for the project.
In Action Mapping, the business goal follows this format:
A measure that we already use will increase/decrease X% by a specific date as people in a specific group do something.
You know that Dattech loses twice as many customers as the industry average (40% versus 20%), and you know that the main reason is long wait times, so you establish the goal:
Customer loss will decrease 20% in three months as customer service representatives respond more quickly to customer questions.
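One thing worth pinning down with stakeholders is what "decrease 20%" means, since a relative decrease and a drop of 20 percentage points give different targets. A quick sketch of both readings, using the churn figures from the kickoff meeting:

```python
# Churn figures from the kickoff meeting.
current_churn = 40   # 40% of customers with technical issues leave Dattech
industry_avg = 20    # roughly 20% industry average loss for technical issues

# Reading 1: "decrease 20%" as a relative decrease in customer loss.
relative_target = current_churn * (1 - 0.20)   # 40 * 0.8 = 32.0

# Reading 2: "decrease 20%" as a drop of 20 percentage points.
point_target = current_churn - 20              # 40 - 20 = 20

print(f"Relative reading: churn falls to {relative_target}%")
print(f"Percentage-point reading: churn falls to {point_target}%, the industry average")
```

Writing the intended target down explicitly, as the Action Mapping goal format encourages, avoids disagreement when you measure results at the three-month mark.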
The client and stakeholders agree that this is a good business goal. So you begin the next stage of the process, creating the action map.
When you construct an action map the business goal is in the center. The surrounding nodes are all the observable actions that people can take to accomplish the business goal.
It is essential to keep everyone’s focus on observable actions. SMEs often focus too much on what employees should “know” even if it doesn’t help the business goal.
Once you have identified the essential actions that employees need to take to accomplish the business goal, the next step is to figure out what is stopping them.
After some more discussion, you and your stakeholders define four actions that employees should take to accomplish the business goal.
The question now becomes: "Why aren't people performing those actions?"
Is the problem caused by the environment? A lack of knowledge or skills? Or motivation?
After some more questioning, the senior customer service representative points out that many of the new employees struggle to troubleshoot customer problems even though they all receive training.
They often Google solutions or ask more experienced employees.
You look at the training and see that it is quite long and doesn't require the learner to think about how to use the information. Now you have all of the necessary information to develop a prototype.
You want to work with SMEs to develop a prototype with realistic practice activities and questions that the learner will encounter on the job.
This is also where you decide the format of your learning intervention. It could be a one-off in-person training event or an in-depth eLearning experience.
The choice of format will depend on the project.
You decide to create an eLearning solution that is a gamified simulation of the customer service role. This learning environment also mirrors the work environment much better than the previous course content.
The learner will have to assist client personas. If they accurately fix the problem for each client they complete the training.
If they fail to help a client, they are shown the results of their actions. If they fail three times, they have to start from the beginning.
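The completion and restart rules described above amount to simple state tracking. Here is a minimal sketch of that logic; the three-failure threshold comes from the design, while the function shape and the idea of representing each client interaction as a pass/fail outcome are illustrative assumptions:

```python
# Minimal sketch of the simulation's pass/fail progress rules.
# Three failures send the learner back to the start of the training.

MAX_FAILURES = 3

def run_simulation(attempt_results: list[bool]) -> str:
    """Walk through client interactions; True means the learner
    fixed that client's problem, False means they failed to help."""
    failures = 0
    for solved in attempt_results:
        if solved:
            continue
        failures += 1
        # In the real course, the learner is shown the consequences
        # of the failed action at this point.
        if failures >= MAX_FAILURES:
            return "restart"  # back to the beginning of the training
    return "complete"

print(run_simulation([True, True, True]))
print(run_simulation([False, True, False, False]))
```

Keeping rules like this explicit in the storyboard makes them easy for SMEs and stakeholders to sanity-check before development begins.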
You also realize the employees need a resource to help them answer client questions. So you take the original training content and turn part of it into a searchable knowledge base.
Now you test the prototype with the intended audience and your stakeholders and get feedback.
Everyone likes the gamification element and the employees like the idea of the searchable knowledge base.
You ask some of the new employees to test out the new eLearning experience. They appreciate being able to practice answering questions and analyzing client problems.
The CEO gives you the green light to develop the project.
The next step is to create a project plan that includes the deliverables, approach, and current findings. You get your stakeholders to sign off on this project plan.
Finally, you develop the project in batches.
You create a project plan, which the CEO approves. Then, you create the different client persona dialogues and their accompanying results. Once you finish each persona, you give it to your SMEs for feedback and advice.
As they review and approve each persona, you get started on the next batch. Once the project is done, you present it to the intended audience.
After a few weeks, you talk to the customer service staff and gather stories about whether the new training helped them.
And that is it for Action Mapping. If you want to hear a more in-depth explanation of the Action Mapping Process, then you can check out this talk.
These are the three most used instructional design processes. While we on the Devlin Peck team prefer Action Mapping, it's a good idea to know how each process works so that you can use them effectively.
The priority for each one is different:
ADDIE focuses on slower development for more traditional courses; SAM focuses on rapid iteration and testing for solutions; Action Mapping focuses on defining and training observable actions that accomplish business goals.
Being able to use the best practices from each will strengthen your abilities as an instructional designer.
This article covered the different instructional design processes for projects. If you want to learn how to make an effective eLearning experience, then check out this article.