A Guide to Designing and Delivering Learning Events
BY JONATHAN KLANE
Training in a Pandemic
Constraints breed creativity. Limits force us to think of new ways to accomplish what’s necessary and to try harder to come up with solutions to the problems at hand. The COVID-19 pandemic has placed constraints on many aspects of our lives, including workplace training. We are limited in whether and how we can offer workplace learning or training, and many of us have been forced to abandon classroom and live learning and migrate online.
The logistics of in-person learning events are difficult in a pandemic, but perhaps a greater challenge is figuring out how to facilitate online learning that promotes engagement and effective workplace learning. Most training isn’t live these days; it’s more likely asynchronous. Learners are logging on whenever they like, or more likely whenever they can afford the time. Remember, no one has “free” time for learning—especially if its value isn’t obvious. How can industrial hygienists and occupational and environmental health and safety professionals get creative with learning in a pandemic?
LEARNING NEEDS ASSESSMENTS As author Stephen Covey says, “Begin with the end in mind.” For trainers, this means performing a learning needs assessment (LNA). A learning or training needs assessment is probably the most important yet most frequently skipped step, despite OSHA calling for it in the guidance document “Training Requirements in OSHA Standards.” Those who omit an LNA do so at their own peril and that of their learners. Imagine a site where workers aren’t engaging in safe behaviors. The manager asks the trainer to simply “train them again.” An experienced or cautious trainer would want to look into what’s going on in the workplace. If the trainer is forced to provide refresher training without assessing the site’s specific needs, the likely outcome is that the unsafe behaviors will continue. A trainer performing an LNA might discover any number of factors that can’t be fixed by mere training: issues with supervision, distractions, morale, finances, management, or systems, for example. A Title IX coordinator once told me, “Training is a terrible tool for trying to change behaviors—it doesn’t work.” I agree with her wholeheartedly.
COVID-19 has seemingly changed everything, like a reset, creating the perfect opportunity to accurately assess learning needs. LNAs come in various shapes and sizes: some focus on learners, others on process, and still others on task or work setting. Here I’ll describe a few LNAs that I’ve found useful: the Delphi technique, nominal group process, task analysis, and—perhaps my favorite—reflect on practice.
The Delphi technique is like trying to get the standard deviation of a group’s opinion when the group’s members are dispersed. I’ve used the Delphi technique for an oil-and-gas client to assess the learning needs of its EHS auditors. For COVID-19 purposes, I might use Delphi to assess the learning needs of researchers across a broad swath of faculty who are notoriously challenging to corral together at one time. The technique involves developing a survey or questionnaire and sending it out to a group of knowledgeable professionals. Once the responses are gathered, the trainer reworks the most common responses into new survey questions, which are also sent to the group. The new responses can then be used (or can be recrafted and sent out once more, if needed) to develop training.
Similar to the Delphi technique, nominal group process (NGP) has a “captive audience”—like a focus group. The process involves interviewing the group and seeking common responses. The common answers found during the initial interview are posed again to the group in order to find some form of consensus on the students’ needs. I use NGP when I teach my train-the-trainer course as a working example for the class. We assess the needs of a classic trainer by evaluating the most important traits of an EHS trainer. In my experience, dozens of groups of EHS/IH professionals have chosen being a subject matter expert as the single most important trait; we value knowledge and information over everything else trainers might have at their disposal in a learning setting. This is an inherent problem as it runs 180 degrees counter to good adult learning principles. During COVID-19, I might use NGP to assess new pandemic-related learning needs for EHS department staff since I could likely get them all together either in person or virtually.
Task analysis is helpful when trainees need to acquire the skills to perform a new task. Task analysis involves breaking down a task into discrete steps and analyzing the steps to determine how best to learn the task. Task analysis is at the heart of most of what we identify as hands-on learning. From playing guitar to putting on a respirator, it’s likely that—consciously or not—the instructional designer used a task analysis. During the current pandemic, some obvious possibilities for this type of training include how to put on and take off a face covering, disinfect equipment, or wash one’s hands.
Another example of an LNA is reflect on practice. Picture a professional sitting at a desk, musing over what makes a person in that practice a professional: the necessary skills, abilities, and competencies. This person is reflecting on his or her practice. I once used this approach to identify the most important topics to cover in a bloodborne pathogens refresher training for doctors and nurses at a college health center. The year was 1998, and they weren’t consistently using universal precautions. I asked a doctor friend to reflect on her practice, and she told me to definitely cover the then-new triple antiretroviral cocktail used to treat HIV to help encourage safe practices. For COVID-19, a perfect example a year or so ago would have been to ask an epidemiologist to reflect on practice and describe the societal effects a trainer should discuss in a learning event to prepare people for what was to come during the pandemic.
For asynchronous learning, trainers should consider sending out a pre-course survey asking learners what questions they have, what they’d like to see covered, and what challenges they face. This can be an additional part of your LNA.
LEARNING OBJECTIVES From LNAs spring learning objectives or course outcomes, which describe newly acquired knowledge, skills, or attitudes (KSAs). Learning objectives or course outcomes must be measurable, observable, and demonstrable—or “MOD” for short. If you can’t apply all three of these descriptors to the objectives or outcomes, they should be reworded. Learning objectives and course outcomes have a specific, widely accepted format: they begin with an introductory statement about what students will be able to do following the course, and each objective or outcome then starts with an action verb. But which one?
A useful tool for developing precise and applicable learning objectives with appropriate action verbs based on level of engagement and application is Bloom’s taxonomy. Many diagrams for this framework are available online, but the best one I’ve found is from Wikipedia (others may prefer a different model or template to work from). Users of Bloom’s taxonomy start by deciding on the relative learning level and application—from lower learning levels like knowledge and comprehension, through application and analysis, up to synthesis and evaluation at the highest levels. The tool assists users in creating course outcomes based on learning level and suggests action verbs and learning activities. For instance, in an IH 101 course, the learner needs to acquire a great deal of knowledge and comprehend the concepts. In this case, a learning objective might be to match a substance with its disease (cotton dust with byssinosis, or brown lung, for example). More advanced learners in an IH 401 course may need to be able to synthesize and evaluate IH data. Here, a learning objective might be to critique a data set for its problems. Using Bloom can make developing educational objectives an easier and more standard process. Bloom’s taxonomy can also help prevent trainers from using non-action verbs like “know,” “understand,” “appreciate,” or “be aware of”—these aren’t MOD.

INSTRUCTIONAL DESIGN Once you’ve developed strong learning objectives, it’s time to design your course. Completing LNAs and developing learning objectives first makes instructional design easier and more successful. Trainers should design courses that center on facilitating learners’ ability to accomplish the stated objectives. I suggest familiarizing yourself with the concepts set forth by Malcolm Knowles, an educator who is considered the father of adult learning.
When used in course design, his adult learning principles can help trainers develop engaging, fitting learning events. Some of Knowles’ adult learning principles include:
  • Make it applicable. Adults don’t want to learn about what doesn’t apply to them or their situation. An apt example is respiratory protection training. If the contaminant is a particulate (like silica), only teach them about particulate respirators. Don’t try to also cover chemical cartridges—they don’t apply. And don’t believe that “they’ll need it in the future, so I’ll cover it now.” Instead, teach workers the second topic once it applies. They’ll be much more likely to want to learn it then.
  • Do it just in time. There’s no future casting in adult learning. In other words, don’t fall prey to training workers now on something they’ll need in six months. COVID-19 makes this easy. First, it’s new; second, it’s in the moment; and third, it keeps changing and our knowledge is ever developing.
  • Have learners be active. Passive or one-way learning (“I talk and you listen”) isn’t good learning. No one learns by being a sponge (“I’m just going to absorb it all and not engage”). The more active we are in activities like exercises, discussions, writing, calculating, and debating, the more likely the learning will last beyond the class itself.
  • Make it engaging. Trainers must get learners involved with and excited about the topic. There is no such thing as a dry topic—just boring speakers. Share your passions and have learners wade into topics. COVID-19 again makes this principle easy to apply: for the first time, the topic applies to everyone. Real-world applications are endless, and discussion prompts like “what would you do in this situation?” are myriad.
  • Make it truly participatory. Design training around successfully facilitating everyone’s participation—not just the bold or talkative few. Use approaches and methods for both groups and pairs to have all learners participate.
  • Encourage adult learners to help set the agenda. There is often plenty of space in a topic for learners to suggest subtopics. With COVID-19, they might ask to discuss the effectiveness of cloth masks, whether six feet of distance is enough, and even how much risk there is in using a public multi-stall restroom.
The quintessential adult learner is self-directed and loves to help by co-teaching. This was me in 12th grade. My English teacher was stumbling over the details of the start of the Trojan War. Since I was also taking a class in mythology, I was happy to help with a few corrections. By my third “excuse me, Miss…” her response said it all: “Jonathan, would you like to teach this?” Heck yeah! And so I did. Since it was odd to have a student teach an impromptu lesson, my classmates paid attention just to see what I might say. It worked, and I suggest giving it a try. Ask certain learners ahead of class to assist by demonstrating, for example, how to properly wear a face covering and how not to. This also works on video for virtual courses. As you’re designing your learning event, consider using the “ADDIE” model if no other method is preferred. ADDIE stands for “Analyze, Design, Develop, Implement, and Evaluate,” and can assist trainers in framing the actual event or course design.
Anything but Lecture
I once attended a talk at a conference about “other training methods besides lecture.” I was intrigued, as were many others, including my colleague Steve, whom I sat next to. Our conversation went something like this: “Interested?” “Yes, very much! And you?” “Oh, yes!” The speaker was quite knowledgeable and well versed. He had a master’s in education and worked in the field of learning. As he began, we both listened attentively. The speaker discussed the theory behind learning, the use of lecture in many contexts, and the many benefits of using other learning approaches or methods. His discussion continued and covered the topic in detail. He talked and talked, and so it went.
When the speaker was finally done with his talk, Steve and I got up and filed out with the other attendees. Once out in the main hall where we could hear each other, Steve asked, “What was that?” I looked at him, smiled, and replied, “Irony?” We laughed together. But as I thought about it, I conceived a different way to facilitate a learning event on the topic: a session called “Anything but Lecture!” in which the learners choose the activity from a list of various approaches and methods and we have fun experiencing it. The one activity we don’t do? Lecture.
APPROACHES AND METHODS Lectures are passive, one-way, not engaging, and nonparticipative. They may seem efficient for the speaker, but they are highly ineffective for learners. We’ve all attended lectures as well as delivered them (myself included). They often contain mostly text and bullets—and many in learning circles argue that “bullets kill.” Instead of falling back on lectures, trainers should try to include approaches and methods to help facilitate learning and promote learners’ ability to achieve their objectives.
Many engaging approaches and methods are available, but unfortunately several aren’t easily used asynchronously. Try incorporating some of the following items into your next learning event.
Play a game. If everyone is together, there are many options for games (Jeopardy is a frequent choice). For asynchronous learning, there are additional games that can be played alone while still competing with others (“Spot the Hazard” is one example). Top scores can be posted on a leaderboard. For COVID-19 training, you could make a race out of “Choose the Right PPE Ensemble!” or play “Curious COVID-19 Conundrums: Made Up or True Fact?” Again, a leaderboard can increase competitive spirit and learner engagement.
Tell them a story. Storytelling can help make strong points memorable, and trainers should take advantage of the power of stories. Creative nonfiction is what we all engage in when we use actual cases in story form to drive our points home. COVID-19 presents many opportunities for motivating stories. A few example topics include face covering etiquette and respectful behaviors, contact tracing as told by an epidemiologist, and what it’s really like to get COVID-19.
Use case studies. Industrial hygiene lends itself to the use of case studies. During COVID-19 training, trainers might consider replacing the typical IH case study and data with pandemic-related scenarios. For example, how could a workplace be redesigned to follow best practices for COVID-19 given the specific setting, work, and tasks? Trainers could consider highlighting real-world examples of clusters and superspreading events.
Try some Q&A exercises. Never tell learners when you can ask them a question instead. Consider structuring asynchronous training as a series of COVID-19 questions: Are face coverings PPE? What are the factors that increase the risk of COVID-19 transmission? Are we allowed to [insert action] here?
Design training with modules or micro-learning. IH lends itself to modular and micro-training with its plethora of topics, subtopics, and minutiae. COVID-19 specifically can be subdivided into several modules: medical information, proper protection, infinite interactions, quarantining questions, and vectors and vaccines, for example. Trainers may also wish to try micro-learning modules, which are typically under five minutes in length and similar to online how-to videos that tackle single, targeted topics.
In the past, when trainers had fewer constraints (and seemingly plenty of time), we often advocated for block training—addressing several overlapping topics together to reduce redundancies (such topics could include hazard communication, PPE, bloodborne pathogens, and laboratory safety). Just-in-time training and micro-learning have taken us away from large blocks of training, which can be mind-numbing. In addition, online learning makes block training unnecessary.
LEARNING ASSESSMENT Trainers should resist the urge to give a test or quiz to assess whether learning objectives have been met. Tests are best at testing one’s ability to take a test and not much else. Plus, tests typically don’t measure or assess whether learners actually achieved their learning objectives (something we are supposed to do as trainers and instructional designers). Furthermore, many people have learning challenges that they don’t wish to admit or make known.
When I taught EPA asbestos and lead courses, I was required to give written exams, and learners had to pass with a minimum score of 70 percent. Over the years, I taught many people who weren’t functionally literate and numerous others who suffered from test anxiety. Most of the lead renovators hadn’t been in a classroom in years and had barely managed school even then, so I got used to administering the exam verbally. Guess what? Without any help whatsoever, they passed—they knew the material; they just couldn’t read, or they became so anxious about the test that they froze up.
Instead, trainers should seek to facilitate a functional assessment of what learners need to be able to do. Ideas for this type of assessment include checklists of hands-on activities or activities that demonstrate that learners have achieved the learning objectives (for example, actively participating in exercises or doing well in a game).
EVALUATING TRAINING Individual trainers need to know what worked, what didn’t, and how to improve their learning events because there is always a next time (especially with refreshers). How can trainers best assess and evaluate their training? The best evaluation tool I ever used was a blank sheet of paper. I was teaching a safety class to high schoolers (a tough group!). At the end of our class, I asked them to take out a piece of paper and write whatever they wanted to tell me about the class: what they liked, what they hated, how they might use and apply what we covered. No names needed. I got back a stack of papers to read. A few were blank or had just a word or two, but most had entire paragraphs detailing what the training meant to the students. I was blown away by their comments, which helped me redesign the class to be more effective. It’s a lesson that I still recall clearly decades later.
To replicate this with online learning, and with COVID-19 in particular, try asking learners to tell you about the training. Trainers can suggest that learners share likes and dislikes, but I’d offer that more helpful feedback will come from asking them how they feel about their learning, how they’ll use it, what they’d like you to know, and whether they have any remaining questions. I suggest keeping feedback anonymous unless learners would like a reply to a question; in that case, trainers can keep the questions separate from learners’ comments.
Trainers or organizations that want to explore higher levels of evaluating their learning programs should consider using Kirkpatrick’s model and four levels of learning evaluation. The levels assess reaction—participants’ typical likes and dislikes; learning—whether participants acquired their KSAs or achieved the learning objectives; behavior—what participants have changed once they are back to work after training; and results—whether the training is accomplishing what the organization needs or, basically, providing return on investment (I prefer to look at this as measuring the training’s effectiveness).
PITFALLS TO AVOID As we like to say in IH and safety, the don’ts are often more important than the do’s. So, from a pre-mortem approach, here are pitfalls to avoid:
  • not performing a learning needs assessment (following OSHA requirements doesn’t count as an LNA)
  • poor learning objectives that use words like “know,” “understand,” or “appreciate”—these are not MOD
  • info-centric training (I can’t even bring myself to call it learning!)
  • not engaging learners in exercises (yawn)
  • poor instructional design (use ADDIE if its structure helps)
  • not following Malcolm Knowles’ adult learning principles (these are critical)
  • passive training (remember to use active, engaging approaches and methods)
YOU’VE GOT THIS
Do you have more questions about the learning cycle? Feel free to reach out to me—I enjoy engaging with colleagues and talking about all things learning. There’s a lot we can do to make our learning events spectacular (and not yawn- or click-fests). Imagine what it would be like to have your learners speaking excitedly about your fun, engaging, and well-designed learning events. Constraints like COVID-19 just give us a greater challenge. You’ve got this!
JONATHAN KLANE is director of Risk Management and Safety Education for BioRAFT. He is also a PhD student at Arizona State University in Human and Social Dimensions of Science and Technology, where he studies risk perceptions and the use of narratives to affect them. In his free time (measured to the left of zero), he writes stories, including what it’s like to have COVID-19. Visit BioRAFT’s website to learn more about Klane and access additional resources.
Send feedback to The Synergist.
Considerations for In-Person Training during COVID-19
Let’s start with location: are you holding the event inside or outside? Is it possible to increase the distance between people in one larger space, or to split the group into several separate spaces? Some workplaces are renting canopy tents with open walls, along with environmental comfort aids like heaters or large cooling fans. Other trainers are using fans to move and circulate air, both to make the space more comfortable and to help prevent the spread of the coronavirus. In my career, I’ve done day-long lead renovator training in an unfinished apartment basement; hazard communication and lockout/tagout training on a production floor, using boxes to fabricate classroom walls; hazardous materials response in a small garage with a compressor that cycled on once an hour; and lots of outdoor training at landfills and other fun sites. If training was possible in odd settings such as these before the pandemic, then we can certainly continue training with relatively minor changes while the pandemic continues.
In outdoor spaces or situations where learners are more spread out, trainers need to facilitate communication—both among learners and between the trainer or facilitator and the learners. Portable microphones are helpful, as are wearable mics for learners, if feasible. Strategically placed dry-erase boards and paper charts on easels can facilitate clear group discussions. In addition, trainers should consider using classroom clickers and collaboration apps for Q&A and sharing ideas to facilitate group discussions and learner engagement. For those who are unfamiliar with classroom clickers, they are small handheld devices, similar to remote controls, that allow learners to answer questions anonymously. Some platforms for online learning (like Zoom and GoToMeeting) have polling features embedded in them. Many online collaboration apps are available, including Poll Everywhere, Mentimeter, ParticiPoll, Ask the Room, PollDaddy, MicroPoll, and Flisti—just to name a few.

RESOURCES
Trainers interested in learning more about the topics covered in this article should examine the work of educator Malcolm Knowles and psychologist Carl Rogers. Additional resources include:
Bloom’s Taxonomy.
Kirkpatrick Partners: The Kirkpatrick Model.
Wikipedia: ADDIE Model.