JOHN MULHAUSEN, PhD, CIH, CSP, FAIHA, retired in 2018 from 3M where he worked for 31 years in a variety of global health and safety risk management roles, most recently as director of corporate safety and industrial hygiene. Send feedback to The Synergist.

How to Improve Exposure Judgments
Last month I drew your attention to problems that can arise when we rely solely on our professional judgment to characterize exposures without adequate structure and tools. This month, I’ll discuss some of the reasons why professional judgments are so often inaccurate and offer suggestions for improvement. The discussion draws on A Strategy for Assessing and Managing Occupational Exposures, chapter 6, written by Susan Arnold, Mark Stenzel, and Gurumurthy Ramachandran, each of whom has contributed to influential research on the accuracy of professional judgment among industrial hygienists.

THE PROBLEM
To recap, a growing body of research indicates that our judgments are often wrong and tend to underestimate exposures, with potentially dire consequences for the workers we are hired to protect. Nor does experience insulate us from this tendency: some studies have found that OEHS experts’ exposure judgments can be worse than novices’ and not much better than judgments based on random chance.
The chart below illustrates this finding. The authors tested the ability of OEHS professionals to accurately determine in which exposure control category, or ECC, the 95th percentile of an exposure distribution is most likely to be found. As explained in A Strategy for Assessing and Managing Occupational Exposures, ECC 1 corresponds to a 95th percentile that is less than 10 percent of the OEL, ECC 2 to 10–50 percent of the OEL, and ECC 3 to 50–100 percent of the OEL. Exposures in which the 95th percentile is greater than the OEL are assigned to ECC 4.
The chart depicts the accuracy of novice and experienced OEHS professionals before and after training on using a checklist tool intended to provide structure to their exposure judgments. Prior to training, novices outperformed experienced professionals, and experienced professionals weren’t significantly more accurate than assigning an ECC randomly. Accuracy increased dramatically post-training for both novices and experienced professionals.
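As a concrete illustration of the banding scheme described above, the ECC assignment can be written as a simple function. This is a sketch for illustration only; the function name and the handling of the boundary at exactly 100 percent of the OEL are my assumptions, not taken from the book.

```python
def exposure_control_category(x95: float, oel: float) -> int:
    """Return the exposure control category (1-4) for an estimated
    95th percentile exposure (x95) relative to the OEL."""
    ratio = x95 / oel
    if ratio < 0.10:
        return 1   # ECC 1: 95th percentile below 10% of the OEL
    elif ratio < 0.50:
        return 2   # ECC 2: 10-50% of the OEL
    elif ratio <= 1.0:
        return 3   # ECC 3: 50-100% of the OEL
    else:
        return 4   # ECC 4: 95th percentile exceeds the OEL

# Example: a 95th percentile of 30 against an OEL of 100 falls in ECC 2.
print(exposure_control_category(30, 100))  # → 2
```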
FAST THINKING VS. SLOW THINKING
Why are our exposure judgments so often inaccurate? The answer may lie in the way our brains process information.
Regular readers of The Synergist will be familiar with the ideas of the psychologist Daniel Kahneman, whose 2011 book Thinking, Fast and Slow has been referenced in several articles over the last few years. Kahneman’s insights about subjective judgment, which he defines, borrowing from Herbert Simon, as “nothing more and nothing less than recognition,” include the observation that “intuition can be a useful tool aiding in accurate decision making if, and only if, it is followed by the disciplined collection of objective information with disciplined scoring and analysis of that information.” Subjective judgment occurs in the prefrontal cortex, a part of the brain that is prone to distractions and biases we may not perceive. As a result, our subjective judgments are inconsistent at best.
The “fast thinking” of Kahneman’s title is akin to a gut reaction—an automatic, impulsive, and subconscious response to a situation or event. “Slow thinking,” by contrast, is deliberate, analytical, and conscious. A central issue with subjective judgment is that it often relies on fast thinking when it would be far better served by slow thinking.
Mental shortcuts known as heuristics are what make fast thinking efficient but also prone to mistakes. Kahneman and his colleague Amos Tversky described three of these heuristics in the 1970s:
• The “availability” heuristic determines the probability of an event according to the ease with which we can recall a similar event from memory. When making a subjective judgment, the more quickly we remember a particular scenario, the greater our tendency to overrate its likelihood.
• The “representativeness” heuristic is our tendency to classify events according to their similarities. It can lead us astray because we tend to rely too much on such classifications when assigning probability; events that we determine to be similar are not necessarily of equal probability.
• The “anchoring and adjustment” heuristic reflects our tendency, when making subjective judgments, to “anchor” on certain information and modify our understanding as new information comes to light. If our anchoring is inaccurate and our adjustments insufficient, the result is an erroneous judgment.
Applying these ideas to our own profession, we can see that a one-off, intuitive approach to an exposure assessment lacks the structure necessary to properly weight the myriad inputs we must consider when making a judgment. Our intuition relies on our (potentially faulty) memory of previously encountered situations that we may incorrectly recall as similar to the situation at hand.
We see the effects of Kahneman and Tversky’s heuristics on our own work when we (for example) base judgments on information recalled from memory or on our “eyeballing” of exposure data. These heuristics may explain why—as research has shown—we tend to mentally interpret monitoring data as symmetrical normal distributions when right-skewed lognormal distributions are more appropriate. In effect, through fast thinking, we convince ourselves that nearly impossible events are likely to occur, and we dismiss probable outcomes as outliers.
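To see how eyeballing skewed data can mislead, consider a hypothetical lognormal exposure distribution; the geometric mean (GM) and geometric standard deviation (GSD) below are illustrative values I have chosen, not figures from the research cited. Treating the distribution’s mean and standard deviation as if they described a symmetric normal distribution yields a lower, falsely reassuring 95th percentile than the lognormal model gives:

```python
import math

# Hypothetical lognormal exposure distribution: GM = 1.0, GSD = 2.0
gm, gsd = 1.0, 2.0
mu, sigma = math.log(gm), math.log(gsd)
z95 = 1.645  # standard normal z-score for the 95th percentile

# Correct lognormal 95th percentile: exp(mu + z * sigma)
x95_lognormal = math.exp(mu + z95 * sigma)

# "Eyeballed" normal-distribution 95th percentile built from the
# same mean and standard deviation (standard lognormal moments):
mean = math.exp(mu + sigma**2 / 2)
sd = math.sqrt((math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2))
x95_normal = mean + z95 * sd

print(f"lognormal X0.95 = {x95_lognormal:.2f}")          # ≈ 3.13
print(f"normal-assumption X0.95 = {x95_normal:.2f}")     # ≈ 2.91
```

With a typical GSD of 2, the normal assumption understates the 95th percentile, and the gap widens as the data become more skewed.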
SUGGESTIONS FOR IMPROVEMENT
The way to improve our exposure judgments is to force ourselves to use slow thinking by adopting a structured approach. Such an approach is described in A Strategy for Assessing and Managing Occupational Exposures, an essential resource for OEHS professionals. Here are a few suggestions for engaging the deliberate part of your brain when making exposure judgments:
Write down your observations. Keep notes about the materials involved in the task you’re assessing. Look up details such as the vapor pressure of a substance. Note how much air is being moved and where it is going. This documentation can help your mind anchor on relevant information and prevent an overreliance on memories of previous assessments.
Use algorithms and checklists. Rely on these conscious heuristics instead of the unconscious ones that Kahneman and Tversky identified as problematic for subjective judgment. Tools that use algorithms and checklists are freely available from the AIHA website. More information about these tools appears in my previous article and in the free AIHA webinar “Top 10 Imperatives for the AIHA Exposure Risk Management Process.”
Incorporate calibrating feedback loops into your OEHS practice. Document your qualitative exposure judgment before you receive monitoring results for a similar exposure group (SEG). Compare the statistical analysis of the monitoring data to your initial judgment. Do they match? Why? Why not?
Attain the Registered Specialist: Exposure Decision Analysis credential. Research on professional judgment highlights the importance of training. The AIHA Registry Programs LLC offers an Exposure Decision Analysis program that assesses competency in the skills and knowledge necessary to make informed decisions about worker exposure and exposure uncertainty. The program is offered at no cost as a service to the OEHS community. For more information, visit the AIHA Registry Programs website.
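The feedback-loop suggestion above can be sketched in code: record the qualitative judgment as an ECC band before sampling, then derive the band implied by the monitoring results and compare. The sample values, OEL, and variable names below are hypothetical, and the 95th percentile is a simple point estimate from log-transformed sample statistics.

```python
import math
import statistics

# Hypothetical SEG: documented judgment first, then monitoring data.
oel = 100.0
judged_ecc = 1                                  # ECC recorded before sampling
samples = [12.0, 18.0, 25.0, 9.0, 31.0, 15.0]   # monitoring results, same units as OEL

# Point estimate of the 95th percentile under a lognormal model.
logs = [math.log(x) for x in samples]
mu = statistics.mean(logs)
sigma = statistics.stdev(logs)              # sample SD of log-exposures
x95 = math.exp(mu + 1.645 * sigma)

# Map the data-based estimate into an ECC band and compare.
ratio = x95 / oel
data_ecc = 1 if ratio < 0.10 else 2 if ratio < 0.50 else 3 if ratio <= 1 else 4
print(f"X0.95 ≈ {x95:.1f}; data ECC = {data_ecc}, judged ECC = {judged_ecc}")
# Here the data land in ECC 2 while the judgment was ECC 1: a prompt
# to ask why the qualitative call was too optimistic.
```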
TOWARD BETTER JUDGMENTS
Research shows that training in a group setting is one of the most effective means of improving our exposure judgments. Currently, few opportunities for group training exist, but AIHA is working to change that.
In September, AIHA and ACGIH hosted online brainstorming sessions as part of our Defining the Science (DTS) initiative. DTS is intended to facilitate partnerships between practitioners and researchers to stimulate research that is guided by workplace needs and results in practical guidance to improve the protection of workers and communities. One of the top priorities identified by the initiative was improving the accuracy of qualitative exposure judgments. AIHA and ACGIH are exploring ways that our associations can help fill this need.
Categorical judgment accuracy, showing accuracy attributable to random chance, pre-training (baseline), and post-training checklist-guided judgment accuracy for novices and practicing IHs. Adapted from “Using Checklists and Algorithms to Improve Qualitative Exposure Judgment Accuracy,” Figure 2, Journal of Occupational and Environmental Hygiene, March 2016.
AIHA: A Strategy for Assessing and Managing Occupational Exposures, Chapter 6, “Approaches to Improving Professional Judgment Accuracy,” 4th ed. (2015).
AIHA: “Top 10 Imperatives for the AIHA Exposure Risk Management Process” (webinar, 2021).
Annals of Occupational Hygiene: “Occupational Exposure Decisions: Can Limited Data Interpretation Training Help Improve Accuracy?” (April 2009).
Journal of Occupational and Environmental Hygiene: “Effect of Training on Exposure Judgment Accuracy of Industrial Hygienists” (April 2012).
Journal of Occupational and Environmental Hygiene: “Using Checklists and Algorithms to Improve Qualitative Exposure Judgment Accuracy” (March 2016).
The Synergist: “Judgment Day: How Accurate Are Industrial Hygienists’ Qualitative Exposure Assessments?” (January 2014).