A working day is a series of small decisions stitched together in service of an organizational goal. Many of these decisions concern whether a worker uses or implements controls for known hazards. According to recent data from the International Labour Organization, 7,500 people die each day due to unsafe and unhealthy working conditions. Of those deaths, 6,500 are due to work-related diseases; the remainder are caused by traumatic injuries. In the United States and Canada, statistics likewise show that occupational diseases result in higher mortality than injuries each year. These statistics suggest that many of workers' small decisions regarding use of controls are not going the way we want them to.
Workers make small decisions that result in exposure to hazards day after day, year after year, culminating in the rates of occupational disease that we see. If we were to ask workers why, for example, they chose not to be clean shaven for respirator use, why they didn’t use local exhaust ventilation, or why they didn’t follow an established procedure, we would likely fill out the bottom of the investigation form with “complacency,” “risk tolerance,” “decision to err,” and “shortcutting.” These behavior-based causes occupy the base of the classic safety pyramid, which also includes near misses, recordable injuries, lost workdays, and fatalities (see Figure 1). To prevent these outcomes, we implement corrective actions like coaching, mentoring, or training.
These corrective actions, and behavior-based safety in general, miss a fundamental component: behavior is the observable result of decisions. Decision-making, then, should be of interest to occupational hygienists. Our job is to anticipate situations where workers are likely to decide not to implement controls, and to recognize where hardwired bias and shortcuts are likely to impair decision-making and result in exposure to hazardous conditions in the workplace.
SYSTEM 1 AND SYSTEM 2
To say these shortcuts and biases are hardwired is to say that our brains evolved thousands of years ago when parts-per-million, polyaromatic hydrocarbons, and noise-induced hearing loss were not considerations for survival. Decisions around risk exposure were much more concrete, immediate, and vivid. Our ancestors’ thoughts on risk probably extended no further than that the immediately-dangerous-to-life-and-health concentration of lions equalled one. In our nasty, brutish, and frequently short lifespans, decisions needed to be made quickly and in the absence of significant information. This need for fast thinking led to the development of what psychologist Daniel Kahneman calls “System 1” thinking. The other way of thinking, which Kahneman calls “System 2,” is conscious and deliberative—in other words, slow.
Figure 1. The safety pyramid illustrates the connection between at-risk behaviors and serious consequences, including fatalities.
System 1—fast, low-energy, intuitive, unconscious—evolved to keep us safe at the watering hole, but it presents significant problems in the modern industrial context. To stay quick and low energy, the System 1 decision pathway takes shortcuts, makes assumptions, and relies on biases that cognitive scientists, psychologists, and economists have documented at length. Many of these biases and shortcuts have implications for occupational hygiene in that they contribute to chronic hazard exposure.
Preventing future illness is a goal of most occupational hygienists. The latency periods for many diseases extend decades after exposure and, in many cases, after the end of a working life. At odds with this fact is the mental bias referred to as “hyperbolic discounting” or “present bias.”
We greatly prefer immediate gains to gains in the future—$100 now is better than $105 a week from now. System 1 is always evaluating our choices to determine if immediate gains (such as improved efficiency or comfort) are available and, if they are, strongly lobbies for that choice over unclear future gains or consequences of indeterminate severity. A worker’s shortcuts, then, can be understood as opting for immediate sure gain (or avoiding loss). For example, not taking the time to use local exhaust ventilation allows the worker to complete the assigned task more quickly.
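The pull of the present can be made concrete with the standard hyperbolic discounting model, V = A / (1 + kD), where A is the reward, D the delay, and k a fitted discount rate. A minimal sketch follows; the value of k is assumed purely for illustration, not drawn from any particular study.

```python
def hyperbolic_value(amount, delay_days, k=0.1):
    """Perceived present value of a delayed reward under hyperbolic
    discounting: V = A / (1 + k*D). The discount rate k = 0.1 per day
    is an illustrative assumption."""
    return amount / (1 + k * delay_days)

now = hyperbolic_value(100, 0)    # $100 immediately -> 100.0
later = hyperbolic_value(105, 7)  # $105 in a week   -> about 61.8
# System 1 "feels" the immediate $100 as worth more than the delayed $105,
# even though waiting is the objectively better deal.
```

The same arithmetic applies to controls: the minutes saved by skipping local exhaust ventilation are a certain, immediate gain, while the discounted "value" of avoided disease decades away rounds toward zero.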
When we picture the future, absent any other input, we tend to see it as only a possibility, not a certainty. The future is a hazy thing compared to the immediate crunch of budgets, deadlines, and other considerations. A one-time decision resulting in exposure would, in and of itself, not be especially harmful. But present bias ignores all the prior days that we made the same decision; it ignores the past as much as the future while the exposures add up, day after day, year after year, culminating in increased potential for disease.
The personal risk assessment process implies that a worker will consider what the consequent illness or injury would be like before making a decision that leads to an exposure. But our ability to imagine the future is limited. For our ancestors, surviving beyond tomorrow was a much more questionable prospect than it is for us, and so our ability to imagine the future hasn’t evolved for a time when lifespans approach and surpass 80 years. When we try to imagine that distant future with disease as a consequence of current choices, we might not think of it as happening to us. We think of it as happening to a stranger, a person we’re not connected to, instead of our future self. We accept our current exposure because we are unable to accept that we are harming a future version of ourselves.
During the risk assessment process, rather than focus on a possible future, the occupational hygienist or safety professional may instead present controls as a way for workers to conserve or improve current health. This approach exploits another System 1 shortcut: our aversion to a negative change from our baseline reference state, regardless of how bad or good that state is. By making the reference point right now, instead of a distant future, we encourage workers to view exposures as a loss of health, a step toward illness. From this point of view, workers can consider each decision separately and not as part of a series. Thinking of each decision as one step in a long chain of decisions allows a “just-this-once” mentality to creep in.
When we talk to workers about the possible consequences of exposure, or think about them ourselves, System 1 immediately starts to search through our memories for examples of that consequence, stories we may have heard or told about similar events. Events or consequences that are easily recalled are said to be “highly available” to System 1. Highly available events include misfortunes such as broken arms, falls, and back problems—things that have either happened to us or to someone we know. The more easily that System 1 can recall an event, the more likely System 1 is to think that it will happen, regardless of the actual probability of occurrence. This bias for anecdote evolved long before the Internet, Twitter, and push notifications: if somebody said there was an alligator at the pond, we would, understandably, go elsewhere for our water.
Conversely, the harder it is to think of an example of a consequence or outcome, the less likely we are to think it will happen, regardless of its actual probability. Occupational illness tends not to be highly available; many illnesses we are concerned with happen years or decades after exposure. As a result, few among us can easily imagine the agony of a bed-ridden death due to occupational cancer or chronic obstructive pulmonary disease. The victims of occupational disease die quietly, years and miles from the factory floors, refinery fumes, and other exposure pathways from which the disease originated. Workers, unable to think of a vivid example of disease, assume the disease is unlikely to afflict them and determine that controls are unwarranted. When examples of outcomes from our decisions aren’t available, we tend to minimize rarer but not impossible consequences and are less likely to consider the precautions worthwhile.
Hygienists should be prepared to give workers examples of adverse consequences during training, pre-job meetings, and other communication sessions. For example, a graphic of a blocked airway next to a clear one could act as a speedbump to slow System 1 thinking and force the engagement of System 2, our conscious and deliberative pathway, which very much is aware of retirement and doesn’t want to spend it gasping in a long-term care facility.
In many cases, we aren't able to provide workers with accurate statistics on disease rates for their occupation. This is fine; people aren't especially good at interpreting statistics anyway, and System 1 quickly takes over when presented with probabilities and numbers. Workers are left to make their small decisions based on description, current knowledge, and anecdotal data. A 2003 paper in the Journal of Behavioral Decision Making showed that when people make these small "feedback-based decisions," they tend to excessively minimize rare outcomes. Known as "underweighting," this bias relates to how much importance we place on a possible outcome; absent bias, the weight we give an outcome would match its probability. When we underweight, we don't factor a possible outcome into our choice as much as we should. Similarly, research published in the Journal of Organizational Behavior Management in 2000 found that safety devices were underused where workers were making small feedback-based decisions.
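One plausible mechanism behind this underweighting can be sketched with a small simulation (the probability, sample size, and trial count below are assumed for illustration, not taken from the cited paper): when outcomes are learned from a handful of personal experiences, a rare event often never appears in the sample at all, so its felt probability collapses to zero.

```python
import random

def fraction_never_observed(p=0.001, n_experiences=200, trials=10_000, seed=42):
    """Estimate the fraction of workers whose personal history of
    n_experiences tasks contains zero occurrences of a rare event
    that actually occurs with probability p per task."""
    rng = random.Random(seed)
    never = 0
    for _ in range(trials):
        if not any(rng.random() < p for _ in range(n_experiences)):
            never += 1
    return never / trials

# With p = 0.001 per task and 200 tasks, (1 - p)**200 is roughly 0.82:
# about four out of five simulated workers never see the rare outcome,
# so experience alone teaches them it is "impossible."
```

A worker whose sample contains no bad outcomes has, from System 1's point of view, excellent evidence that the control is unnecessary.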
EXPERIENCE VS. DESCRIPTION
While we underweight rare events when evaluating them from experience, a paper by Kahneman and the psychologist Amos Tversky found the opposite in many cases where decisions were made from description (that is, written problems): we tend to overweight rare outcomes. Their research suggests that providing workers with a descriptive outline of the choice they face engages the slow thinking process and will result in better decisions.
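Prospect theory formalizes this description-based overweighting with a probability weighting function. The sketch below uses the functional form and the gamma = 0.61 parameter that Tversky and Kahneman later fitted for gains in cumulative prospect theory; both are applied here only for illustration.

```python
def prospect_weight(p, gamma=0.61):
    """Probability weighting function from cumulative prospect theory:
    w(p) = p**g / (p**g + (1-p)**g)**(1/g). With gamma = 0.61 (the
    Tversky-Kahneman fitted value for gains), small described
    probabilities are overweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A described 1% chance is "felt" as roughly 5-6%:
# prospect_weight(0.01) is about 0.055, far above the 0.01 that the
# same event would earn when learned from repeated experience.
```

The practical upshot: the same rare disease that experience teaches workers to dismiss can loom appropriately large when it is laid out on paper before the job begins.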
If nothing else, description can be tailored to take advantage of System 1's bias for feelings. Paul Slovic's research into this feeling-centered bias, known as the affect heuristic, found that our feelings about something, whether negative or positive, influence how supportive we are of that thing. These feelings can be automatic and unconscious—that is, fully a System 1 function. For example, what feeling arises due to the word "vomit"? What about "profit"? Loss? Silicosis? Cancer? Mention a broken arm or crush point and it immediately lances System 1 with anticipated pain; we take more care with our activities as a result.
Occupational hygiene has a deficit of evocative feeling with relation to the consequences of exposure. Even a generic consequence like “cancer” has, through advances in medical treatment and outcomes, lost the dread it once had. Much of the lack of descriptive naming in diseases is due to clinical necessity and medical history; however, the role of the occupational hygienist should be to present this information in a more meaningful way—to use System 1 to our advantage by using language that will cause System 1 to lobby for ensuring controls are in place.
The affect heuristic goes further. If we are presented with information that says the benefit of a specific activity is high, we infer that the associated risk is low; if the information says that the risk is low, we infer that the benefits are high. Conversely, if the information says that the benefit is low, the risk is thought to be high; if the information says that the risk is high, the benefits are thought to be low. In conjunction with language that evokes feelings, the affect heuristic can be used to frame specific organizational decision-making requirements to influence decisions in the direction of lowering exposure potential.
SAFETY ARCHITECTS
The industrial hygiene discipline, and professional safety generally, represents a societal change in moral thinking that has taken place over the last one hundred years—a tiny sliver of time when compared with our old brains. As a society, we grew weary of watching our loved ones leave for work, uncertain of their safe return, uncertain of their long-term health. In their book Nudge: Improving Decisions about Health, Wealth, and Happiness, Richard Thaler and Cass Sunstein argue that organizations should use “choice architecture,” a knowledge of cognitive science, to present choices in such a way that the most beneficial is chosen. As an extension, industrial hygienists and safety professionals should consider themselves as safety architects and use a knowledge of our hardwired decision biases and shortcutting to redesign the cognitive environment under which work takes place. The occupational illnesses of the present have been caused by a series of small decisions in the past; we should understand how those decisions were made in order to change them in the future.
RYAN CAMPBELL, CIH, CRSP, is a senior industrial hygienist with WhiteSwan Safety in Calgary, Alberta.
Send feedback to The Synergist.
RESOURCES
Econometrica: “Prospect Theory: An Analysis of Decision Under Risk” (March 1979).
European Journal of Operational Research: “The Affect Heuristic” (March 2007).
Journal of Behavioral Decision Making: “Small Feedback-Based Decisions and Their Limited Correspondence to Description-Based Decisions” (July 2003).
Journal of Organizational Behavior Management: “Behavioral Safety Research in Manufacturing Settings: A Review of the Literature” (2000).
Yale University Press: “Nudge: Improving Decisions about Health, Wealth, and Happiness” (2008).
Small Decisions
Using Cognitive Science to Improve Communication with Workers
BY RYAN CAMPBELL
Acclimatization in Workers
Disadvantages of being unacclimatized:
- Unacclimatized workers readily show signs of heat stress when exposed to hot environments.
- They have difficulty replacing all of the water lost in sweat.
- Failure to replace the lost water will slow or prevent acclimatization.
Benefits of acclimatization:
- Increased sweating efficiency (earlier onset of sweating, greater sweat production, and reduced electrolyte loss in sweat).
- Stabilization of the circulation.
- Work is performed with a lower core temperature and heart rate.
- Increased skin blood flow at a given core temperature.
Acclimatization plan:
- Gradually increase exposure time in hot environmental conditions over a period of 7 to 14 days.
- For new workers, the schedule should be no more than 20% of the usual duration of work in the hot environment on day 1, with no more than a 20% increase on each additional day.
- For workers who have had previous experience with the job, the acclimatization regimen should be no more than 50% of the usual duration of work in the hot environment on day 1, 60% on day 2, 80% on day 3, and 100% on day 4.
- The time required for individuals who are not physically fit to develop acclimatization is about 50% greater than for the physically fit.
Level of acclimatization:
- Relative to the initial level of physical fitness and the total heat stress experienced by the individual.
Maintaining acclimatization:
- Acclimatization can be maintained for a few days of non-heat exposure.
- Absence from work in the heat for a week or more results in a significant loss of the beneficial adaptations, leading to an increased likelihood of acute dehydration, illness, or fatigue.
- Acclimatization can be regained in 2 to 3 days upon return to a hot job.
- It appears to be better maintained by those who are physically fit.
- Seasonal shifts in temperatures may result in difficulties.
- Working in hot, humid environments provides adaptive benefits that also apply in hot, desert environments, and vice versa.
- Air conditioning will not affect acclimatization.
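The day-by-day schedules above can be sketched as a small helper function; this is a hypothetical illustration of the 20%-per-day and 50/60/80/100 rules, not an official tool.

```python
def acclimatization_schedule(experienced: bool) -> list:
    """Percent of the usual hot-environment work duration per day.
    New workers: no more than 20% on day 1, plus no more than 20%
    each additional day. Experienced workers: 50/60/80/100 over 4 days."""
    if experienced:
        return [50, 60, 80, 100]
    return [20 * day for day in range(1, 6)]  # [20, 40, 60, 80, 100]
```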