Five Common Difficulties in Real-Time Detection System Implementation
Connectivity, Data Interpretation, and Other Issues
BY SPENCER PIZZANI AND MARINA JABSKY
Real-time detection systems, or RTDS, may include complex combinations of sensors, instruments, software, and processes. When used for occupational and environmental health and safety, they hold tremendous potential to improve exposure decision analytics. However, as the complexity of RTDS implementation increases, so does the potential for encountering significant difficulty. While single-use direct-reading instruments (DRIs) can be as simple as a home carbon monoxide detector, systematizing these sensors into a fire system or other interdependent network is a complex undertaking. The designers of such systems must anticipate the challenges of individual instruments as well as the system as a whole. Underestimating complexity can lead to cost overruns or loss of esteem, and may compromise the trust placed in the system. Below, we present five common difficulties in the implementation and deployment of an RTDS.

1. COMMUNICATING VALUE
Before technical evaluation of an RTDS can begin, OEHS professionals may need to persuade a group of stakeholders of the system’s value, often including those with financial, legal, human resources, or engineering roles. Successfully advancing the idea of an RTDS requires a concise but detailed explanation of what the system will accomplish or enable (or do better than the alternatives) and why this capability is worth the cost and effort. Stakeholders may expect that the OEHS professional is an expert in such systems and knowledgeable about nuances related to power, data connectivity, sensor chemistry, commercial technology systems, and ventilation engineering.

Communicating value should begin with stating, as clearly as possible, the question that the RTDS is intended to answer. Examples include:
• “Is the concentration of carbon monoxide in this room above the Threshold Limit Value?”
• “Is the ventilation running effectively to keep the particle count below the value we determined is most closely correlated with the appropriate level of control?”
• “Is there any detection of ozone in this area?”
• “Is the temperature in the heated process area within one standard deviation of the average?”

Starting with a specific question facilitates finding a targeted solution and is especially important when performing sensor occultations, such as monitoring oxygen to evaluate the presence of nitrogen, or monitoring carbon dioxide as a surrogate for ventilation efficacy.

The technical question answered by the system should have value in the circumstances in which it is used. For example, the statement “We know the carbon monoxide levels in this room are below the TLV” should be followed by “and because we know that personal time-weighted average exposures are typically lower than our steady-state concentration, our local exhaust ventilation is working acceptably.” These statements should follow formal logic and be verified, if appropriate, by the collection of confirmatory evidence. Avoid making exposure determinations based solely on the presumed performance of an RTDS: validating correlations between personal exposure monitoring and an RTDS is critical when the system is not itself performing personal exposure monitoring.

Common value propositions include predicting failures of control systems, early warning of potential overexposures, and facilitating investigation of anomalous readings. However, the value of comparing data to itself should not be overlooked.
A changing average gas concentration could identify a need for additional training, leak detection and repair efforts, or an unexpected process change. A change in particle counts could indicate changes to airflow, doors being propped open inappropriately, or a material change in feedstock. Answering metadata questions such as “How are we today, as a 30-day average, compared to one year ago?” can be enlightening. At times, the very presence of data changes the calculus of exposure assessment, incident investigation, claims response, and controls prioritization. Stakeholders may have a different perspective when looking at data that may not agree with their perception of risk, internalized risk tolerance, or moralization. This perspective shift may be particularly apparent when viewing unfavorable data through the lens of opposing counsel who may be seeking to prove tort. Additionally, the value of an RTDS to data visualization cannot be overstated. Stakeholders who have only topical proficiency may benefit more from colorful charts and line graphs than from valid but visually uninteresting tables of values.
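As a simple illustration of this kind of period-over-period comparison, the Python sketch below computes a 30-day rolling average and compares it with the same window one year earlier. The readings are synthetic and the daily-averaging choice is an assumption for the example, not a feature of any particular RTDS.

# A minimal sketch, assuming daily-average readings from a hypothetical CO sensor.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=1)

# Two years of fabricated daily-average carbon monoxide readings (ppm).
dates = pd.date_range("2022-01-01", periods=730, freq="D")
co_ppm = pd.Series(rng.lognormal(mean=1.0, sigma=0.4, size=len(dates)), index=dates)

# 30-day rolling average of the daily values.
rolling_30d = co_ppm.rolling(window=30).mean()

today = dates[-1]
year_ago = today - pd.DateOffset(years=1)

current = rolling_30d.loc[today]
baseline = rolling_30d.loc[year_ago]
pct_change = 100 * (current - baseline) / baseline

print(f"30-day average ending {today.date()}: {current:.2f} ppm")
print(f"30-day average ending {year_ago.date()}: {baseline:.2f} ppm")
print(f"Change versus one year ago: {pct_change:+.1f}%")

The same comparison could be produced by a database query or dashboard tool; the point is that the question is only answerable if the data are retained and time-stamped consistently.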
Communication to stakeholders should also acknowledge that all RTDS require some level of maintenance, often in the form of consumables or cleaning, to function as intended. While many OEHS professionals associate maintenance with “calibration”—a frequently misused term that should often be replaced with “functional testing” when sensor response values are not being set by the user—the general concept of care needs to be planned for any system. The frequency of maintenance can depend on factors such as the ability of the system to absorb downtime or the difficulty of replacing failed parts. Further, OEHS professionals should avoid overpromising the performance of a system or launching it before it is ready, which can sap momentum and enthusiasm.
2. SYSTEMATIC EVALUATION OF SENSOR SYSTEMS
Knowledge of sensor and DRI fundamentals is necessary for OEHS professionals to appropriately operate these instruments where the health and welfare of others depends on correct usage. This knowledge includes skills outlined in the AIHA publication “Technical Framework: Guidance on Use of Direct Reading Instruments” as well as skills that are often honed by productive struggles in understanding the function of DRIs and how they can be applied to unusual situations. Knowledge of technical terminology (for example, limit of quantification versus limit of detection), the scientific expressions of uncertainty, and instrument responses must be combined with practical abilities such as recognizing a pump fault, interpreting mathematically negative readings, and knowing when to escalate maintenance concerns. The Technical Framework outlines competencies necessary to the appropriate operation of DRIs based on the degree of trust granted to the user.
However, even experienced OEHS professionals should perform systematic evaluations of sensor systems. Such evaluations should include the sensors as well as the instruments that interpret their response. Some systems, such as single gas monitors, may be simple to evaluate using only the manufacturer’s literature. More complex systems may require interaction with the supplier or manufacturer, or further evaluation with specialized testing.
The purpose of systematic evaluation is to ensure that the instrument is “fit for purpose” when considered for its intended use. “Fitness for purpose” varies: one instrument may need to function in high humidity, isokinetic conditions, or extreme temperatures, while another may need to operate where an audible alarm cannot be heard or a blinking light cannot be seen. Unusual physical characteristics (alarm decibel output, display luminosity) are rarely identified in manufacturers’ operation and technical manuals but can be critically important when lives are on the line. Even experienced, credentialed industrial hygienists may struggle to perform sufficient systematic evaluation on complex exposure scenarios with incomplete manufacturer documentation.
The Technical Framework identifies the previously published Standardized Equipment Specification Sheet (SESS) for both sensors and instruments as a starting place for a systematic evaluation. However, the SESS cannot cover all potential uses, hazards, or concerns. Instead, it is meant to promote a systematized way of thinking about DRI performance and physical characteristics that can prevent unfortunate accidents or help meet desired goals. (See the digital extra below for a case study demonstrating the importance of systematic evaluation.)
When performing systematic evaluation, it is valuable to consider the learning curve associated with use of novel devices. As there is no universal schema for RTDS operability, any factor can be a potential source of error. New users may be insufficiently familiar with instrument nuances ranging from basic principles such as airflow obstruction to subtle details such as correction factors or the presence of negative cross-sensitivities. Senior OEHS professionals should be particularly alert to information processing bias, such as assuming that users have “chunked” nuanced procedures into larger schema. In addition, it may be tempting to refer to “calibrating” a DRI when in fact we mean the performance of several tasks in a specific order, including exposure to a challenge analyte and appropriate associated recordkeeping. Don’t assume that any stakeholder’s understanding of specific processes matches the intended behavior.
This is especially important for extremely complex instruments, such as portable spectroscopes or stationary mass spectrometers. These instruments can require advanced training for appropriate use. RTDS deployment considerations should evaluate the need to provide training, including potential refresher training or auditing of proper use by a technical expert.
Finally, it is important to consider the user’s perception of difficulty regardless of the system’s actual complexity. Users may become frustrated if they perceive the RTDS to be too complex. Formulating a strategy to accommodate this feeling may assist in adoption.
3. INTERNET CONNECTIVITY, CYBERSECURITY, AND IOT APPLIANCES
An emerging feature in DRIs is advanced connectivity, either to a base station such as a tablet or laptop, or directly to a telemetry system connected via cellular data connection, Wi-Fi, or other means. Advanced connectivity improves the flow of data from instrument to laptop to database by eliminating manual local storage transfers, improves the retrieval rate of data (by the hour, minute, or second rather than by day or event), and integrates seamlessly with database software to enable users to more easily transport or analyze their data.
For machinery that is part of the Internet of Things (IoT), basic functionality is predicated upon internet connectivity. In some cases, this connectivity must meet reliability standards to function as intended. This mindset is baked into concepts such as real-time wireless alarms, triggered capture of visual or audio data, or other similar features that may be required for the instrument or system to be fit for purpose. The potential of these features is significant—being able to pinpoint the exact process that causes a spike in analyte can help determine if exposure controls can meet surge or upset conditions, and can help validate models for exposure assessment and related factors.
However, few OEHS professionals are well-versed in network engineering. Assuming that internet connectivity will be available is a significant potential pitfall in planning for RTDS implementation. Cellular signals can be affected by location, interfering equipment, building materials, the radio band of the selected internet service provider, or the strength of the transmitter. Wi-Fi may be difficult to obtain due to local restrictions on available bandwidth or frequency, network appliance capacity, firewall or network segmentation rules, or the need to pass data through intermediate appliances. Simply asking “Does this facility have Wi-Fi?” is not adequate for emerging systems that provide a regular data feed to a cloud host; Wi-Fi alone may not be sufficient for the RTDS to function on a local server, if one is even available. The need for radio repeaters, network infrastructure, and server space or cloud hosting to accompany the RTDS entails higher costs, additional professional assistance or skilled trades, or longer timelines. These complexities may hobble important projects or push them out of reach. OEHS professionals should consider both the hazard-facing hardware and the software that captures and communicates data.
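One common mitigation, sketched below in Python under assumed conditions, is a store-and-forward approach in which readings are buffered locally and uploaded when the connection cooperates. The upload function, its failure rate, and the sensor identifier are illustrative placeholders rather than any vendor’s telemetry API.

# A minimal store-and-forward sketch for intermittent connectivity.
# The upload step is simulated; a real RTDS would substitute its own
# telemetry client and a persistent (on-disk) buffer.
import random
from collections import deque

local_buffer = deque()  # readings held locally until the network accepts them

def try_upload(reading: dict) -> bool:
    """Pretend to send a reading to a cloud host; fails about 40% of the time."""
    return random.random() > 0.4

def record(reading: dict) -> None:
    """Always buffer locally first, then attempt to drain the buffer in order."""
    local_buffer.append(reading)
    while local_buffer:
        if try_upload(local_buffer[0]):
            local_buffer.popleft()  # confirmed delivery
        else:
            break  # network unavailable; retry on the next reading

# Simulated one-minute readings from a hypothetical carbon monoxide sensor.
for minute in range(10):
    record({"sensor": "co_room_101", "minute": minute, "ppm": round(random.uniform(1, 5), 2)})

print(f"Readings still awaiting upload: {len(local_buffer)}")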
OEHS professionals should also consider cybersecurity and data privacy as part of the equation of RTDS design and implementation. Ransomware and malicious intrusion via insecure or unmanaged software have caused untold damage to both the exploited organizations and their digital supply chains. The importance of cybersecurity cannot be overstated, and cybersecurity is as steeped in technical minutiae as industrial hygiene. OEHS professionals should include cybersecurity professionals in their evaluation of any internet-connected or IoT RTDS whenever feasible, or where identified by risk assessment. For projects that are struggling for resources, this added precaution may significantly impact the likelihood of success. OEHS professionals should seek guidance from their stakeholders, clients, or organizations before making any assumptions related to connectivity or cybersecurity.
4. INTERPRETATION OF DATA
In industrial hygiene, context is paramount to the interpretation of data. To those without experience, data interpretation might seem as simple as comparing average concentrations to identified exposure limits, but prudent practice contains many additional layers, and large-scale RTDS may have primary stakeholders who are not trained in this component of OEHS.
Before an RTDS is deployed, OEHS professionals should align stakeholders on data processing steps and inform them about aspects of prudent industrial hygiene practice such as:
• the fundamental doctrine that true exposure is never equivalent to measured or observed exposure
• the application of any agent-specific transformations, such as particle fractionation, radiation quality factors or frequencies, conversions from pure metals to oxides, occultations for analytes of inference, or other similar functions
• the evaluation of other pooled and weighted sources of sampling and analytical error
• the appropriate application of statistical confidence intervals
• the importance of variables related to production cycles or time (Tuesdays versus Wednesdays, or day shift versus night shift), product runs, feedstock, processing rates, cycles of fugitive agents, severity of exposure, and other variables that may be cyclical or idiosyncratic
• considerations of exposure group validation, lognormal model validation, and exposure decision analysis
• the variable and potentially significant impact of susceptibility of individuals with medical, experiential, or psychosocial conditions
• any uncertainty applied by the occupational exposure banding process, or resulting from partially validated or unvalidated methods, or the absence of established exposure limits
If the opportunity to provide stakeholders with this context does not arise, the OEHS professional should be prepared to point out which column, exactly, contains the data that would be used to refute or substantiate a compliance fine, and which formulae are used to make such a determination. It is important to be transparent about any calculations or interpretive notes critical to the disposition of the data. As an example, noise data collected when key noise-producing processes are not running should be marked as atypically low. While scientific uncertainty may be difficult for stakeholders to grasp, simple ways of conveying integrated uncertainty, including box-and-whisker plots, diverging line graphs, and odds ratios, may help frame the results of exposure decisions.
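As one illustration of the kind of summary statistics that can accompany RTDS results, the Python sketch below computes a geometric mean, geometric standard deviation, estimated 95th percentile, and exceedance fraction. It assumes the readings have already been judged lognormally distributed and that the exposure-group and model validation noted above have been performed; the readings and the exposure limit are hypothetical.

# A minimal sketch under a lognormal assumption; all values are hypothetical.
import math
import statistics

readings_ppm = [0.12, 0.18, 0.09, 0.22, 0.15, 0.30, 0.11, 0.19, 0.25, 0.14]
oel_ppm = 0.5  # hypothetical occupational exposure limit

logs = [math.log(x) for x in readings_ppm]
mu = statistics.mean(logs)
sigma = statistics.stdev(logs)

gm = math.exp(mu)      # geometric mean
gsd = math.exp(sigma)  # geometric standard deviation

# Estimated 95th percentile of the fitted lognormal distribution (z = 1.645).
p95 = math.exp(mu + 1.645 * sigma)

# Estimated fraction of periods expected to exceed the limit.
z = (math.log(oel_ppm) - mu) / sigma
exceedance = 0.5 * math.erfc(z / math.sqrt(2))

print(f"GM = {gm:.3f} ppm, GSD = {gsd:.2f}")
print(f"Estimated 95th percentile = {p95:.3f} ppm")
print(f"Estimated exceedance fraction above {oel_ppm} ppm = {exceedance:.1%}")

More rigorous treatments, such as the Bayesian decision analysis listed under Resources, account for sampling error rather than treating these point estimates as exact.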
Finally, the OEHS professional should make provisions for memorializing important conclusions and the data used to substantiate them. All data interpretation contains some bias, but defense by reproducibility is only possible when those biases are identified and disclosed. Memorialization is especially important where staff turnover or other factors may impact the length of institutional memory.
5. ALIGNING ALARMS WITH SPECIFIC ACTIONS
The process of setting alarms for real-time detection systems is at least as complex as establishing a properly contextualized, appropriately framed exposure decision. This subject is the topic of much ongoing discussion among regulators and professionals.
The purpose of alarms is to prompt action from machines, systems, or, more commonly, people. Alarms call attention to a change in conditions that warrants or precedes some response such as activation of a control, cessation of a process, or acknowledgment of an anticipated detection.
A scenario can be envisioned where an elaborate RTDS is deployed at great expense and effort, only for the first alarm to be met with uncertainty about the appropriate response. This situation can lead to questions about the accuracy of inconvenient or impactful detections, the appropriateness of the instrument, or any number of potentially confounding factors. A worse outcome would be blatant disregard of the alarm.
To avoid these difficulties, OEHS professionals should be prepared to substantiate the alarm limits; pursue stakeholder alignment; associate highly specific, terminable actions with each alarm; and link these actions to an individual or role (such as a confined space attendant or process monitor). As a simple example: “When the low alarm horn and light is activated, all employees leave the room. Then, the production supervisor may silence the horn. Employees may return to their stations when the alarm light shuts off.” The objective is to provide an entire logical loop to all stakeholders, including the operator of the RTDS, the individual responding to its signals, and those responsible for downstream tasks such as maintenance or emergency response.
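One way to make this logic loop explicit is to record it in a structured form that both the people responding and the RTDS software can reference. The Python sketch below is illustrative only; the alarm names, actions, and roles are assumptions patterned on the example above rather than a prescribed scheme.

# A minimal sketch of an alarm-to-action map; names and roles are illustrative.
from dataclasses import dataclass

@dataclass
class AlarmResponse:
    action: str            # the specific, terminable action to take
    responsible_role: str  # who performs or verifies the action
    all_clear: str         # the condition that ends the response

ALARM_PLAN = {
    "co_low": AlarmResponse(
        action="All employees leave the room; the production supervisor may silence the horn.",
        responsible_role="Production supervisor",
        all_clear="Alarm light shuts off; employees return to their stations.",
    ),
    "co_high": AlarmResponse(
        action="Evacuate the area to the designated upwind muster point.",
        responsible_role="Area emergency coordinator",
        all_clear="Re-entry announced after instrument readings are verified.",
    ),
}

def respond(alarm_name: str) -> None:
    plan = ALARM_PLAN.get(alarm_name)
    if plan is None:
        raise ValueError(f"No predetermined response for alarm '{alarm_name}'")
    print(f"[{alarm_name}] {plan.responsible_role}: {plan.action}")
    print(f"[{alarm_name}] All clear when: {plan.all_clear}")

respond("co_low")

Writing the plan down in this form also makes gaps obvious: an alarm with no entry in the table is an alarm with no agreed-upon response.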
The purpose of associating specific actions with specific signals is to eliminate uncertainty and ensure the action is performed in accordance with established procedure. This may be especially important on multi-entity work sites or where time is a significant factor.
Caution should be exercised to ensure that all stakeholders understand the potential impact of these specific actions. All actions should be both necessary and feasible. While minor adjustments may be necessary, identifying the principles behind the predetermined response action is best completed before the alarm is triggered. Room for judgment should be provided where appropriate; a common example is the provision of a windsock to determine the appropriate muster point in the event of an evacuation stemming from agents that may migrate according to the wind direction. However, a balance must be struck between simplicity and flexibility.
Finally, appropriate response to alarms assumes that the RTDS is properly functioning. Maintenance, drills, or other measures can help ensure proper operation and stakeholder confidence. False alarms or nuisance signaling may lead to counterproductive behaviors, such as sabotage of annunciators that some stakeholders deem too annoying to abide. Never underestimate the pull of comfort or convenience, even at the risk of potentially catastrophic health effects, especially when it is combined with discontent, fatigue, or a perception that the alarm-setting authority is incompetent.
RECOMMENDATIONS
OEHS professionals should review the DRI Technical Framework to understand the key competencies appropriate for users with varying levels of knowledge. Additionally, they should perform systematic evaluation of potential sensors, instruments, and systems to assist in technical alignment on connectivity and data interpretation prior to deployment. Transparency and memorialization should be considered and treated appropriately given the context of use. All alarm settings should prompt a specific action in accordance with a complete logic loop.
SPENCER PIZZANI, CIH, is the occupational health manager for PepsiCo Global EHS. His professional passions include sensor technology, genetic susceptibilities, and accessibility of industrial hygiene technical information.
MARINA JABSKY is the industrial hygienist for NYCOSH, a nonprofit focused on empowering labor and community organizations to create safer workplaces. She is passionate about equity in safety and health, effective communication, and accessible knowledge, and leaving situations in a better condition than she found them.
The authors acknowledge Steve Jahn, Bob Henderson, Paul Wambach, and Emanuele Cauda of the AIHA RTDS Committee and Big Data Working Group for their continued assistance.
Dead Banding: A Case Study
The Importance of a Systematic DRI Evaluation
By Spencer Pizzani and Marina Jabsky
An industrial hygiene consultant was retained by a chemical processing client to evaluate hydrofluoric acid (HF) exposures during a pilot batching process for a novel material. The client wanted to establish a baseline exposure evaluation to ensure existing engineering controls were effective. The exposure was highly dynamic; the client chemical operator needed to traverse a limited enclosure and would spend substantial but uncertain periods of time outside of the enclosure. Both an HF DRI and integrated analytical sampling were used to determine if additional controls were necessary.
HF presents an exposure risk with potentially severe health effects. Chemically protective clothing and respiratory protection were worn for the process pilot by both the client chemical operator and the consultant. A protocol for work stoppage was established for escalating detectable concentrations of HF.
The consultant reviewed the DRIs currently available for HF. The instrument selected was usable within the enclosure and had a limit of quantification (LOQ) below 50 percent of the TLV, allowing sufficient warning to stop the process. The TLV was an eight-hour time-weighted average (TWA) of 0.5 ppm, with a short-term exposure limit (STEL) of 2 ppm applied as a ceiling. The consultant reviewed the sensor specification and drew the following conclusions:
  • The resolution was 0.09 ppm (presumably displayed as 0.1 ppm), calculated as 1 percent of the 9 ppm full scale.
  • The range was 0–9 ppm.
  • The accuracy was ±0.45 ppm, which was potentially concerning.
  • The LOQ was 0.1 ppm. (This conclusion was in error, as the instrument manufacturer does not list a limit of detection (LOD) or LOQ in its specification.)
During systematic review, the consultant mistakenly assumed that the LOQ would be equal to the lowest potential resolution within the range, or 0.09 ppm (0.1 ppm based on presumed rounding), with a potential accuracy of between ±0.01 ppm and ±0.5 ppm. While the accuracy is of potential concern because this value may result in a wide range of uncertainty, no instrument with better range, resolution, or accuracy was identified. The chemical processing client was informed of the instrument to be used and provided approval based on full knowledge of the review. Both the consultant and the client’s technical lead were experienced and certified in industrial hygiene.
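The consultant’s spec-versus-criteria reasoning can be expressed as a short worked check. The Python sketch below uses the values reported in this case study; the 50-percent-of-TLV criterion reflects the consultant’s selection logic rather than a universal requirement.

# A minimal sketch of the systematic spec-versus-criteria check.
# Values come from the case study; the criteria reflect the consultant's assumptions.
tlv_twa_ppm = 0.5

criteria = {
    "max_loq_ppm": 0.5 * tlv_twa_ppm,  # LOQ must fall below 50% of the TLV (0.25 ppm)
}

spec = {
    "range_ppm": (0.0, 9.0),
    "resolution_ppm": 0.09,   # 1 percent of 9 ppm full scale
    "accuracy_ppm": 0.45,
    "assumed_loq_ppm": 0.09,  # assumed equal to resolution; not published by the manufacturer
}

print("Assumed LOQ below 50% of TLV:", spec["assumed_loq_ppm"] <= criteria["max_loq_ppm"])
print("Accuracy band relative to TLV:", f"±{spec['accuracy_ppm']} ppm vs. TLV-TWA {tlv_twa_ppm} ppm")
print("TLV-TWA within instrument range:", spec["range_ppm"][0] <= tlv_twa_ppm <= spec["range_ppm"][1])

As the rest of the case shows, a check like this is only as reliable as the published specification behind it.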
Simultaneous integrated sampling was completed via NIOSH Method 7906, Particulate Fluorides and Hydrofluoric Acid by Ion Chromatography. The AIHA-accredited laboratory provided a reporting limit of 4 µg. Monitoring was completed successfully using the DRI and NIOSH 7906. No irritation was reported by the operator or consultant. The DRI did not indicate any detection of HF; all readings were a flat 0.0 ppm.
Upon receiving the analytical laboratory report, the consultant was surprised to find that the laboratory results indicated a TWA concentration of HF equivalent to 0.23 ppm. This was below the TLV-TWA and TLV-STEL for the period sampled, but the absence of any non-zero detection on the tested instrument caused concern that the instrument provided a false negative for a potentially highly hazardous exposure. Initial review indicated that 0.23 ppm was within the 0.5 ppm potential maximum accuracy range of the instrument. However, because the operator spent substantial time away from the enclosure, the presumed concentration within the enclosure was above the laboratory-identified TWA.
The consultant contacted the instrument manufacturer for more assistance in evaluating the unexpected performance of the instrument. The instrument manufacturer indicated that the instrument had a substantial dead band that was greater than the TLV value of 0.5 ppm TWA, but less than the TLV-STEL of 2 ppm. In sensor technology, a dead band is a programmed behavior for a range of input signals that results in an absence of response. Essentially, while the HF sensor was supplying signal indicating the detection of the analyte, the instrument was programmed not to display these signals as concentration while the signal was within the range of the dead band.
Dead bands have legitimate uses; the most common one is to improve user confidence by smoothing slight sensor drift around zero (known commonly in instrument response discussions as “noise”). Users have more confidence in an instrument that displays a steady zero than one that rapidly fluctuates between (for example) -0.01 and 0.02 ppm of ammonia. Dead bands around zero are acceptable practice if sensor response is still accurate during actual exposure to analyte. The degree of acceptable dead banding is therefore necessarily a function of fitness for purpose—and in this case would be dictated by the exposure dynamics and TLV values.
As a result of the DRI dead band, the instrument would not be expected to respond (and the display would not be expected to change from 0.0 ppm) until concentrations were higher than the TLV-TWA. This dead band was not identified in the manufacturer’s published sensor specification.
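To illustrate the effect, the short Python sketch below models a display function with a dead band. The 0.6 ppm width is an assumption made for the example; the manufacturer indicated only that the actual dead band exceeded the 0.5 ppm TLV-TWA and was below the 2 ppm TLV-STEL.

# A minimal sketch of how a dead band can mask real signal on the display.
# The 0.6 ppm dead-band width is assumed for illustration.
def displayed_concentration(sensor_signal_ppm: float, dead_band_ppm: float = 0.6) -> float:
    """Return what the instrument would show for a given true sensor signal."""
    if abs(sensor_signal_ppm) < dead_band_ppm:
        return 0.0  # signal suppressed while inside the dead band
    return round(sensor_signal_ppm, 1)  # otherwise displayed at instrument resolution

for true_ppm in [0.00, 0.23, 0.45, 0.55, 0.70, 2.10]:
    print(f"true {true_ppm:.2f} ppm -> display {displayed_concentration(true_ppm):.1f} ppm")

Under this assumed behavior, the 0.23 ppm TWA reported by the laboratory, and even concentrations at the TLV-TWA itself, would still read as 0.0 ppm.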
While the operator’s exposure was ultimately in conformance with the TLV (and PPE/respiratory protection was worn as a precaution), it’s possible to imagine a scenario in which the instrument was relied upon for exposure assessment against the TLV-TWA without respiratory protection.
RESOURCES
AIHA: “Technical Framework: Guidance on Use of Direct Reading Instruments” (May 2022).
Journal of Occupational and Environmental Hygiene: “Rating Exposure Control Using Bayesian Decision Analysis” (October 2006).