Editor’s note: Barry Beith, Ph.D., is president of HumanCentric Technologies in Cary. The company focuses on human factors and usability in its approach to design and evaluation.

Between the Columbia tragedy and the medical events surrounding the heart-lung transplant of Jesica Santillan at Duke University Medical Center, we have recently been reminded all too well of the impact human error can have when human beings must work together with complex systems and technologies.
Human beings are the strongest and weakest variable in any complex system. The strongest because we are versatile, adaptable, learning beings equipped with the most advanced computer ever created. The weakest because along with our adaptability is our variability. For the same reasons that even after years of practice and repetition, we cannot be assured of hitting a golf ball the same way, we also err in other situations where we have never erred before.
Such human errors can arise from many sources: memory lapses, misperceptions, and incorrect interpretations. We can commit slips, which are unintentional, or mistakes, which are intentional, i.e., when we do what we think is the right thing but are wrong.
Unfortunately, human error is always caught by 20/20 hindsight, proffered as an explanation of cause, and left as a hollow reminder that human beings cannot be expected to be perfect. While human error is all too often an explanation, it is never a solution. Yet most often we go forward shaking our heads at the sad but inevitable fallibility of people or, worse yet, with a deceptively comforting bandage fix and a promise that it will never happen again. Eventually the bandage wears out, the vivid lesson grows dim with time, and we set ourselves up for another error.
Don’t forget the human factor
Human beings play an essential role in technology, and the ultimate solution to human error is not to remove the human from the system and separate people from technologies. As technology advances, we can never forget the human factor.
We must address human weaknesses more effectively while facilitating human strengths. This is best done through system design, which is the focus and driving motivation behind the science and practice of Human Factors and Ergonomics. We must integrate ongoing processes for identifying, eliminating, reducing, and managing human errors in such a way that detection, reduction of impact, and recovery are optimized.
The effort to avoid human errors is non-trivial, given that most errors result from multiple low-probability events, circumstances, and failures occurring at or near the same point in time, and even in specific sequences. Very often there are “latent” design flaws in the system, of which the human being is unaware, that contribute to errors. For example, a motorist unfamiliar with an intersection may approach it after the stop sign has been knocked down in an earlier incident. The result is an accident, because the driver did not know to yield to the cross traffic.
Add to this a blind corner, an unfamiliar car, an inexperienced driver, etc., and the potential for driver error becomes clear. The point is that the alignment of these various elements is difficult to foresee and often difficult to determine even in the glaring light of 20/20 hindsight. It is for this reason that the events, circumstances, and failures must often be addressed independently.
For example, to prevent a fire, one must eliminate one of the three elements required to start it: oxygen, fuel, or an ignition source. If any one of these is absent, the fire cannot occur. This analogy has relevance to addressing human errors as well, but it must be applied programmatically to the complex system.
Remembering the other ‘victims’
When human errors lead to tragedies, we sometimes forget that there are many “victims”: not only the obvious ones who suffer injury or death, and their loved ones, but also the professionals or other human beings who are left to realize that they played an unwanted and inadvertent role in the alignment of factors contributing to the error. These are people for whom all their training, experience, and good intentions are swallowed in the pain and remorse of an event they never would have wanted and cannot take back.
Proactive programs to address human errors must be undertaken by the executives who run complex systems, for the sake of all the victims of such incidents.
Human factors professionals can help immensely in this undertaking. Such programs must be initiated and sustained, despite the difficulty of keeping up the effort. This is why design solutions have the greatest impact and the longest-lasting effect: the solutions become “built in” to the system. When solutions instead focus on the human being, as in selection, training, exhortations to “safe” behavior, or finger-pointing and penalty-based blame, the absence of human error is temporary, and the longer the system goes without an error, the more likely such programmatic “human” solutions are to fail.
It is worth noting that the longer California goes without an earthquake, the higher the probability that one will occur within a specified period of time. Although human error behaves analogously, we tend to believe that the longer we go without an error, the less likely one is to occur. Ironically, this in and of itself is a serious error in judgment.
The bottom line of the recent tragedies is that humans and technology will always interact, and therefore human errors will always be a potential factor in some unforeseen way until we focus on the human side of technology and realize that to “err is human; to forgive, design.”
Note: Local Tech Wire welcomes submission of articles or letters addressing issues that concern the high tech and life science industries. Please send your thoughts to Managing Editor Rick Smith at email@example.com