“Human Error” is the Scapegoat for Systemic and Organizational Failures


By Mica Endsley, Ph.D.

 

When a major accident or disaster happens, the headlines splashed across news outlets are all too familiar, and they often shift the focus away from where it belongs. Earlier this year, one of Greece’s worst-ever rail disasters was attributed to “tragic human error”, according to the country’s prime minister. Recently, the Times of India reported: “Human error behind most rail accidents in last three decades”.

Pointing a finger at the human operator is nothing new. In fact, this approach is often applied across a wide spectrum of industries, including automotive, space exploration, aviation, and health care, where some 60-80% of accidents are blamed on the operator. But these statistics do not tell the full story, or even necessarily the right one.

Uncovering the True Risks

The fact is human error often is not the fault of the human. Although statistics provide some insight, they do not always present a complete or accurate picture. Recently, Jennifer Homendy, the head of the National Transportation Safety Board, publicly urged the National Highway Traffic Safety Administration (NHTSA) to remove a report from their website claiming that 94% of car crashes result from human error. Homendy explained her reasoning in a tweet, stating, “Simply put: It’s not true. Crashes are more complex than that, and we need to understand all those factors to stand a chance at reducing traffic deaths in the United States.”

When automobile accidents happen, it’s easy to point the finger at the driver, but this overlooks various risk factors at play, such as weather conditions, poorly designed roads, or interference from in-vehicle technologies. Blaming the person behind the wheel doesn’t fix the underlying systemic issues that contribute to accidents. We need to focus on addressing those issues instead to improve safety.

For example, researchers at Johns Hopkins, who reported that 250,000 deaths per year in the U.S. are due to medical error, making it the third leading cause of death, alluded to this by cautioning readers to look beyond the numbers. They stated that most medical errors aren’t due to inherently bad doctors but are instead due to systemic problems.

Often, the root causes of accidents occur well before the operator is anywhere near the scene of the accident. During the design phase, flaws can be introduced when designers do not take the human operator’s capabilities and limitations into account, setting the stage where “errors” are likely. Bad design encourages accidents; good design prevents accidents.

For example, when Boeing introduced the 737 MAX 8, it failed to build in needed redundancy in sensor information or to provide displays that would inform pilots of what its newly implemented automation system was doing. These problems directly contributed to two major accidents that killed 346 people.

Automated systems are often introduced with the promise that they will “reduce human error”. In fact, automation frequently introduces new problems and creates new kinds of errors and accidents. In the case of the recent rail accident in Greece, for example, it appears that a faulty automation system misrouted the passenger train onto tracks occupied by another train. Between 1972 and 2013, 26 aircraft accidents were attributed to automation, with pilots significantly challenged to understand what the automation was doing and to interact with it correctly in time to avoid the resulting accident.

These same issues are coming into play in many new automobiles that automate vehicle control. Some 736 crashes and 17 fatalities have involved Tesla’s Autopilot mode. While some people believe that drivers should be able to detect and prevent accidents when automation fails to perform correctly, a large body of research conducted over the past 40 years shows that they frequently cannot.

Automation causes people to become “out-of-the-loop”.  Their attention easily wanders to other tasks, and they become less vigilant. Even when they are focused on what the automation is doing, they often have trouble understanding what it is doing or knowing what it will do next.  The extra time required to determine if the automation is performing correctly or if an intervention is needed, and what that intervention should be, can be deadly.  Simply stating that “drivers should monitor the automation” does not solve this fundamental problem.

Designing for Humans

The solution to the problem of human error is not to punish people or to try to replace them with automation. The solution is to design technologies that actually enhance human performance, rather than technologies that fail to account for the needs of the people who must interact with them.

Solving such systemic design challenges is the primary calling of an entire field called Human Factors and Ergonomics. This discipline conducts scientific research on human abilities, characteristics, and limitations and applies that knowledge to the design of equipment, jobs, systems, and operational environments in order to promote safe and effective human performance. Its goal is to support the ability of people to perform their jobs safely and efficiently, thereby improving the overall performance of the combined human-technology system. The Human Factors and Ergonomics Society in the United States has more than 3,000 members who are employed in industry, government, and academic research.

Human Factors and Ergonomics replaces a misplaced emphasis on blaming the operator or over-reliance on training and instead creates systematic improvements in human performance through improved system design. While training is important, it cannot overcome poor system designs in the long run. People are still likely to make the same types of errors if the system design is not consistent with human capabilities and limitations. Further, a well-designed system that is consistent with a user’s needs and is easier to operate is also easier to train, potentially reducing training requirements as well as improving human performance.

The practice of Human Factors and Ergonomics is based on scientifically derived data on how people perceive, think, move, and act, particularly when interacting with technology. The way that technology is designed significantly affects the performance of the people who interact with it. The efficiency and accuracy of human performance can be greatly improved when technology has a user-centered interface designed to accommodate basic human capabilities.

User-centered systems make it easy to navigate through screens, for example, and implement features that reduce the chances of common human errors, such as error tolerance and error resistance. They also focus on presenting information effectively, so that people better comprehend essential information even when workload is high, resulting in much higher levels of performance. Conversely, complex technologies that require paging through many screens or rely on difficult-to-read displays are much more prone to error. If significant effort is required to understand a constantly changing situation, the likelihood of human error increases greatly.
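To make error tolerance and error resistance concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not drawn from any system discussed in this article: a toy document store whose delete operation requires explicit confirmation (error resistance, so a stray click cannot destroy work) and is reversible via a trash area (error tolerance, so a slip that does happen can be undone).

```python
# A minimal sketch of two user-centered design ideas:
#  - error resistance: make a destructive action hard to trigger by accident
#  - error tolerance: make the action recoverable when it does happen
# Hypothetical example for illustration only.

class DocumentStore:
    """Toy document store whose delete operation is designed around
    predictable human slips rather than blaming the user for them."""

    def __init__(self):
        self.docs = {}    # name -> content
        self.trash = {}   # soft-deleted items kept for recovery

    def add(self, name, content):
        self.docs[name] = content

    def delete(self, name, confirm=False):
        # Error resistance: deletion requires explicit confirmation,
        # so a stray keystroke cannot trigger it.
        if not confirm:
            return f"'{name}' not deleted: confirmation required"
        # Error tolerance: the item moves to trash instead of being
        # destroyed, so the action remains reversible.
        self.trash[name] = self.docs.pop(name)
        return f"'{name}' moved to trash"

    def undo_delete(self, name):
        # Recovery path: restore a soft-deleted document.
        self.docs[name] = self.trash.pop(name)
        return f"'{name}' restored"
```

Neither feature depends on the user being careful; both are properties of the system design, which is the point the paragraph above makes.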

User-centered design approaches are also highly important when automation is introduced. Rather than reducing the need to consider the human operator, automation increases the difficulty of the operator’s job and makes it even more important to consider how the design will affect human performance. Human factors research shows, for example, that when displays make the behavior and capabilities of the automation more transparent to the operator, the ‘out-of-the-loop’ problem can be greatly reduced. Making people more situationally aware by increasing their level of engagement with the task has also been shown to be important.

Automation and artificial intelligence are being introduced in technologies all around us. Paradoxically, rather than eliminating human error, they often create new problems. Getting the benefit of these new technologies requires an increased emphasis on good human factors design and on careful user testing prior to deployment. People have never been more important.

About the Author

Dr. Mica Endsley is the Government Relations Chair for the Human Factors & Ergonomics Society (HFES). She’s also president of SA Technologies, a situational awareness research, design, and training firm. She was formerly the Chief Scientist of the United States Air Force.
