Human error is an action that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] Human error has been cited as a primary cause of, and contributing factor in, disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation, space exploration (e.g., the Space Shuttle Challenger and Space Shuttle Columbia disasters), and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.
Definition
Human error refers to something having been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] In short, it is a deviation from intention, expectation or desirability.[1] Logically, human actions can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate (leading to mistakes); or the plan can be satisfactory, but the performance can be deficient (leading to slips and lapses).[2][3] However, a mere failure is not an error if there was no plan to accomplish something in particular.[1]
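This two-way distinction can be written out as a small decision rule. The following Python sketch is purely illustrative of the taxonomy described above; the predicates and labels are hypothetical, not part of any standard model or library:

```python
from enum import Enum

class Outcome(Enum):
    """Illustrative labels for the taxonomy sketched above."""
    NOT_AN_ERROR = "no prior plan, so a failure is not an error"
    MISTAKE = "actions went as planned, but the plan was inadequate"
    LAPSE = "adequate plan, but a step was forgotten (memory failure)"
    SLIP = "adequate plan, but the action was executed wrongly"
    SUCCESS = "adequate plan, executed as intended"

def classify(had_plan: bool, plan_adequate: bool,
             step_remembered: bool, executed_as_intended: bool) -> Outcome:
    # A mere failure is not an error if there was no plan to accomplish
    # something in particular.
    if not had_plan:
        return Outcome.NOT_AN_ERROR
    # The actions go as planned, but the plan is inadequate -> mistake.
    if not plan_adequate:
        return Outcome.MISTAKE
    # The plan is satisfactory, but the performance is deficient -> lapse or slip.
    if not step_remembered:
        return Outcome.LAPSE
    if not executed_as_intended:
        return Outcome.SLIP
    return Outcome.SUCCESS

# Example: flipping the wrong switch despite a sound, fully recalled plan is a slip.
print(classify(had_plan=True, plan_adequate=True,
               step_remembered=True, executed_as_intended=False))
```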
Performance
Human error and performance are two sides of the same coin: "human error" mechanisms are the same as "human performance" mechanisms, and performance is categorized as "error" only in hindsight.[3][4] Actions later termed "human error" are therefore part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behavior. While human error is firmly entrenched in the classical approaches to accident investigation and risk assessment, it has no role in newer approaches such as resilience engineering.[5]
Categories
There are many ways to categorize human error:[6][7]

- exogenous versus endogenous error (i.e., originating outside versus inside the individual)[8]
- situation assessment versus response planning[9] and related distinctions in:
  - error in problem detection (also see signal detection theory; a numeric sketch follows this list)
  - error in problem diagnosis (also see problem solving)
  - error in action planning and execution[10] (for example: slips or errors of execution versus mistakes or errors of intention[11][3])
- by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational
- physical manipulation error:[12]
  - 'slips', occurring when the physical action fails to achieve the immediate objective
  - 'lapses', involving a failure of one's memory or recall
- active error - an observable, physical action that changes equipment, system, or facility state, resulting in immediate undesired consequences
- latent human error - a hidden organization-related weakness or equipment flaw that lies dormant; such errors can go unnoticed at the time they occur, having no immediate apparent outcome
- equipment dependency error - lack of vigilance due to the assumption that hardware controls or physical safety devices will always work
- team error - lack of vigilance created by the social (interpersonal) interaction between two or more people working together
- personal dependencies error - unsafe attitudes and traps of human nature leading to complacency and overconfidence
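For the problem-detection category above, signal detection theory gives the standard quantitative treatment: an operator's sensitivity can be summarized as d' = z(hit rate) - z(false-alarm rate), where z is the inverse of the standard normal CDF. A minimal sketch, using only the Python standard library and hypothetical counts from an imagined monitoring task:

```python
from statistics import NormalDist

def d_prime(hits: int, misses: int,
            false_alarms: int, correct_rejections: int) -> float:
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return z(hit_rate) - z(fa_rate)

# Hypothetical shift log: 90 of 100 genuine alarms detected,
# 10 false alarms raised over 100 normal intervals.
print(round(d_prime(hits=90, misses=10,
                    false_alarms=10, correct_rejections=90), 2))  # ~2.56
```

Higher d' means the operator discriminates real problems from noise more reliably; detection errors then appear as distinct misses and false alarms rather than as a single undifferentiated error count.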
Sources
The cognitive study of human error is a very active research field, including work related to limits of memory and attention and also to decision-making strategies such as the availability heuristic and other cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but can lead to systematic patterns of error.
Misunderstandings as a topic in human communication have been studied in conversation analysis, such as the examination of violations of the cooperative principle and Gricean maxims.
Organizational studies of error or dysfunction have included studies of safety culture. One technique for analyzing complex systems failure that incorporates organizational analysis is Management Oversight and Risk Tree (MORT) analysis.[13][14][15]
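MORT itself is a detailed analytic chart, but the tree logic underlying such techniques can be shown generically. The sketch below is a minimal AND/OR fault-tree evaluator with hypothetical events and probabilities, not MORT's actual chart or notation; it assumes independent basic events:

```python
from dataclasses import dataclass
from math import prod
from typing import Union

@dataclass
class Event:
    """A basic event with an (assumed independent) failure probability."""
    p: float

@dataclass
class Gate:
    """AND: all inputs must fail; OR: any single failing input suffices."""
    kind: str    # "AND" or "OR"
    inputs: list

Node = Union[Event, Gate]

def probability(node: Node) -> float:
    if isinstance(node, Event):
        return node.p
    ps = [probability(child) for child in node.inputs]
    if node.kind == "AND":
        return prod(ps)
    # OR gate: one minus the probability that no input fails.
    return 1.0 - prod(1.0 - p for p in ps)

# Hypothetical top event: an incident requires a latent organizational
# weakness AND (an operator slip OR an equipment fault).
tree = Gate("AND", [
    Event(p=0.05),               # latent weakness present
    Gate("OR", [
        Event(p=0.02),           # operator slip
        Event(p=0.01),           # equipment fault
    ]),
])
print(f"P(top event) = {probability(tree):.5f}")  # 0.05 * (1 - 0.98 * 0.99)
```

Even this toy tree shows the organizational emphasis of MORT-style analysis: halving the latent-weakness probability halves the top event, while halving either individual branch of the OR gate reduces it by less.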
Controversies
Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon.[16][17] A focus on the variability of human performance, and on how human operators (and organizations) can manage that variability, may be a more fruitful approach. Newer approaches, such as the resilience engineering mentioned above, highlight the positive roles that humans can play in complex systems. In resilience engineering, successes (things that go right) and failures (things that go wrong) are seen as having the same basis, namely human performance variability. A specific account of this is the efficiency-thoroughness trade-off (ETTO) principle,[18] which can be found at all levels of human activity, in individuals as well as in groups.
References
1. Senders, J.W. and Moray, N.P. (1991). Human Error: Cause, Prediction, and Reduction. Lawrence Erlbaum Associates, p. 25. ISBN 0-89859-598-3.
2. Hollnagel, E. (1993). Human Reliability Analysis: Context and Control. Academic Press. ISBN 0-12-352658-2.
3. Reason, James (1990). Human Error. Cambridge University Press. ISBN 0-521-31419-4.
4. Woods, 1990.
5. Hollnagel, E., Woods, D.D. and Leveson, N.G. (2006). Resilience Engineering: Concepts and Precepts. Aldershot, UK: Ashgate.
6. Jones, 1999.
7. Wallace and Ross, 2006.
8. Senders and Moray, 1991.
9. Roth et al., 1994.
10. Sage, 1992.
11. Norman, 1988.
12. DOE-HDBK-1028-2009 (https://www.standards.doe.gov/standards-documents/1000/1028-BHdbk-2009-v1/@@images/file).
13. Rasmussen, Jens; Pejtersen, Annelise M.; Goodstein, L.P. (1994). Cognitive Systems Engineering. John Wiley & Sons. ISBN 0471011983.
14. "The Management Oversight and Risk Tree (MORT)". International Crisis Management Association. Archived from the original on 27 September 2014. Retrieved 1 October 2014.
15. Entry for MORT on the FAA Human Factors Workbench.
16. Hollnagel, E. (1983). "Human Error". Position paper for the NATO Conference on Human Error, August 1983, Bellagio, Italy.
17. Hollnagel, E. and Amalberti, R. (2001). "The Emperor's New Clothes, or Whatever Happened to 'Human Error'?" Invited keynote presentation at the 4th International Workshop on Human Error, Safety and System Development, Linköping, June 11-12, 2001.
18. Hollnagel, Erik (2009). The ETTO Principle: Efficiency-Thoroughness Trade-Off: Why Things That Go Right Sometimes Go Wrong. Farnham, England; Burlington, VT: Ashgate. ISBN 978-0-7546-7678-2. OCLC 432428967.