investigators with a user-friendly, common sense framework that allows accident investigations
to be conducted and human causal factors classified. Three levels of failure involving the human
component are presented. These are unsafe supervision, unsafe conditions of operators, and
unsafe acts that operators commit. The framework directly incorporates Reason’s classification
of unsafe acts. Three basic error types are incorporated. The first are slips, which are
characteristic of attentional failures. The second are lapses, which stem from memory failures.
The third are mistakes, which are defined as intentional behavior that does not produce the
desired outcome. Mistakes can be further broken down as either rule-based or knowledge-based,
as described in Rasmussen's model. A framework was developed to break down unsafe
conditions of the operator. Substandard conditions of the operator are divided into three
categories. These include adverse physiological states, adverse mental states, and physical and/or
mental limitations. Substandard practices of the operator are also broken down into three
categories. These categories are mistakes-misjudgments, crew resource mismanagement, and
readiness violations. A framework for unsafe supervision is also developed. One dimension of
this framework deals with unforeseen unsafe supervision. Examples of this are unrecognized
unsafe operations, inadequate documentation and procedures, and inadequate design. The other
dimension deals with known unsafe supervision. Examples of this include inadequate
supervision, planned inappropriate operations, failure to correct known problems, and
supervisory violations. The usefulness of this cause-oriented taxonomy was demonstrated by
applying it to a military aviation accident.
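For readers who wish to apply this taxonomy in their own analysis tools, the category structure
summarized above can be captured directly as data. The following Python sketch is illustrative
only: the category names are taken from this summary rather than from any official coding
scheme, and the encoding is one possible representation, not part of the original framework.

# Illustrative sketch only: one way to encode the unsafe-act and operator-condition
# categories summarized above. Names follow this annotation, not an official schema.
from enum import Enum


class UnsafeAct(Enum):
    SLIP = "attentional failure"                    # correct intention, faulty execution
    LAPSE = "memory failure"                        # omitted or forgotten step
    RULE_BASED_MISTAKE = "wrong rule applied"       # intentional act, misapplied rule
    KNOWLEDGE_BASED_MISTAKE = "faulty problem solving"  # intentional act, poor mental model


OPERATOR_CONDITIONS = {
    "substandard conditions": [
        "adverse physiological states",
        "adverse mental states",
        "physical and/or mental limitations",
    ],
    "substandard practices": [
        "mistakes-misjudgments",
        "crew resource mismanagement",
        "readiness violations",
    ],
}

UNSAFE_SUPERVISION = {
    "unforeseen": [
        "unrecognized unsafe operations",
        "inadequate documentation and procedures",
        "inadequate design",
    ],
    "known": [
        "inadequate supervision",
        "planned inappropriate operations",
        "failure to correct known problems",
        "supervisory violations",
    ],
}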
Shappell, S. A., & Wiegmann, D. A. (2000). The human factors analysis and classification
system (HFACS) (Report Number DOT/FAA/AM-00/7). Washington, DC: Federal Aviation
Administration.
The Human Factors Analysis and Classification System (HFACS) was originally developed for
the U.S. Navy and Marine Corps as an accident investigation and data analysis tool. Since its
original development however, HFACS has been employed by other military organizations (e.g.,
U.S. Army, Air Force, and Canadian Defense Force) as an adjunct to preexisting accident
investigation and analysis systems. To date, the HFACS framework has been applied to over
1,000 military aviation accidents yielding objective, data-driven intervention strategies while
enhancing both the quantity and quality of human factors information gathered during accident
investigations. Other organizations such as the FAA and NASA have also explored the use of
HFACS as a complement to preexisting systems within civil aviation in an attempt to capitalize
on gains realized by the military. Specifically, HFACS draws upon Reason’s (1990) concept of
latent and active failures and describes human error at each of four levels of failure: 1) unsafe
acts of operators (e.g., aircrew), 2) preconditions for unsafe acts, 3) unsafe supervision, and 4)
organizational influences. The manuscript provides a detailed description and examples of each of
these categories.
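As a companion to the description above, the four HFACS failure levels can be treated as a fixed
vocabulary when coding causal factors from an investigation. The sketch below is a hypothetical
illustration; the CausalFactor class and its fields are assumptions made for demonstration, not an
interface defined by Shappell and Wiegmann.

# Illustrative sketch only: the four HFACS failure levels named above, with a
# hypothetical helper for tagging causal factors during accident coding.
from dataclasses import dataclass

HFACS_LEVELS = (
    "unsafe acts of operators",
    "preconditions for unsafe acts",
    "unsafe supervision",
    "organizational influences",
)


@dataclass
class CausalFactor:
    description: str
    level: str  # must be one of HFACS_LEVELS

    def __post_init__(self) -> None:
        if self.level not in HFACS_LEVELS:
            raise ValueError(f"unknown HFACS level: {self.level!r}")


# Example: coding one active failure and one latent failure from an accident.
factors = [
    CausalFactor("aircrew misread checklist item", "unsafe acts of operators"),
    CausalFactor("chronic understaffing of maintenance", "organizational influences"),
]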
Siegel, A. I., Bartter, W. D., Wolf, J. J., Knee, H. E., & Haas, P. M. (1984). Maintenance
personnel performance simulation (MAPPS) model: Summary description (NUREG/CR-
3626). Oak Ridge, TN: Oak Ridge National Laboratory.
This report describes a human performance computer simulation model developed for the
nuclear power maintenance context. The model addresses variables such as the workplace, the
maintenance technician, motivation, human factors, and task orientation. Information is provided
about human performance reliability pertinent to probabilistic risk assessment, regulatory
decisions, and maintenance personnel requirements. The technique allows assessment of which
tasks maintenance technicians may perform in a less than satisfactory manner and which
conditions, alone or in combination, contribute to or alleviate such performance.
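The report describes MAPPS only at a summary level; the sketch below illustrates the general
idea of stochastic task-performance estimation, that is, estimating how often a task is completed
unsatisfactorily under assumed performance-shaping conditions. It is not the MAPPS algorithm,
and every factor name and value in it is invented for illustration.

# Hypothetical sketch of stochastic task-performance estimation, NOT the MAPPS model.
# All condition names, penalties, and probabilities below are invented for illustration.
import random


def simulate_task(base_success_prob: float, condition_penalties: dict, trials: int = 10_000) -> float:
    """Return the estimated probability of unsatisfactory task performance."""
    # Each adverse condition (e.g., fatigue, time pressure) reduces the success probability.
    effective_prob = base_success_prob
    for penalty in condition_penalties.values():
        effective_prob *= (1.0 - penalty)

    failures = sum(1 for _ in range(trials) if random.random() > effective_prob)
    return failures / trials


# Example: a task with a 95% baseline success rate, degraded by fatigue and time pressure.
p_fail = simulate_task(0.95, {"fatigue": 0.05, "time_pressure": 0.03})
print(f"Estimated probability of unsatisfactory performance: {p_fail:.3f}")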
Silverman, B. G. (1992). Critiquing human error: A knowledge based human-computer
collaboration approach. San Diego, CA: Academic Press.
A model of human error is examined that is useful for constructing a critic system in
artificial intelligence. The model is rooted in the psychological study of expert performance. The
adapted model of human error consists of three levels. The outermost layer provides a method to
account for external manifestations of errors in human behavior. These occur as cues, such as
knowledge rules, models, or touchstones, that must be followed to reach a correct task
outcome. The middle layer determines what causes lead to the erroneous behaviors identified in