Date: 2010-05-19 08:33  Source: 蓝天飞行翻译  Author: admin
Alkov, R. A. (1997). Human error. In Aviation safety: The human factor (pp. 75-87).
Casper, WY: Endeavor Books.
This chapter argues that although much is known about what causes errors, no system can be error-free: errors will eventually occur. Three factors must be considered when studying human error.
(1) Identical types of error can have fundamentally different causes.
(2) Anyone is capable of making errors, regardless of experience level, proficiency, maturity,
and motivation.
(3) Outcomes of similar errors can be different.
Errors classified as design-induced can be random, systematic, or sporadic. Other classifications include errors of omission, commission, and substitution, and reversible versus irreversible errors. The chapter goes on to describe what a model of human error should do: it should predict errors, take data input into account, account for cognitive processes, and examine the actions of individuals to determine what kind of error behavior occurred. Three error taxonomies are also discussed. The first simply describes what happened. The second groups errors according to the underlying cognitive mechanism that causes them. The third classifies errors according to human
biases or tendencies. The slips, lapses, mistakes paradigm of error is then examined within these
taxonomies. Errors, which are unintended, are contrasted to violations, which are usually
deliberate. The author also examines intentional violations performed by operators. The
decision to perform a violation is shaped by three interrelated factors. These factors are attitudes
to behavior, subjective norms, and perceived behavioral control. The role of latent failures versus
active failures is discussed. Latent failures are consequences of human actions or decisions that
take a long time to reveal themselves. Active failures have almost immediate negative outcomes.
Finally, local versus organizational factors are stressed as being important. Local factors refer to
the immediate workplace whereas organizational factors refer to those that occur outside of the
immediate workplace.
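The slips/lapses/mistakes paradigm and its contrast with deliberate violations can be sketched as a toy classifier. This is an illustrative sketch only, not from the chapter; the decision rules and parameter names are simplified assumptions:

```python
from enum import Enum
from typing import Optional

class UnsafeAct(Enum):
    SLIP = "slip"            # correct plan, faulty execution
    LAPSE = "lapse"          # correct plan, memory failure
    MISTAKE = "mistake"      # the plan (intention) itself was wrong
    VIOLATION = "violation"  # deliberate deviation from rules

def classify(deliberate: bool, plan_correct: bool,
             recalled_ok: bool, executed_ok: bool) -> Optional[UnsafeAct]:
    """Classify an unsafe act: errors are unintended, violations deliberate."""
    if deliberate:
        return UnsafeAct.VIOLATION   # intended deviation, so not an error
    if not plan_correct:
        return UnsafeAct.MISTAKE     # planning failure
    if not recalled_ok:
        return UnsafeAct.LAPSE       # memory failure
    if not executed_ok:
        return UnsafeAct.SLIP        # execution failure
    return None                      # no unsafe act occurred
```

For example, a checklist step skipped because it was forgotten classifies as a lapse, while the same step skipped knowingly is a violation, which illustrates point (1) above: identical types of error can have fundamentally different causes.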
Amendola, A. (1990). The DYLAM approach to systems safety analysis. In A. G. Colombo
& A. S. De Bustamante (Eds.), Systems reliability assessment (pp. 159-251). The
Netherlands: Kluwer Academic Publishers.
DYLAM (Dynamic Logical Analytical Methodology) is described and analyzed. DYLAM was created to address the inability of event trees to adequately account for dynamic processes interacting with system states. It is especially useful for developing stochastic models of dynamic systems; beyond systems safety assessment, such models are a powerful aid in the design of protection and decision-support systems that assist operators in the control of hazardous processes. The method differs from other techniques in its ability to combine process simulation and component reliability performance in a unified procedure. It uses heuristic bottom-up procedures that identify the event sequences leading to undesired conditions, and it can account for changes in system structure due to control logic and to random events.
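The bottom-up identification of event sequences can be illustrated with a toy dynamic event tree in the spirit of DYLAM. This is a sketch under invented assumptions — a single tank whose relief valve may fail at each time step — not the actual DYLAM procedure:

```python
def explore(level, valve_ok, t, history, results,
            prob=1.0, steps=4, fail_p=0.1, overflow=2.5):
    """Depth-first exploration of a toy dynamic event tree: at each time
    step the relief valve may fail, and the process state (tank level)
    evolves under the current component configuration."""
    if level >= overflow:                 # undesired condition reached
        results.append((tuple(history), round(prob, 4)))
        return
    if t == steps:                        # end of mission time, no overflow
        return
    inflow, outflow = 1.0, (1.0 if valve_ok else 0.0)
    if valve_ok:
        # Branch 1: valve keeps working through this step.
        explore(level + inflow - outflow, True, t + 1,
                history + [f"t{t}:ok"], results,
                prob * (1 - fail_p), steps, fail_p, overflow)
        # Branch 2: valve fails during this step (no outflow).
        explore(level + inflow, False, t + 1,
                history + [f"t{t}:valve_fails"], results,
                prob * fail_p, steps, fail_p, overflow)
    else:
        # Valve already failed: the level can only rise.
        explore(level + inflow, False, t + 1,
                history + [f"t{t}:failed"], results,
                prob, steps, fail_p, overflow)

results = []
explore(0.0, True, 0, [], results)
# Each entry pairs an event sequence with its probability; only the
# sequences that reach the undesired condition (overflow) are kept.
```

The key difference from a static event tree is visible here: the branching (valve failure) and the process dynamics (tank level) evolve together, so whether a failure leads to the undesired condition depends on *when* it occurs in the sequence.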
Baron, S., Feehrer, C., Muralidharan, R., Pew, R., & Horwitz, P. (1982). An approach to
modeling supervisory control of a nuclear power plant (NUREG/CR-2988). Oak Ridge,
TN: Oak Ridge National Laboratory.
The purpose of this report is to determine the feasibility of applying a supervisory control
modeling technology to the study of critical operator-machine problems in the operation of a
nuclear power plant. A conceptual model is formed that incorporates the major elements of the
operator and of the plant to be controlled. The supervisory control modeling framework is
essentially a top-down, closed-loop simulation approach to supervisory control that provides for
the incorporation of discrete tasks and procedurally based activities.
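The closed-loop character of such a framework can be sketched as a minimal monitoring-and-intervention loop. This is an illustrative toy, not the NUREG/CR-2988 model; the plant dynamics, thresholds, and corrective action are invented for the sketch:

```python
def run_closed_loop(steps=20, setpoint=50.0, deadband=5.0):
    """Closed-loop supervisory control: the plant drifts continuously,
    while the operator monitors it and issues discrete, procedure-based
    corrective actions when the state leaves the acceptable band."""
    temperature = 60.0            # plant state, drifting upward
    log = []                      # record of discrete operator actions
    for t in range(steps):
        temperature += 1.0        # continuous plant dynamics: slow heat-up
        if temperature > setpoint + deadband:
            temperature -= 10.0   # discrete procedural task: reduce power
            log.append((t, "reduce_power"))
    return temperature, log

final_temp, actions = run_closed_loop()
```

The interleaving of continuous plant dynamics with discrete, procedurally triggered operator tasks is the essential structure that a top-down, closed-loop simulation approach to supervisory control has to provide for.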
Barriere, M. T., Ramey-Smith, A., & Parry, G. W. (1996). An improved HRA process for
use in PRAs. Probabilistic Safety Assessment and Management ’96 (pp. 132-137). New
York, NY: Springer.
A summary is given of ATHEANA (A Technique for Human Error Analysis), an analytical process for performing a human reliability analysis in the context of probabilistic risk assessment. ATHEANA is based on an understanding of why human-system interaction failures occur, rather than on behavioral and phenomenological descriptions of operator responses.
Benner, L., Jr. (1975). Accident investigations: Multilinear events sequencing methods.
This article: 人为因素分析综述(2) (A Review of Human Factors Analysis, Part 2)