rut. After all, implementing intervention programs that
improve safety is the ultimate goal, and one should not
confuse data collection with safety interventions! Left
without useful tools for systematically studying the
proverbial “boat load” of human error data, the only option
available to safety managers is to bypass the next two highly
critical steps in the system safety process. As illustrated in
Figure 2, the steps most often skipped are the identification
and assessment of hazards.
[Figure 2 diagram: the safety management process (Hazard Identification → Hazard Assessment → Identify Interventions → Intervention Assessment → Intervention Implementation → Monitor Data, sardonically labeled "Pretend to Monitor") set against the limited human factors tools and techniques available at each step: a "data bucket" filled from accidents, incidents, safety reports, etc.; poor tools to code, archive, and analyze error data, so key steps are skipped; no tools or techniques, so interventions are based on "gut feelings" or are personality driven; and no human factors data available to evaluate interventions.]
Figure 2. Lack of effective human factors tools results
in key steps in the safety process being skipped.
The fact is, there has been no way to determine the
common problems that exist throughout the entire system.
As a result, the modus operandi of many organizations has
been simply to "cherry pick" the hazards identified in the
most recent incident, or to focus on select "high profile"
events that have captured everyone's attention. These,
however, may have no real bearing on the most threatening
systemic hazards lying dormant in the system (and most
likely in your database as well). In effect, the forest is
missed for the trees.
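As a rough illustration (not drawn from any actual database), consider the following Python sketch: once every incident has been coded with error categories, a simple frequency count across the entire database reveals which problems are systemic rather than merely recent. All record IDs and category names below are hypothetical.

```python
from collections import Counter

# Hypothetical incident database: each record carries the human-error
# categories assigned when the event was coded (names are illustrative).
incidents = [
    {"id": "2003-017", "codes": ["skill_based_error", "inadequate_supervision"]},
    {"id": "2003-042", "codes": ["decision_error", "crew_resource_mismanagement"]},
    {"id": "2004-003", "codes": ["skill_based_error", "organizational_process"]},
    {"id": "2004-011", "codes": ["skill_based_error", "adverse_mental_state"]},
]

# Tally categories across the WHOLE database, not just the latest
# high-profile event; recurring codes point to systemic hazards.
frequency = Counter(code for incident in incidents for code in incident["codes"])

for code, count in frequency.most_common():
    print(f"{code}: {count} of {len(incidents)} incidents")
```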
To make matters worse, there have been no effective tools
for systematically generating effective intervention
programs that target specific forms of human error.
Consequently, as also illustrated in Figure 2, this has forced
safety professionals to bypass the next key steps in the
safety management process – generating and evaluating
intervention strategies. Indeed, the typical process for
ginning up error prevention programs is to use simple
intuition, expert opinion, or “pop psychology.”
Unfortunately, such an approach often results in people
pushing their pet projects, or the person with the loudest
voice or highest authority having the last say on what gets
done. Hence, this entire engineering approach to human
error management, although conceivably a good idea, has
met with little success.
"Insanity is doing the same thing over and over
again and expecting a different outcome."
Albert Einstein

"Prescription without diagnosis is malpractice."
Socrates

"Intuition is no substitute for information."
Wiegmann & Shappell
Over the past several years, we have been working on ways
to improve the integration of human factors into the system
safety process. As both scientists and practitioners, we have
strived to develop methods for managing human error that
are scientifically derived, empirically tested, and proven in
the field. The result of our efforts has been the successful
development of a compendium of tools and techniques that
turn errors into information, information into knowledge,
and knowledge into effective error management solutions.
The Human Factors Analysis and Classification System
(HFACS®) is an empirically derived system-safety model
that effectively bridges the gap between human error theory
and the practice of applied human error analysis. A proven
safety management tool, HFACS facilitates the reliable
identification, classification, and analysis of human error in
complex, high-risk domains such as the aviation, healthcare,
and nuclear power industries. As illustrated in Figure 3, the
HFACS framework comprehensively addresses the myriad
of active and latent failures known to influence operator
performance.
[Figure 3 diagram: "Active and Latent Conditions." Latent conditions originate at the Organizational Factors, Unsafe Supervision, and Preconditions for Unsafe Acts levels; active conditions arise at the Unsafe Acts level; together with failed or absent defenses, they culminate in an accident.]
Figure 3. Model of latent and active failures.
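For readers who maintain their own incident databases, a minimal Python sketch of how the tiers in Figure 3 might be represented for coding purposes follows. The tier names match Figure 3; the helper function, data layout, and example causal factors are our own illustrative assumptions, not part of HFACS itself.

```python
from enum import Enum

# The four HFACS tiers from Figure 3, ordered from the most latent
# (organizational) conditions to the active failures nearest the accident.
class HFACSTier(Enum):
    ORGANIZATIONAL_FACTORS = 1          # latent
    UNSAFE_SUPERVISION = 2              # latent
    PRECONDITIONS_FOR_UNSAFE_ACTS = 3   # latent
    UNSAFE_ACTS = 4                     # active

def code_factor(description: str, tier: HFACSTier, category: str) -> dict:
    """Record one causal factor from a report against an HFACS tier/category."""
    return {"factor": description, "tier": tier.name, "category": category}

# Coding two causal factors from a single (hypothetical) incident:
coded_factors = [
    code_factor("Pilot misjudged flare height on landing",
                HFACSTier.UNSAFE_ACTS, "skill_based_error"),
    code_factor("No recurrent training scheduled for the crew",
                HFACSTier.UNSAFE_SUPERVISION, "inadequate_supervision"),
]

for factor in coded_factors:
    print(f"{factor['tier']}/{factor['category']}: {factor['factor']}")
```

Coding each causal factor at a specific tier and category in this manner is what makes the database-wide frequency analyses sketched earlier possible.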
 