being exactly alike. There are numerous instances in which minor changes in the methods
used to produce a component with the same or improved design characteristics as previous
items have instead caused failures and accidents. If an accident has occurred,
correction of the cause by a change in the design, material, code, procedures, or
production process may immediately nullify certain statistical data.
· Generalized probabilities do not serve well for specific, localized situations. In other
situations, data may be valid but only in special circumstances. Statistics derived
from military or commercial aviation sources may indicate that a specific number of
aircraft accidents due to bird strikes occur per 100,000 or per million flight hours.
On a broad basis involving all aircraft flight time, the probability of a bird strike is
comparatively low. However, at certain airports near coastal areas where birds
abound, the probability of a bird-strike accident is much higher (see the first sketch
following this list).
· Human error can have damaging effects even when equipment or system reliability has
not been lessened. A common example is the loaded rifle: it is highly reliable, yet
people have been killed or wounded while cleaning or carrying one.
· Probabilities are usually predicated on an infinite or large number of trials.
However, probabilities such as reliabilities for complex systems are of necessity based
upon very small samples and therefore have relatively low confidence levels (see the
second sketch following this list).
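The contrast between fleet-wide and localized rates in the bird-strike bullet above can be made concrete with a short sketch. Every figure in it is hypothetical, chosen only to show how an aggregate rate can mask a much higher exposure at a particular location; none of the numbers come from the handbook.

    # Hypothetical illustration (not from the handbook): an aggregate bird-strike
    # rate can hide a much higher local rate. Every figure below is made up.

    fleet_hours = 10_000_000      # total flight hours across an entire fleet
    fleet_strikes = 12            # bird-strike accidents observed fleet-wide

    coastal_hours = 50_000        # flight hours flown at one coastal airport
    coastal_strikes = 4           # bird-strike accidents at that airport

    fleet_rate = fleet_strikes / fleet_hours         # accidents per flight hour
    coastal_rate = coastal_strikes / coastal_hours   # accidents per flight hour

    print(f"Fleet-wide rate : {fleet_rate * 1e6:.1f} per million flight hours")
    print(f"Coastal airport : {coastal_rate * 1e6:.1f} per million flight hours")
    print(f"Local exposure is roughly {coastal_rate / fleet_rate:.0f} times the fleet average")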
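The point about small samples can also be quantified. The sketch below uses the standard zero-failure (Clopper-Pearson) demonstration bound, which is textbook statistics rather than anything prescribed by the handbook; it simply shows how few failure-free trials translate into only a weak statistical claim about reliability.

    # A minimal sketch of why small samples yield low-confidence reliability
    # claims. With n failure-free trials, the one-sided lower confidence bound
    # on reliability (the zero-failure Clopper-Pearson case) is alpha ** (1 / n),
    # where alpha = 1 - confidence.

    def reliability_lower_bound(n_trials: int, confidence: float = 0.95) -> float:
        """Reliability demonstrated, at the given confidence, by n failure-free trials."""
        alpha = 1.0 - confidence
        return alpha ** (1.0 / n_trials)

    for n in (3, 10, 30, 300, 3000):
        print(f"{n:5d} failure-free trials -> reliability >= "
              f"{reliability_lower_bound(n):.3f} at 95% confidence")

Because complex systems rarely see more than a handful of full-scale trials, the reliability that can actually be demonstrated this way is modest, which is why the reliabilities quoted for such systems carry relatively low confidence levels.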
7.2.6 Human in the Loop10
Fortunately, humans usually try to acclimate themselves to automation prior to its use. Depending on the
complexity of the system, acclimation will require resources, time, experience, training, and knowledge.
Automation has become so complex that acclimation has become an “integration-by-committee” activity.
Specialists are needed in operations, systems engineering, human factors, system design, training,
maintainability, reliability, quality, automation, electronics, software, network communication, avionics,
and hardware. Detailed instruction manuals, written in appropriate language and usually containing
cautions and warnings, are required. Simulation training may also be required.
9 Ibid., Hammer, pages 91 and 92.
10 Allocco, Michael, Automation, System Risks and System Accidents, 18th International System Safety Society Conference.
The interaction of the human and machine, if inappropriate, can also introduce additional risks. The human
can become overloaded and stressed due to inappropriately displayed data, an inappropriate control input, or
a similar erroneous interface. The operator may not fully understand the automation because of its complexity.
It may not be possible to understand a particular system state. The human may not be able to determine
whether the system is operating properly or whether malfunctions have occurred.
Imagine relying on an automated system when, because of a malfunction or inappropriate function, artificial
indications are displayed and the system communicates inappropriately. In this case the human may
react to an artificial situation. The condition can be compounded during an emergency, and the end result
can be catastrophic. Consider an automated reality that presents an artificial world and a human who reacts
to that environment. Should we trust what the machines tell us in all cases?
The integration parameters concerning acclimation further complicate the picture when evaluating
contingency, backup, damage control, or loss control. It is not easy to determine the system state; when
something goes wrong, reality can become artificial and trust in the system can be questioned.
Determining what broke can be a major problem. When automation fails, the system can seem to have a mind of
its own. The human may be forced to take back control of the malfunctioning system, and accomplishing such
a contingency may require the system committee. These sorts of contingencies can be addressed within
appropriate system safety analyses.
7.2.7 Software as a Risk Control
Software reliability is the probability that software will perform its assigned function under specified
conditions for a given period of time11. The following axioms are offered for consideration by the system
safety specialist:
· Software does not degrade over time.
· Since software executes its program as written, it does not fail.
· Testing of software is not an all-inclusive answer to solve all potential software-related
risks.
· Software will not get better over time.
· Software can be very complex.
· Systems can be very complex.
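One common way to make the definition of software reliability above concrete, though not one prescribed by the handbook, is the input-domain view: because software does not wear out, its reliability over a period of time reduces to the probability that the inputs encountered during that period do not trigger a latent fault. The sketch below assumes a hypothetical process() function and a made-up operational profile.

    import random

    # Minimal sketch (not from the handbook): software reliability estimated as
    # the fraction of operational-profile inputs processed correctly. The
    # function name process() and the input profile are hypothetical.

    def process(x: float) -> float:
        # Deterministic code: it executes its program as written on every run.
        # A latent fault is triggered only by certain inputs (here, x < 0).
        if x < 0:
            raise ValueError("unhandled negative input")   # the latent fault
        return x ** 0.5

    def estimate_reliability(n_runs: int = 10_000) -> float:
        """Run the software against a sampled operational profile and count successes."""
        successes = 0
        for _ in range(n_runs):
            x = random.gauss(5.0, 2.0)   # hypothetical operational profile
            try:
                process(x)
                successes += 1
            except ValueError:
                pass
        return successes / n_runs

    print(f"Estimated reliability over this profile: {estimate_reliability():.4f}")

The estimate depends entirely on the assumed input profile rather than on calendar time, which is consistent with the axioms that software neither degrades nor improves on its own, and it illustrates why testing against one profile is not an all-inclusive answer to potential software-related risks.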