4.2 Sources of Evidence for Requirements Satisfaction
4.2.1 Evidence to support an argument that a software safety requirement has been met may be obtained from one or more of the following main sources:
a) testing of the object code;
b) field service experience of an identical, or sufficiently similar, system;
c) analysis of an appropriate level of design.
Which of the three main sources of evidence is most appropriate will vary according
to the attribute concerned and the required AEL, as indicated in Part 3 Section 6.
NOTE 1: In this context, source code is considered to be an aspect of design.
NOTE 2: Analysis can include evidence of the effective use of appropriate processes and techniques.
NOTE 3: For the other assurance objectives (Configuration consistency, Requirements traceability and Requirements validity), analytic evidence is expected.
4.2.2 The forms of evidence available for each attribute and each source of evidence are listed in Table 1.
NOTE: Table 1 is not exhaustive; other forms of evidence that support a claim that a given attribute is satisfied may be offered.
Table 1 Forms of Evidence: Satisfaction of Safety Requirements
Software attribute | Test evidence | Field experience | Analytic evidence
Functional properties | Functional testing | Analysis of known faults in a product | Formal proof of logical behaviour
Timing properties | Response time tests; maximum throughput tests | Analysis of known faults in a product | Worst case timing analysis; performance modelling
Robustness | Fault injection testing (internal and I/O); power failure and equipment failure tests | Evidence from incident reports on effectiveness of fault tolerance measures | Design evidence that internal and external failures can be detected, and appropriate action taken
Reliability | Reliability testing (using expected operational profile); evidence of high test coverage | Field reliability measurements (for a similar operational profile); estimates based on residual faults and operating time (N/T) | Evidence of a low probability of residual faults (from analysis of the process and the product), e.g. static analysis, compliance analysis, complexity metrics, inspection, quality of support tools, fault density in similar projects
Accuracy | Measuring error for known test cases | Analysis of known faults in a product | Numerical analysis; algorithm stability analysis
Resource usage | Worst case load tests (disc, memory, input/output, communications, processor) | Resource usage monitoring data from similar applications | Design evidence of static assignment of resources at start-up; worst case resource analysis
Overload tolerance | Excess load tests | Analysis of known faults in a product | Design evidence that system will degrade gracefully in overload conditions
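The N/T entry in the Reliability row can be read as a crude failure-rate estimate: N residual faults assumed to remain in the product, over T hours of representative operating time. As an informal illustration only (this sketch and its figures are not part of the standard, and the function name and numbers are hypothetical):

```python
def nt_failure_rate(residual_faults: int, operating_hours: float) -> float:
    """Crude N/T estimate: expected failures per operating hour.

    Assumes each residual fault eventually manifests as a failure and
    that the operating time reflects the intended operational profile.
    """
    if operating_hours <= 0:
        raise ValueError("operating time must be positive")
    return residual_faults / operating_hours

# Hypothetical figures: 3 estimated residual faults over 60,000 fleet hours.
rate = nt_failure_rate(3, 60_000)
print(f"estimated failure rate: {rate:.2e} per hour")  # 5.00e-05 per hour
```

Such an estimate is only as strong as its inputs; in practice it would be backed by the analytic evidence listed in the same row (static analysis, inspection, fault density in similar projects).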
4.3 Rigour of Evidence
The rigour (depth and strength) of the evidence gathered (both direct and backing evidence) also increases with AEL. This is reflected in the requirements in Section 6.1, which show the evidence to be produced from each source of evidence at each AEL.
The requirements in each table are cumulative, i.e. at a given AEL, its requirements together with all requirements for lower AELs should be complied with.
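The cumulative reading amounts to a set union over levels. A minimal sketch of that interpretation, with entirely hypothetical AEL numbers and requirement names (the real lists are those in Section 6.1):

```python
# Hypothetical requirement sets per AEL; placeholders, not the standard's lists.
AEL_REQUIREMENTS = {
    1: {"functional testing"},
    2: {"high test coverage"},
    3: {"static analysis"},
    4: {"formal proof"},
}

def applicable_requirements(ael: int) -> set[str]:
    """Cumulative reading: a given AEL inherits every lower AEL's requirements."""
    return set().union(*(reqs for level, reqs in AEL_REQUIREMENTS.items()
                         if level <= ael))

print(sorted(applicable_requirements(3)))
# ['functional testing', 'high test coverage', 'static analysis']
```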
5 Safety Cases
Insofar as SW01 deals only with the approval of software, demonstration of the satisfaction of the requirements herein may be used in support of a system safety case.