AEL  Rigour
1    Statement - field service records support claims
     Statement - SW is relevant to field service claims
     Statement - operational environment is relevant to field service claims
     Statement - field service records are complete and correct
2    Field service records
     DRACAS procedure
     Report - analysis of tool and procedure errors
3    Report - analysis of field service claims
     Report - analysis of similarity of SW / justification for differences
     Report - analysis of similarity of operating environment / justification for differences
     Report - verification of use of DRACAS and supporting tools
4    Assessment of analysis, justification and verification by an independent department
5    Assessment of analysis, justification and verification by an independent organisation
AEL  Rigour
1    Statement - selection of best practice guidance/standards/notations/techniques/tools
     Statement - analysis shows criteria are met for all attributes / justification for failure to meet criteria
     Statement - verification and validation of tools
2    Report - analytic criteria, including use of formal metrics for criteria coverage
     Results of analysis
     Report - verification of use of guidance/standards/notations/techniques/tools
     Project-specific development process developed and justified
     Staff competency rules and justification
     Report - analysis of tool errors
3    Report - verification of criteria
     Report - assessment of results
     Report - assessment of development process (all practicable measures have been taken to ensure the product is free of errors)
     Report - adequacy of criteria (including justification for coverage)
     Report - verification of use of project-specific development process
     Report - verification and validation of tools
     Report - verification of staff competency
4    Assessments performed by an independent department
5    Assessments performed by an independent organisation
NOTE 1: The above items are cumulative; all items for lower AELs should be included with the items for higher AELs.
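NOTE 1's cumulative rule can be sketched as a small lookup. This is only an illustration: the item names below are abbreviated placeholders, and the real items are those listed in full in the tables above.

```python
# Sketch of the cumulative-evidence rule in NOTE 1.
# The item strings are abbreviated stand-ins for the table entries above.
FIELD_SERVICE_EVIDENCE = {
    1: ["Statement - field service records support claims",
        "Statement - SW is relevant to field service claims"],
    2: ["Field service records", "DRACAS procedure"],
    3: ["Report - analysis of field service claims"],
    4: ["Assessment by an independent department"],
    5: ["Assessment by an independent organisation"],
}

def evidence_for(ael: int) -> list[str]:
    """Return every evidence item required at a given AEL.

    Because the rows are cumulative, an AEL n case must include the
    items for every level up to and including n.
    """
    return [item
            for level in range(1, ael + 1)
            for item in FIELD_SERVICE_EVIDENCE[level]]

# An AEL 3 case must supply the AEL 1, 2 and 3 items together:
print(len(evidence_for(3)))  # → 5
```

The same accumulation applies to both tables; only the item lists differ.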
NOTE 2: Standards and regulations often concentrate on when a technique should be applied, deciding that above a certain criticality technique A is required and below it is not. In this guidance the emphasis is on the rigour and extent of the activity, not on whether it should be done at all.
For example, it is quite obvious that all systems should be tested; what varies is the extent of the tests, their independence, and the visibility of the associated test cases and results. At low AELs, a statement from a competent organisation that test criteria have been defined according to some systematic best practice is sufficient. At higher AELs, the test criteria should be justified and documented, with additional reports provided.
The tables above and in section 7.4 capture how the rigour of evidence might vary with AEL. However, the overriding factor is the demands of the argument being made and what is necessary to provide a convincing case. The tables therefore combine a number of different factors. The role of 'testing' within the overall argument changes as other arguments (e.g. analytical ones) take a more prominent role (section 7.4). There are also variations in the strength of the testing argument (e.g. provision of independent oversight), as well as changes to the details of the arguments made in the tables above (e.g. test criteria are adequate because a certain type of coverage is desired and is being measured). These factors can interact in a number of ways, and it is the overarching need for a convincing and valid argument that should ultimately drive the rigour of the evidence provided.
CAP 670 Air Traffic Services Safety Requirements