are needed to prevent or even detect the emerging
likelihood of risks relating to untrustworthiness. For
starters, prevention and remediation of the risks must encompass
better understanding of the full range of requirements,
as well as better system architectures explicitly addressing
those trustworthiness requirements with appropriate
assurance, usability, operation, and maintainability.
Greater attention needs to be devoted to the software
engineering disciplines for implementing trustworthy applications,
either based on underlying trustworthy infrastructures
such as secure operating systems and networking,
or alternatively able to architecturally surmount some
untrustworthiness in subsystems. (Twenty-two paradigmatic
approaches to building trustworthy systems out of
less trustworthy components are examined in Section 3.5
of [14]. However, beware of compromises resulting from
coordinated attacks or accidental misbehavior originating
inside the implementations or in underlying layers of
abstraction.) In addition, these concepts must be inculcated
into the practice of developers and operational staff
through enlightened education and training. Also valuable
would be a stronger sense of corporate altruism and perhaps
some intelligent government action. (However, with
regard to legislation and regulation, be very careful what
you ask for; you might get it, or — perhaps more likely
— a badly distorted version of it.)
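One classic pattern for building trustworthy systems out of less trustworthy components is modular redundancy with majority voting. The sketch below is only an illustration of that general idea (it is not drawn verbatim from [14]), and the replica functions are hypothetical; note that it embodies exactly the caveat above, since a coordinated compromise of a majority of replicas defeats the vote.

```python
from collections import Counter

def majority_vote(replicas, query):
    """Query independent, individually unreliable replicas and
    return the answer that a strict majority agrees on.

    Illustrative sketch only: the guarantee rests on the replicas
    failing (or being compromised) independently, which coordinated
    attacks or shared underlying flaws can undermine.
    """
    answers = [replica(query) for replica in replicas]
    answer, count = Counter(answers).most_common(1)[0]
    if count > len(answers) // 2:
        return answer
    raise RuntimeError("no majority agreement among replicas")

# Hypothetical replicas: two behave correctly, one is faulty.
good_a = lambda q: q * 2
good_b = lambda q: q * 2
faulty = lambda q: q + 1
print(majority_vote([good_a, good_b, faulty], 21))  # prints 42
```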
Above all, trustworthiness demands a pervasive sense
of systems in their entirety, considering the long-term
risks in the global context of all relevant applications relative
to the totality of all relevant requirements. The preceding
four sections illustrate four types of systems in
which greater trustworthiness is urgently needed: survivable
backup systems, robust networking, safe systems,
and secure total systems (such as networked operating
systems together with all their networked applications).
In each of these and many other application areas, the
potentials for untrustworthiness must be considered with
respect to the environments in which those systems operate.
The desired trustworthiness properties are mostly
emergent properties of the entire system, rather than isolated
properties of subsystems. Nevertheless, a huge step
forward in avoiding or circumventing untrustworthiness
would result if the emergent properties of application systems
as a whole could be systematically derived or otherwise
inferred from the composable properties of the subsystems,
and so on iteratively into lower layers of abstraction.
For example, see the 1977 Robinson–Levitt paper
[19] and its application to the Provably Secure System
design [3, 16, 17], and Neumann’s report on predictable
composability [14].
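As a deliberately simplified illustration of deriving an emergent property of a whole system from composable properties of its subsystems (a toy calculation, not the Robinson–Levitt derivation itself), consider availability, where series and parallel compositions have well-known closed forms; the system topology and the numbers below are assumptions for the example.

```python
def series_availability(avails):
    """All subsystems are required, so availabilities multiply."""
    result = 1.0
    for a in avails:
        result *= a
    return result

def parallel_availability(avails):
    """Any one replica suffices, so unavailabilities multiply."""
    unavail = 1.0
    for a in avails:
        unavail *= (1.0 - a)
    return 1.0 - unavail

# Hypothetical system: two replicated servers (0.9 availability
# each) in series with a network link (0.99 availability).
servers = parallel_availability([0.9, 0.9])    # 1 - 0.1*0.1 = 0.99
system = series_availability([servers, 0.99])  # 0.99 * 0.99 = 0.9801
print(round(system, 4))  # prints 0.9801
```

The point of the example is the direction of inference: the emergent, system-level number is computed from per-subsystem properties plus the composition structure, which is what a systematic derivation down through layers of abstraction would generalize.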
Life-critical systems and supposedly secure systems
should of course be held to higher standards than conventional
software, although the applicable criteria and evaluations tend
not to be very rigorous. As one rather sad example, today's
electronic voting systems (e.g., [11, 21]) are held to much
weaker standards than gambling machines!
Myopia is very dangerous with respect to trustworthiness.
The commonalities among the different application
areas considered here are likely to transcend would-be
single-discipline solutions. Thus, a massive culture
shift is needed to proactively develop and compositionally
evaluate systems in their entirety and to assure their
operational configurations and usability. This is of course
especially important for applications with critical requirements
for trustworthiness. The pleas for such approaches
in many Classic Papers are old but nevertheless still
timely.
Acknowledgments
The author thanks Douglas Maughan, who sponsored
reference [14] when he was a Program Manager in the
Defense Advanced Research Projects Agency (DARPA).
Many of the concepts discussed there on how to develop
composable high-assurance trustworthy systems and networks
are relevant to avoiding risks such as those discussed
here. This paper was prepared in part under National
Science Foundation Grant Number 0524111.
References
[1] R.P. Abbott et al. Security analysis and enhancements
of computer operating systems. Technical report,
National Bureau of Standards, 1974. Order No.
S-413558-74.
[2] K. Borg. Re: LA power outages. ACM
Risks Forum, 24(39), 22 August 2006.
http://catless.ncl.ac.uk/Risks/24.39.html#subj8.
[3] R.J. Feiertag and P.G. Neumann. The foundations
of a Provably Secure Operating System
(PSOS). In Proceedings of the National Computer