Sunday, 7 August 2011

Critical Systems Specification

In September 1993, a plane landed at Warsaw airport in Poland during a thunderstorm.
For nine seconds after landing, the brakes on the computer-controlled braking
system did not work. The plane ran off the end of the runway, hit an earth bank
and caught fire. The subsequent enquiry showed that the braking system software
had worked perfectly according to its specification. However, for reasons I won't
go into here, the braking system did not recognise that the plane had landed. A safety
feature on the aircraft had stopped the deployment of the braking system because
this can be dangerous if the plane is in the air. The system failure was caused by
an error in the system specification.
This illustrates the importance of specification for critical systems. Because of
the high potential costs of system failure, it is important to ensure that the specification
for critical systems accurately reflects the real needs of users of the system.
If you don't get the specification right, then, irrespective of the quality of the software
development, the system will not be dependable.
The need for dependability in critical systems generates both functional and non-functional
system requirements:
1. System functional requirements may be generated to define error checking and
recovery facilities and features that provide protection against system failures.
2. Non-functional requirements may be generated to define the required reliability
and availability of the system.
In addition to these requirements, safety and security considerations can generate
a further type of requirement that is difficult to classify as either a functional or a non-functional
requirement. They are high-level requirements that are perhaps best described
as 'shall not' requirements. By contrast with normal functional requirements that
define what the system shall do, 'shall not' requirements define system behaviour
that is unacceptable. Examples of 'shall not' requirements are:
The system shall not allow users to modify access permissions on any files that
they have not created. (security)
The system shall not allow reverse thrust mode to be selected when the aircraft
is in flight. (safety)
The system shall not allow the simultaneous activation of more than three alarm
signals. (safety)
These 'shall not' requirements are sometimes decomposed into more specific software
functional requirements. Alternatively, implementation decisions may be
deferred until the system is designed.
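As a minimal sketch of such a decomposition (not from the original text; the function and state names here are invented for illustration), the reverse-thrust 'shall not' requirement above might be refined into a concrete software guard:

```python
from dataclasses import dataclass

@dataclass
class AircraftState:
    """Hypothetical snapshot of the sensor inputs the guard depends on."""
    weight_on_wheels: bool   # True when the landing gear is compressed
    wheel_speed_kts: float   # ground speed reported by the wheel sensors

def reverse_thrust_permitted(state: AircraftState) -> bool:
    """Guard derived from the 'shall not' requirement: reverse thrust
    must never be selectable while the aircraft is in flight.
    'On the ground' is (hypothetically) defined here as weight on wheels
    plus a minimum wheel speed."""
    return state.weight_on_wheels and state.wheel_speed_kts > 72.0

def select_reverse_thrust(state: AircraftState) -> bool:
    """Refuse the pilot's command whenever the guard does not hold."""
    if not reverse_thrust_permitted(state):
        return False
    # ... engage reverse thrust ...
    return True
```

Note that the choice of predicate is itself a specification decision: the Warsaw accident described above arose precisely because the conditions the braking system used to decide 'the plane has landed' were not satisfied during that landing.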
The user requirements for critical systems will always be specified using natural
language and system models. However, as I discuss in Chapter 10, formal specification
and associated verification are most likely to be cost-effective in critical systems
development (Hall, 1996; Hall and Chapman, 2002; Wordsworth, 1996). Formal
specifications are not just a basis for verification of the design and implementation.
They are the most precise way of specifying systems, so they reduce the scope for
misunderstanding. Furthermore, constructing a formal specification forces a detailed
analysis of the requirements, which is an effective way of discovering problems in
the specification. In a natural language specification, errors can be concealed by the
imprecision of the language. This is not the case if the system is formally specified.
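One practical benefit of this precision is that a formally stated property can be checked mechanically. As a minimal sketch (not from the original text; all names are invented), the alarm 'shall not' requirement above can be expressed as a predicate over system states and checked against a recorded execution trace:

```python
def satisfies_alarm_limit(trace, max_active=3):
    """Check the 'shall not' property: at no observed state may more than
    `max_active` alarm signals be active simultaneously.

    `trace` is a sequence of sets, each set holding the identifiers of the
    alarms active at that point in the execution."""
    return all(len(active_alarms) <= max_active for active_alarms in trace)
```

For example, a trace in which at most three alarms are ever active together satisfies the property, while any state with four simultaneous alarms violates it. This is the same idea, in miniature, that underlies checking an implementation against a formal specification.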
