Guarding The Guardians

Ben Brosgol and Greg Gicca discuss ensuring the security of safety systems in avionics

A software failure in avionics can result in horrific tragedy, so how can the industry show that a safe software system is also secure – especially now that advanced networking and communications are increasing vulnerability to security breaches?

The safety standards for software used in commercial avionics systems are set out in a document called DO-178B, Software Considerations in Airborne Systems and Equipment Certification. The FAA uses DO-178B as a guide to determine whether software will perform reliably in an airborne environment; the document was developed jointly with EUROCAE, which publishes it as ED-12B.

If zero aircraft fatalities due to failure of DO-178B-certified code can be taken as the measure of success, then DO-178B has proved to be a successful guardian of safety since its inception in the early 1990s (admittedly, there have been some cases that were a bit close for comfort).

Obviously, technology has moved on considerably since DO-178B was written, and the industry recognised that it was in need of revision. An updated version, DO-178C, was published in late 2011. A remaining concern is that, as advanced networking and communications facilities become available, safety-critical systems might be reachable from external systems and equipment – for example through the Wi-Fi access that some airlines provide. Whether intentional or inadvertent, such potential security breaches are an obvious risk and could lead to loss of life.

For software systems to be safe, security must be taken into account from the very first stages of design. Security cannot be added to software systems as an afterthought. Security objectives should be addressed by adding security-related requirements to the software’s overall safety requirements – and by using a combination of sound processes and appropriate deployment platforms and development technologies.

Levels of Assurance

Neither DO-178B nor the new version, DO-178C, actually sets out specific security objectives. This in no way means that a DO-178B/C-certified system is insecure, however: what the standard provides is software assurance, through a set of tasks that meet defined objectives at different levels of rigour.

DO-178B categorises software failure in terms of its potential effect on safety ranging from the lowest (Level E – no effect) to the highest (Level A – catastrophic failure with loss of aircraft). For each level there is a corresponding set of objectives that must be met. Different systems on the aircraft can be categorised at different levels.

Similar to the DO-178B/C software safety certification, the principal certification specification in software security is the Common Criteria standard, which catalogues and defines two sets of requirements:

  • Security Functional Requirements (SFRs): Services that perform security-related tasks
  • Security Assurance Requirements (SARs): Evaluation steps that check whether a product’s security objectives are met

Like the safety objectives of DO-178B, the SARs are grouped into Evaluation Assurance Levels (EALs), ranging from 1 (lowest) to 7 (highest) and achieving higher EALs takes additional effort.

Different application domains have different kinds of security requirements. In order to bring a degree of consistency to evaluating potential products, the Common Criteria defines the concept of a Protection Profile, which identifies:

  • The assets that need to be protected
  • The SFRs that need to be implemented
  • The SARs (i.e. the EAL) that must be met
  • The sophistication of the assumed operational environment/attacker

It does not necessarily follow that a product evaluated at a higher EAL is more secure than one evaluated at a lower EAL – that depends more on their respective Protection Profiles.

While DO-178B’s track record for software safety is good, the Common Criteria’s record for software security has not been so successful. It may be that the effort needed to show compliance with a Protection Profile, and therefore to achieve a given EAL, does not deliver the assurance about security that we need it to. Even so, the Common Criteria’s catalogue of SFRs and SARs can be extremely useful when working towards the security objectives of a safety-critical system. Based on a component’s functions and safety level, a software developer can determine which SFRs and SARs are relevant and add them as DO-178B requirements. Through such an analysis and selection, developers can achieve workable levels of assurance for both software safety and security.

Achieving safety AND security – deployment platforms

Modern software systems, with requirements for both safety and security, may need to have different components set at different safety levels and/or different security levels but still be able to operate jointly – and even communicate with each other – without interference and without jeopardising the safety/security of more critical components. In order to achieve these goals, operating system architectures use partitioning – namely ARINC-653 for safety and MILS for security.

ARINC-653 is an architecture that supports multiple applications running at potentially different safety levels. A small real-time core controls all time and space usage for a number of applications, each running in its own isolated partition and invoking services from the APplication EXecutive (APEX), for example to multithread within a partition. Each application is allocated a certain amount of processor time per cycle and a certain amount (and location) of memory. This method guarantees that the operation of any one application cannot adversely affect another, and simplifies the testing.

The Multiple Independent Levels of Security (MILS) operating system architecture is similar to ARINC-653, but for applications running at potentially different security levels. An additional consideration for MILS is the managing of inter-partition communication in a secure policy-based manner, for example, ensuring that an unclassified partition cannot read classified data.

Achieving safety AND security – development technologies

Achieving high levels of safety and security requires finding bugs and potential weaknesses early in the software life cycle. Typical static analysis tools arrive too late: by the time they run, the error is already in the code. A preferable approach is to use programming languages and associated tools that prevent the errors from being introduced in the first place. Languages such as Ada, with strong typing and extensive compile-time checking, can help. In a language such as C, for example, adding an integer to a pointer can easily result in a “buffer overrun” error, where data is written to a location outside the bounds of the intended target data structure. This error is prevented in Ada, since the compiler will reject a program that attempts such an operation.

For applications that must reach high levels of safety and/or security, guarantees backed by formal mathematical reasoning may be necessary. In these cases it is often appropriate and cost effective to use a language that supports proofs of correctness of developer-specified program properties. The SPARK language takes this approach. SPARK is a deterministic Ada subset augmented with a notation for expressing a program’s “contracts” – for example, the pre- and post-conditions and invariants of a subprogram, or a program unit’s data dependencies and information flows. Tools that complement the compiler apply proof techniques to verify the specified contracts. The resulting analysis can demonstrate, for example, that the program is free of run-time errors.

Combining safety and security

At AdaCore we understand that prevention is the best protection and that designing a safe software system means accounting for security right from the start – especially as safety standards do not explicitly address security issues. By using a combination of sound processes and appropriate technologies, software developers will be able to rise to the challenge.

Ben Brosgol is a member of the senior technical staff at AdaCore. Greg Gicca is an independent consultant. An earlier, extended version of this paper is available here.
