Innovations Syst Softw Eng (2009) 5:149–161
DOI 10.1007/s11334-009-0088-1

ORIGINAL PAPER

Certification of software for real-time safety-critical systems: state of the art

Andrew Kornecki · Janusz Zalewski

A. Kornecki
Embry-Riddle Aeronautical University, 600 S. Clyde Morris Blvd, Daytona Beach, FL 32114, USA
e-mail: [email protected]

J. Zalewski (B)
Florida Gulf Coast University, 10501 FGCU Blvd, Fort Myers, FL 33965, USA
e-mail: [email protected]

Received: 15 March 2009 / Accepted: 20 April 2009 / Published online: 2 June 2009
© Springer-Verlag London Limited 2009

Abstract This paper presents an overview of the role of certification in safety-critical computer systems, focusing on software, and partially hardware, used in the civil aviation domain. It discusses certification activities according to RTCA DO-178B “Software Considerations in Airborne Systems and Equipment Certification” and touches on tool qualification according to RTCA DO-254 “Design Assurance Guidance for Airborne Electronic Hardware.” Specifically, certification issues related to real-time operating systems and programming languages are reviewed, and the qualification processes for software development tools and complex electronic hardware tools are discussed. Results of an independent industry survey conducted by the authors are also presented.

Keywords Software certification · Software tools · Software safety · Tool qualification · Safety-critical systems · Real-time systems

1 Introduction

Certification is a hot issue in many industries that rely on computers and software in embedded systems that control safety-critical equipment. The term “certification” in software engineering is typically associated with three meanings: certifying a product, a process, or personnel. Product and process certification are the most challenging in developing software for real-time safety-critical systems, such as flight control and traffic control, road vehicles, railway interchanges, nuclear facilities, medical equipment and implanted devices, etc. These are systems that operate under strict timing requirements and may cause significant damage or loss of life if they do not operate properly. Society therefore has to protect itself, and governments and engineering organizations have established standards and guidelines for computer system developers to follow in designing safety-critical systems in several regulated industries, including aerospace, avionics, automotive, medical, nuclear, railways, and others.

Consequently, the U.S. government and international agencies that regulate the respective industries have issued a number of standards, guidelines, and reports related to certification and/or other aspects of software assurance, such as licensing, qualification, or validation, in their specific areas of interest. Two such guidance documents for civil aviation, DO-178B [1] and DO-254 [2], developed by RTCA Inc., describe the conditions for assurance in designing software and electronic hardware in airborne systems. The guidelines have been adopted by the U.S. Federal Aviation Administration (FAA) and the European EUROCAE as mandatory for the design and implementation of airborne systems.

In this paper we present an overview of current practices in civil aviation and related industries and discuss issues associated with certification of software as well as qualification of tools to meet the guidance requirements. In particular, Sect. 2 discusses the role of guidance in certification, and Sect. 3 reviews the certification issues according to DO-178B. Sections 4 and 5 discuss tool qualification for software and hardware development, respectively. Section 6 provides some conclusions.

2 The role of standards in certification

2.1 Standards in civil aviation

The RTCA Inc., previously known as the Radio Technical Commission for Aeronautics, is a non-profit organization formed to advance the art and science of aviation and aviation electronic systems for the benefit of the public. The main RTCA function is to act as a Federal Advisory Committee to develop consensus-based recommendations on aviation issues, which are used as the foundation for FAA Technical Standard Orders controlling the certification of aviation systems.

In 1980, the RTCA convened a special committee (SC-145) to establish guidelines for developing airborne systems and equipment. It produced a report, “Software Considerations in Airborne Systems and Equipment Certification,” which was subsequently approved by the RTCA Executive Committee and published in January 1982 as the RTCA document DO-178. After gaining further experience in airborne system certification, the RTCA decided to revise the document. Another committee (SC-152) drafted DO-178A, which was published in 1985. Due to rapid advances in technology, the RTCA established a new committee (SC-167) in 1989, with the goal of updating DO-178A as needed. SC-167 focused on five major areas: (1) Documentation Integration and Production, (2) System Issues, (3) Software Development, (4) Software Verification, and (5) Software Configuration Management and Software Quality Assurance. The resulting document, DO-178B, provides guidelines for these areas [1].

RTCA DO-254/EUROCAE ED-80 [2] was released in 2000, addressing design assurance for complex electronic hardware. The guidance is applicable to a wide range of hardware devices, ranging from integrated technology hybrid and multi-chip components, to custom programmable micro-coded components, to circuit board assemblies (CBA), to entire line replaceable units (LRU). This guidance also addresses the issue of commercial off-the-shelf (COTS) components. The document’s appendices provide guidance for data to be submitted for certification, including: independence and control data category based on the assigned assurance level, a description of the functional failure path analysis (FFPA) method applicable to hardware with design assurance levels (DAL) A and B, and a discussion of additional assurance techniques, such as formal methods to support and verify analysis results.

In air transportation, as pointed out by Kesseler [3], the situation for ground services differs from the approach taken for aircraft. While for the latter a single certification is performed, for services provided by ground systems a license is typically issued by a national regulator for a specific time period. Unfortunately, for air traffic management no internationally recognized standard exists. In SW 01, which is a part of the British CAP 670 standard [4], so-called assurance evidence levels (AEL) are defined to identify the type, depth, and strength of evidence to be provided to the assessor of the software development process. Thus, the developer may use practically any standard to provide this evidence. The standard uses the term “assessment” rather than certification and makes it process oriented. Other certification issues are also discussed by Kesseler in [5].

The implementation of communication, navigation, surveillance, and air traffic management (CNS/ATM) systems has resulted in increased interdependence of systems providing air traffic services (ATS) and systems onboard aircraft. CNS/ATM systems include ground, airborne, and space-based devices. Document RTCA DO-278 [6], resulting from deliberations of Special Committee 190 (SC-190), provides guidelines for the assurance of software contained in non-airborne CNS/ATM systems. The guidance applies to software contained in CNS/ATM systems used in ground or space-based applications shown by a system safety assessment process to affect the safety of aircraft occupants or airframe in its operational environment. Similar to DO-178B, the guidance is objective-based, where specific objectives must be met depending on the level of system criticality identified by the system safety assessment.

2.2 Aerospace industry

In the aerospace industry, respective standards have been published by NASA. The NASA Software Assurance Standard [7] refers, in a couple of paragraphs, to certification but does not define it. Instead, it defines software assurance, following the IEEE Glossary [8], as “The planned and systematic set of activities that ensure that software life cycle processes and products conform to requirements, standards, and procedures.”

The NASA Software Safety Standard [9] talks about certification and defines it as follows: “The process of formally verifying that a system, software subsystem, or computer program is capable of satisfying its specified requirements in an operational environment for a defined period of time. This includes any requirements for safing the system upon the occurrence of failures with potential safety impacts.” The standard then dedicates one page to a description of the certification process: “Safety-critical software is certified by ensuring it is produced in accordance with the requirements in this Standard. This may be a Center, program, project or facility specific certification process.”

The NASA Software Safety Guidebook [10] defines certification differently, adopting it from DO-178B [1]: “Legal recognition by the certification authority that a product, service, organization or person complies with the applicable requirements. Such certification comprises the activity of checking the product, service, organization or person and the formal recognition of compliance with the applicable requirements by issue of a certificate, license, approval or other document as required by national law or procedures. In particular, certification of a product involves: (a) the process of assuring the design of a product to ensure that it complies with a set of standards applicable to that type of product so as to demonstrate an acceptable level of safety; (b) the process of assessing an individual product to ensure that it conforms with the certified type design; (c) the issue of any certificate required by national laws to declare that compliance or conformity has been found with applicable standards in accordance with items (a) or (b) above.” In its body, however, the guidebook is vague and only descriptive rather than prescriptive regarding certification.

The NASA report [11] is probably the best available source describing certification in the aerospace industry. It defines certification according to DO-178B, and essentially adopts this standard, outlining it in detail. It then describes two certification processes internal to NASA, used at the Dryden Flight Research Center and the Jet Propulsion Laboratory. It redefines certification in a glossary, saying that it is a “process for demonstrating that system safety is satisfactory for flight operation”.

2.3 Military standards

2.3.1 Standards adopted in the U.S.

The numerous military standards on safety assurance take their own specific approaches to computer certification. The first known report concerning issues related to airborne software certification appeared in 1978 [12]. It defines certification as the formal administrative procedures established to substantiate that enough evidence has been obtained to state with near certainty that the performance of the acquired system and its attendant software will satisfy the user’s documented need. It defines a certification checklist composed of ten items, mostly non-technical, of which the most important is the following: “Have critical functions been subjected to an independent verification and validation and have all identified discrepancies been satisfactorily resolved?”

MIL-HDBK-516B [13] defines airworthiness certification as “A repeatable process implemented to verify that a specific air vehicle system can be, or has been, safely maintained and operated within its described flight envelope. The two necessary conditions for issuance and maintenance of an airworthiness certificate are (1) the aircraft must conform to its type design as documented on its type certificate, and (2) the aircraft must be in a condition for safe operation.” Regarding software safety, in Section 14.3 “Software Safety Program”, it lists the following three steps and refers to three other documents: MIL-STD-882D [14], the Joint Software System, and DO-178B [1]:

• Verify that a comprehensive software safety program is integrated into the overall system safety program

• Verify the software safety program requires that appropriate software safety-related analyses be performed as part of the software development process

• Verify that the design/modification software is evaluated to ensure controlled or monitored functions do not initiate hazardous events or mishaps in either the on or off (powered) state

MIL-STD-882D [14], in turn, is not at all specific to software or hardware certification. The Software System Safety Handbook [15] does not take a stand on certification either, referring to other standards, in particular DO-178B [1], DEF STAN 00-56 [16] and the Australian standard [17], and using the term assurance instead in its statement of purpose: “Provide management and engineering guidelines to achieve a reasonable level of assurance that software will execute within the system context with an acceptable level of safety risk.”

2.3.2 Selected standards in other countries

Regarding software-related military standards issued in other countries, DEF STAN 00-56 [16] Issue 4 does not refer to certification at all, while DEF(AUST) 5679 [17] refers vaguely to the role of a Certifier, as follows: “The procurement process may also involve a Certifier. The Certifier is an organization (such as the Australian Ordnance Council for munitions; the RAAF Directorate of Technical Airworthiness, etc.) that will assess safety and suitability for service of the System. The Certifier may impose specific requirements in addition to those of this Standard.” Its multiple applications since the time of publication exposed several areas for improvement. The approach taken in this regard is that the “principles and requirements [are] to be placed on developers, and are not prescriptive about the processes, methods and tools that should be used during system development” [18]. It requires that “an independent System Safety Evaluation be carried out to assess the technical validity of the Safety Case.” It is, however, “important to avoid the extreme situations where, on the one hand, System Safety Evaluators become part of the development process (and thus compromise independence) and, on the other hand, Evaluation is done only at the very end of development…” [18].

The Swedish Armed Forces Handbook for Software in Safety-Critical Applications [19] defines certification as follows: “Confirmation from an independent authority (third party) that a product, process or demonstration satisfies prescribed requirements, usually according to one or several specified and established standard.” It refers to certification of software tools: “Shall be certified by an independent authority against an adequate, official specification or standard.” It states additionally that “Certification is not a proof of correctness. Accepted tests and reviews, as well as widespread usage are factors contributing to an increased confidence that the software has been subjected to a qualified development”.

2.3.3 NATO report

The NATO Technical Report [5] includes an entire section on certification, which is worth a longer discussion. The section describes certification of embedded systems and gives the following definition: “Software certification is simply the process of generating a certificate that supports the claim that the software was either: (1) developed in a certain manner, (2) will exhibit some set of desirable run-time characteristics, or (3) has some other static characteristic embedded into it.” The authors ask and answer two key questions: (1) who performs the certification, and (2) what is being certified?

In answering the first question, who performs the certification, they mention the vendor, a client, or a third party, giving examples of third-party certifiers, such as Underwriters Laboratories (in electronics), designated engineering representatives (DERs) (in aviation), the German Technischer Überwachungsverein (in industrial applications), etc. In answering the question of what is certified, three types of subjects are mentioned:

• a process, to ensure that certain processes were followed during software development

• software engineering professionals, which should be referred to as “professional licensing”, and

• the software itself, to certify that it will behave correctly in use, referred to as “product certification”.

Two views of product certification are mentioned: (1) fitness for purpose, and (2) compliance with requirements, which are equivalent only if the requirements define exactly what the software is intended to do. Potential benefits and shortcomings of certification are mentioned, and the main recommendation is made that “the certification of embedded systems should be based around the concept of dependability case” (as, for example, in CAP 670 SW01 [4]). The dependability case should be based on a claims/arguments/evidence framework including the following components (a small illustrative data-structure sketch follows the list):

• certification requirements in terms of claims about the system and its attributes, expressed in terms of respective models (at the system or lower levels)

• evidence that supports the claims, and

• an explicit set of arguments that provides a link from the evidence to the claims.
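
To illustrate the claims/arguments/evidence structure in code, the following minimal C sketch models a dependability case as linked records. The field names and the example claim are hypothetical and are not taken from the NATO report; the sketch only shows how claims, arguments, and evidence reference each other.

#include <stdio.h>

/* Minimal, hypothetical data model of a claims/arguments/evidence
   (dependability case) structure; not the NATO report's notation. */
typedef struct {
    const char *id;
    const char *description;     /* e.g., a test report or analysis result */
} Evidence;

typedef struct {
    const char *rationale;       /* why the evidence supports the claim */
    const Evidence *evidence;    /* supporting evidence items */
    int evidence_count;
} Argument;

typedef struct {
    const char *claim;           /* certification requirement stated as a claim */
    const Argument *arguments;
    int argument_count;
} Claim;

int main(void) {
    Evidence ev[] = {
        { "EV-1", "Requirements-based test results" },
        { "EV-2", "Worst-case execution time analysis report" }
    };
    Argument arg = { "Testing and timing analysis together support the claim",
                     ev, 2 };
    Claim claim = { "The control task meets its deadline under all loads",
                    &arg, 1 };

    printf("Claim: %s\n", claim.claim);
    for (int i = 0; i < claim.argument_count; i++) {
        printf("  Argument: %s\n", claim.arguments[i].rationale);
        for (int j = 0; j < claim.arguments[i].evidence_count; j++)
            printf("    Evidence %s: %s\n",
                   claim.arguments[i].evidence[j].id,
                   claim.arguments[i].evidence[j].description);
    }
    return 0;
}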

The conclusion of this report states that process certification should only support confidence in the evidence, but the overall emphasis of certification should be on the product.

3 Software certification according to DO-178B

There are three essential categories of software that impact the certification process, due to their different functionality: real-time operating systems (RTOS), programming languages (with their compilers), and development tools.

3.1 Real-time operating systems

There is an evident trend to adapt RTOS kernels to the increasing scrutiny of regulatory demands. Shortly after the DO-178B document was issued, vendors quickly “jumped on the bandwagon” and attempted to comply with the requirements of DO-178B, claiming certifiability. Examples of “certifiable” RTOS include VxWorks from Wind River Systems [20–22], LynxOS from LynuxWorks, Integrity from Green Hills Software, Linux and RTLinux, RTEMS, and microC.

Romanski reports [20] on certification efforts for VxWorks that started in 1999. At the start of the project, the specifications, documentation, and source code were all analyzed to determine which features needed to be removed or changed to support certification. The analysis showed that the core operating system with many of the support libraries could be certified, with some restrictions, for example, on memory allocation/deallocation functions. The process was largely automated, with a database and CD-ROM materials deliverable to the auditors. Further, Fachet [21] reports on the VxWorks certification process to meet the criteria of IEC 61508 [23], and Parkinson and Kinnan [22] describe the entire development platform for a specific version of the kernel, VxWorks 653, to be used in integrated modular avionics.

Not much information, except articles in trade magazines, is available on other real-time kernels. A commercial press article on LynxOS-178 [24] claims that it has been verified to DO-178B Level A and used by avionics manufacturers in safety-critical systems. The same is true about Green Hills’ Integrity-178B real-time kernel [25,26], claiming protection both in the time domain and in the space domain, although not much scientific evidence has been published.

Locke [27] considers Linux as a candidate for DO-178B certification, but no evidence has been provided, at the time of this writing, that any Linux version has been certified or verified to DO-178B requirements. Nevertheless, the author suggests the steps which could lead to such certification. They would rely mostly on generating the paper trail, that is, the respective requirements and design documents, including test plans and procedures, and completing the test and verification processes.

Applying the definition of certification as a “procedure by which a third party gives written assurance that a product, process or service conforms to specified requirements”, Moraes et al. [28] use the risk assessment technique failure mode and effect analysis (FMEA) to create a metric and analyze data for two kernels, RTLinux and RTEMS. The analysis shows that if the threshold to certify the software is set to an estimated risk lower than 2.5%, only RTEMS would be certified.
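
The details of the metric used in [28] are not reproduced in this paper. Purely as a hedged illustration of how an FMEA-style risk figure might be aggregated and compared against a certification threshold, the following C sketch uses invented failure modes and ratings.

#include <stdio.h>

/* Illustrative only: invented failure modes and ratings, not the data
   or the exact metric of Moraes et al. [28]. Each failure mode carries
   an estimated probability and a severity weight in [0,1]; the overall
   risk is the probability-weighted severity, compared to a threshold. */
typedef struct {
    const char *mode;
    double probability;   /* estimated probability of the failure mode */
    double severity;      /* normalized severity weight, 0..1 */
} FailureMode;

static double estimated_risk(const FailureMode *fm, int n) {
    double risk = 0.0;
    for (int i = 0; i < n; i++)
        risk += fm[i].probability * fm[i].severity;
    return risk;
}

int main(void) {
    FailureMode kernel_fm[] = {
        { "missed deadline under load", 0.010, 0.9 },
        { "priority inversion",         0.005, 0.7 },
        { "memory exhaustion",          0.002, 1.0 }
    };
    const double threshold = 0.025;   /* 2.5%, as in the cited study */
    double risk = estimated_risk(kernel_fm, 3);

    if (risk < threshold)
        printf("Estimated risk %.3f is below the threshold\n", risk);
    else
        printf("Estimated risk %.3f is above the threshold\n", risk);
    return 0;
}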

Interestingly, a well-described process of selecting an RTOS according to DO-178B guidelines led to the choice of the microC/OS kernel [29], a relatively little-known although well-documented RTOS, available for many years but not much advertised [30]. Verification of this RTOS was contracted to an independent organization, and all requirements-based tests were completed in 2003.

Overall, various authors place a number of limitations on an RTOS certifiable to DO-178B, due to conflicts with the required deterministic behavior. Romanski [31] lists restrictions on functionality, resource use (heap and stack), and time-related issues (cache memory, pipeline processing and co-processors). Medoff [32] adds more items to the list, such as creation of processes, memory allocation, etc.
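
To make the memory allocation restriction concrete, the following minimal C sketch shows a common workaround: a statically allocated, fixed-size block pool used instead of malloc/free, so that allocation time is bounded and there is no heap fragmentation. The pool size and the message type are hypothetical and not taken from any of the cited kernels.

#include <stddef.h>
#include <stdio.h>

/* A minimal fixed-size block pool: all storage is allocated statically
   at build time, so there is no heap fragmentation and allocation time
   is bounded. Sizes and the Message type are illustrative only. */
#define POOL_BLOCKS 8

typedef struct { unsigned id; unsigned char payload[32]; } Message;

static Message pool_storage[POOL_BLOCKS];
static Message *free_list[POOL_BLOCKS];
static size_t free_count;

void pool_init(void) {
    free_count = POOL_BLOCKS;
    for (size_t i = 0; i < POOL_BLOCKS; i++)
        free_list[i] = &pool_storage[i];
}

Message *pool_alloc(void) {            /* O(1), never calls malloc */
    return (free_count > 0) ? free_list[--free_count] : NULL;
}

void pool_free(Message *m) {           /* O(1) return to the pool */
    if (m != NULL && free_count < POOL_BLOCKS)
        free_list[free_count++] = m;
}

int main(void) {
    pool_init();
    Message *m = pool_alloc();
    if (m != NULL) {
        m->id = 1;
        printf("allocated message %u from static pool\n", m->id);
        pool_free(m);
    }
    return 0;
}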

3.2 Programming languages

A similar trend among vendors is visible in the area of programming languages and compilers. In an earlier article, Halang and Zalewski [33] presented an overview of programming languages for use in safety-related applications up to 2002, focusing on PEARL, originated and predominantly used in Germany. Their observation with respect to DO-178B and other standards is that “because verification is the main prerequisite to enable the certification of larger software-based solutions, only serious improvements aiming to support the process of program verification will be a step in the right direction.”

There are essentially three contenders among languages used in safety-critical systems: Ada, C/C++ and Java, for which DO-178B certifiability is claimed. The most advanced in this respect seems to be Ada, whose certification attempts go back to the eighties, with roots in compiler validation [34].

3.2.1 Ada and compiler certification

Santhanam [35] answers the question of what it means to qualify a compiler tool suite per DO-178B requirements, and lists the requirements on the object code and the development process, estimating the overwhelming cost of providing evidence. Therefore, defensive techniques are advocated to assure confidence in compiler correctness, with the use of assertions, optimizations turned off, no suppression of run-time checks, avoidance of nested subprograms, etc.
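
The defensive techniques listed above are largely language-independent. As a hedged illustration (written here in C rather than Ada, and not taken from [35]), the sketch below keeps explicit assertions and range checks active in the delivered code rather than relying on compiler optimizations or suppressed run-time checks; the scaling function and its limits are hypothetical.

#include <assert.h>
#include <stdio.h>

/* Illustrative defensive coding: explicit precondition assertions and
   range checks remain in the code, so that confidence does not rest
   solely on the compiler's own checking. The function is hypothetical. */
#define SENSOR_MIN   0
#define SENSOR_MAX   1023
#define OUTPUT_LIMIT 100

int scale_sensor_reading(int raw) {
    /* Precondition checked explicitly, not assumed away. */
    assert(raw >= SENSOR_MIN && raw <= SENSOR_MAX);

    int scaled = (raw * OUTPUT_LIMIT) / SENSOR_MAX;

    /* Explicit range check on the result before it is used. */
    if (scaled < 0)            scaled = 0;
    if (scaled > OUTPUT_LIMIT) scaled = OUTPUT_LIMIT;
    return scaled;
}

int main(void) {
    printf("scaled = %d\n", scale_sensor_reading(512));
    return 0;
}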

Features of the object model of Ada 2005 are claimed to be “well suited for applications that have to meet certification at various levels” [36]. It meets the safety requirement, which means that programmers are able “to write programs with high assurance that their execution does not introduce hazards” [37], in order “to allow the system to be certified against safety standards”, such as DO-178B. However, the common opinion, expressed by the same authors, who actually developed compilers, is that compilers “are far too complex to be themselves certified” [36,37].

One version of Ada, which makes use of a severely limited subset of the language, named SPARK, seems to have gained some popularity in safety-critical applications because of the existence of its formal definition. Amey et al. [38] report on multiple applications of SPARK in industry, including one to DO-178B Level A.

3.2.2 C/C++ certification issues

In the C/C++ world, there have not been many reports on successful uses of these languages in safety-critical applications that would pass or be aimed at any certification efforts. The languages are widely criticized for having too many features not necessarily suitable for safety-critical systems.

Hatton [39] gave an overview of safer C subsets and MISRA C in particular, following his crusade to make C a safer language. His premise was that “C is the perfect language for non-controversial safer subsetting as it is known to suffer from a number of potential fault modes and the fault modes are very well understood in general.” He analyzed the standards with respect to style-related rules, divided further into rules based on “folklore” and those based on known failures. He observed that “MISRA C does not address all known fault modes, and does not incorporate the full range of analysis checks that it might.”

One interesting result of this work is the development of a so-called signal-to-noise ratio of a language standard, which reflects how well serious defects can be seen “in a wealth of other less useful information.” Hatton also wrote an interesting paper comparing the newer MISRA C standard to the previous version [40]. One of the early papers on C from the perspective of safety-critical systems was published by Lindner [41].
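
As a brief illustration of the kind of C fault modes such subsets target (a generic example, not a citation of any specific MISRA C rule), the fragment below contrasts a switch statement most safer-C checkers would flag with a rewrite they would accept.

#include <stdint.h>
#include <stdio.h>

/* Generic illustration of C fault modes targeted by safer subsets:
   an unintended fall-through (missing break) and a missing default,
   so some inputs silently take a value from the wrong branch. */
uint8_t decode_mode_unsafe(int raw) {
    uint8_t mode = 0U;
    switch (raw) {
    case 1: mode = 10U;           /* falls through into case 2 */
    case 2: mode = 20U; break;
    }
    return mode;
}

/* Subset-friendly rewrite: every case terminated and a default case
   giving defined behavior for all inputs. */
uint8_t decode_mode_safe(int raw) {
    uint8_t mode;
    switch (raw) {
    case 1:  mode = 10U; break;
    case 2:  mode = 20U; break;
    default: mode = 0U;  break;
    }
    return mode;
}

int main(void) {
    /* The unsafe version returns 20 for input 1; the safe version, 10. */
    printf("unsafe(1)=%u safe(1)=%u\n",
           (unsigned)decode_mode_unsafe(1), (unsigned)decode_mode_safe(1));
    return 0;
}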

Despite the enormous popularity of C++, the number of C++ applications in avionics is relatively low, perhaps due to the multitude of known language problems. Subbiah and Nagaraj [42] report on the issues with C++ certification for avionics systems, focusing on structural coverage, whose intent is “to ensure that all output of the compiler is tested during the execution of the requirement-based tests, so as to preclude the possibility that some instruction or data item produced by the compiler is first depended upon during operation.” They discuss experiences from the verification of two software components, concentrating on object-oriented issues, such as class inheritance, inline functions and encapsulation.

Finally, it is worth mentioning an older comparison of C and Ada, in a still valid dispute between academics and practitioners [43,44]. Certification at both the language and RTOS level is discussed in a paper by Parkinson and Gasperoni [45].

3.2.3 Java on its way to certification

It would seem that Java, due to its inherent properties, such as automatic garbage collection, is not suitable for real-time safety-critical systems. Nevertheless, work is under way to define an appropriate Java profile and leverage the language to meet the required safety criteria. Nilsen [46,47] argues that in hard real-time systems, due to their continuous interaction with the environment, which imposes unpredictable behavior on such a system, “the validity of each independent component cannot be demonstrated in isolation from its context.” This imposes a constraint, expressed in guidelines such as DO-178B, that the certification evidence should be applied to complete systems rather than to individual components. Taking into account the arrival of new computing technologies and the need to reduce development costs by software reuse, work has begun on updating DO-178B, with the expectation to allow certification of individual components. In this view, work on the real-time specification for Java (RTSJ) [48] addresses issues such as safe stack allocation of objects, atomic synchronization locks, static analysis of CPU time, and others [46].

However, RTSJ does not seem to lead in a straightforward manner to implementations that would be verifiable for safety-critical systems and the respective certifications [49]. Therefore, work on a new profile for safety-critical Java (SCJ) has been initiated. The assumption is that the new profile should be compact and simple to use when modeling an application, but still powerful. The current proposal includes, for example, thread classes for periodic time-triggered and for sporadic event-triggered activities, and classes for access to physical memory and time, but also has a number of features hidden from the programmer or implemented in analysis tools, to provide platform independence. It is based on the well-known Ravenscar profile, originally developed for Ada and recently tailored toward Java [50].
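
For readers unfamiliar with the periodic time-triggered pattern that the SCJ thread classes encapsulate, the following hedged C sketch (POSIX, not Java, and not part of the SCJ proposal) shows the underlying idea: a task released at fixed absolute points in time rather than after relative delays, which avoids cumulative drift. The period and iteration count are illustrative.

#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

/* Sketch of a periodic, time-triggered activity using absolute release
   times, the pattern that SCJ periodic handlers provide at the language
   level. Period and iteration count are illustrative only. */
#define PERIOD_NS 100000000L   /* 100 ms */

static void control_step(int i) {
    printf("periodic release %d\n", i);   /* placeholder for real work */
}

int main(void) {
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int i = 0; i < 5; i++) {
        control_step(i);

        /* Advance the absolute release time by one period... */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        /* ...and sleep until that absolute time. */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}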

Recently, a couple of industry reports have been published on the use of Java in safety-critical applications in avionics [51–53]. Dautelle [51] reports on developing a test suite for RTSJ, which includes measuring such application parameters as latency (context switch and dispatch latencies), execution jitter (caused by various factors: concurrent garbage collection, low-priority thread cache effects, etc.), and network distribution delays. Compliance with the RTSJ standard is sought, with a focus on performance, but no certification is mentioned.

Hu et al. [52] describe the current version of the hard real-time profile for (safety-critical) Java, focusing on such issues as program initialization and class loading, memory management, and scheduling. They discuss a case study of a flight management and guidance system and its implementation, and present some conclusions from the experiments. One of the results states that “the language promises significant productivity improvements for the late development phases, even though the expected benefits over a complete DO178-compliant development cycle have still to be assessed.” They also mention “the lack of qualification cases compatible with Airworthiness Authorities expectations” regarding the Java compiler.

Armbruster et al. [53] report on an implementation of RTSJ and its use in the control software for the ScanEagle UAV. They discuss extensively their experiences with implementing priority scheduling, priority inheritance, scoped memory, garbage collection, and real-time scope-aware class libraries, and report on the challenges faced, among which the most significant involved the RTSJ memory management model (scoped memory). The software reportedly passed Boeing’s internal qualification test, but the main concern was about the level of maturity of tools and vendor support.

Brosgol and Wellings [54], in their comparison of real-time Java with Ada, address language features against the following four general criteria: reliability, predictability, analyzability and expressiveness. They analyze both languages with respect to high-level features, encapsulation, object orientation, generics, in-line expansion, run-time support facilities, and optimizations, and come up with a list of advantages and disadvantages, characterizing each language’s suitability for safety-critical applications.

4 Software development tools

4.1 Tool qualification according to DO-178B

Regarding the use of tools, the FAA recently released a comprehensive report on “Assessment of Software Development Tools for Safety-Critical Real-Time Systems” [55], which has been summarized in [56] and briefed in [57] regarding tool qualification. The experimental part of this work involved collecting data from the usage of six software design tools (as opposed to verification tools [58]) in a small-scale software development project, regarding four software quality criteria. Assuming these criteria were direct metrics of quality, the following specific measures to evaluate them were defined and used in the experiments:

• usability, measured as development effort (in hours)

• functionality, measured via a questionnaire (on a 0–5 point scale)

• efficiency, measured as code size (in lines of code, LOC)

• traceability, measured by manual tracking (in number of defects).

The selected tools included three from the structural (object-oriented) category and three from the functional (block-oriented) category, with one of them actually crossing the boundary of the two categories. The conducted process of tool assessment can be characterized by the following four activities:

• application of a domain-specific benchmark problem and platform on which the tool will be evaluated;

• identification of specific criteria (metrics such as usability, traceability, etc.), against which the tool will be evaluated;

• development of a measurement method to evaluate each criterion (metric);

• collection and analysis of results.

Several conclusions were drawn from this study, including the following:

• the criteria used gave a relatively good assessment of tool quality, although they may not scale up well in a larger-scale project

• traceability is the critical element of tool assessment, but has to be automated, which may be difficult to implement in actual development processes

• several other important criteria, which were not used in this study, such as reliability or robustness, may contribute to a significantly more insightful evaluation of a given tool, but the collection of respective experimental data may not be practical.

This study was followed by the development of a methodology for numerical tool evaluation using Bayesian belief networks [59].
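
The Bayesian belief network methodology of [59] is not reproduced here. As a hedged toy illustration of the underlying idea only, the C sketch below updates the belief in a two-state “tool quality” node from a single observed evidence node, using invented prior and conditional probabilities.

#include <stdio.h>

/* Toy two-node belief network: Quality (good/poor) -> Evidence
   (few observed traceability defects: yes/no). All probabilities
   are invented for illustration; this is not the model of [59]. */
int main(void) {
    double prior_good = 0.6;                 /* P(Quality = good) */
    double prior_poor = 1.0 - prior_good;

    double p_few_given_good = 0.8;           /* P(few defects | good) */
    double p_few_given_poor = 0.3;           /* P(few defects | poor) */

    /* Observation: the evaluated tool showed few traceability defects.
       Apply Bayes' rule to update the belief in the tool's quality. */
    double evidence = p_few_given_good * prior_good
                    + p_few_given_poor * prior_poor;
    double posterior_good = p_few_given_good * prior_good / evidence;

    printf("P(good tool | few defects) = %.3f\n", posterior_good);
    return 0;
}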

A good number of articles by other authors have been written on tool verification, qualification and certification attempts, especially dealing with how the respective processes must address the requirements of DO-178B. In particular, a decision must be made whether tool qualification is necessary (see Fig. 1).

Fig. 1 Tool qualification conditions according to DO-178B [1]

A tool is categorized as a development tool if it can insert an error into the airborne system, or as a verification tool if it may only fail to detect an error. In the following, we try to cover issues related to software verification tools.

For verification tool qualification, several interesting papers have been published in the last few years. As Dewar and Brosgol [60] point out in their discussion of static analysis tools for safety certification, a tool as fundamental as the compiler can certainly be treated as a development tool, but also as a verification tool, since compilers “often perform much more extensive tasks of program analysis.” As a perfect counterexample they refer to SPARK’s Examiner, which is not a usual kind of compiler, because it does not generate code at all. It is only used for checking the program, yet it is part of a software development process. Furthermore, they ask whether the tools should “be certified with the same rigorous approach that is used for safety-critical applications?” Their answer is that this is not practical, and they support this view by stating that even “the compilers themselves are out of reach for formal safety certification, because of their inherent complexity.”

Talking about tools that are not involved in code generation but only do static analysis of code, they point out that there is a significant problem with this approach, since “the certification evidence is based on the system’s static source text but needs to relate the system’s dynamic, run-time, behavior.” An essential function of a static analysis tool should therefore be the analysis of stack usage, and they describe a couple of such tools from SofCheck and GrammaTech. Since a stack analysis tool does not generate new code, it can be qualified as a verification tool in the DO-178B sense.
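
As a hedged illustration of what such a stack-usage analysis computes, the following C sketch determines the worst-case stack depth by walking a hand-coded, acyclic call graph with invented frame sizes; it is a simplified model, not the algorithm of the commercial tools mentioned above.

#include <stdio.h>

/* Simplified worst-case stack analysis over a hand-coded, acyclic call
   graph. Frame sizes (bytes) and the call graph itself are invented. */
#define NFUNC 5
#define MAXCALLEES 3

static const char *name[NFUNC] = { "main", "control", "read_sensor",
                                   "filter", "log_event" };
static const int frame[NFUNC]  = { 64, 96, 48, 80, 32 };
/* callee index lists, terminated by -1 */
static const int callees[NFUNC][MAXCALLEES] = {
    { 1, 4, -1 },    /* main     -> control, log_event */
    { 2, 3, -1 },    /* control  -> read_sensor, filter */
    { -1, -1, -1 },  /* read_sensor: leaf */
    { 4, -1, -1 },   /* filter   -> log_event */
    { -1, -1, -1 }   /* log_event: leaf */
};

/* Worst-case stack depth of a function = its own frame plus the largest
   worst-case depth among its callees (the graph must be acyclic). */
static int worst_case(int f) {
    int deepest = 0;
    for (int i = 0; i < MAXCALLEES && callees[f][i] >= 0; i++) {
        int d = worst_case(callees[f][i]);
        if (d > deepest)
            deepest = d;
    }
    return frame[f] + deepest;
}

int main(void) {
    printf("worst-case stack from %s: %d bytes\n", name[0], worst_case(0));
    return 0;
}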

Dewar supports this view in another article [61], elaborating more on the tools for static analysis of such properties as schedulability, worst-case timing, freedom from race conditions, freedom from side effects, etc. He also offers his views on the use of testing, object-oriented programming, dynamic dispatching, and other issues in developing safety-critical systems. He elaborates on the role of the DERs, whose job is to work with software development companies and the certification authorities on qualification and certification issues, stating that DERs “are the building inspectors of the software engineering industry.”

In other articles on static analysis tools for potential qualification, Anderson [62] explains how such advanced tools perform the analysis, and Gasperoni [63] discusses code coverage tools that tell the developer “which portions of the application are really being executed.” He refers to free software coverage tools that could be used for safety-critical software projects at all levels of criticality. In turn, Santhanam [64] describes a toolset called test set editor (TSE), which automates the compiler testing process and, working in combination with an Excel spreadsheet and homegrown scripts in Tcl/Tk, significantly contributes to cost savings in constructing structural tests to satisfy FAA certification requirements.

4.2 Model-based development and tool qualification

Frey and Stürmer examine the issue of code generation for safety-critical systems in the automotive industry [65]. They refer to such tools as Simulink/Stateflow and Real-Time Workshop Embedded Coder from The MathWorks, which are in widespread use for model-based development. Graphical representation of a model encourages automatic code generation; thus, designers and developers “are faced with the question as to whether tools used for automating manual transformation work need to be qualified or somehow validated in accordance with a defined procedure.” The automotive industry, however, uses a derivative version of IEC 61508 [66], which is not as specific about tool qualification as DO-178B.

A series of other papers discuss the use of MathWorks tools in model-based development for safety-critical systems, and their certification according to IEC 61508-3. Conrad [67] notices that IEC 61508 was developed a while back for traditional, hand-coded, software processes and “does not address popular development technologies such as production code generation directly.” Therefore, he proposes an enhancement of the standard to address model-based design. In his view, the software safety integrity tables in IEC 61508 can be used to provide a project-independent mapping of recommended techniques and measures onto tools, such as the Simulink family of products and related processes. These table templates can then be tailored to the needs of a particular project and submitted to the certification authority as part of the compliance demonstration process.

Erkkinen [68] gives a broader background of the approach taken by MathWorks in production code generation for safety-critical systems, and Porter [69] discusses in more detail how this approach addresses the certification requirements of DO-178B. He refers in particular to the model coverage tool that can provide the following information: cyclomatic complexity, decision coverage, condition coverage, modified condition/decision coverage (MC/DC), lookup table and signal range coverage.
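
To make the MC/DC criterion concrete, the short C example below (a generic illustration, not taken from [69]) shows a three-condition decision together with a minimal test set in which each condition is shown to independently affect the decision outcome.

#include <stdbool.h>
#include <stdio.h>

/* Decision with three conditions: (A && B) || C.
   A minimal MC/DC test set needs n+1 = 4 cases; in the set below,
   rows 1/2 toggle only A, rows 1/3 toggle only B, rows 4/3 toggle
   only C, and each toggle changes the decision outcome. */
static bool authorize(bool a, bool b, bool c) {
    return (a && b) || c;
}

int main(void) {
    struct { bool a, b, c, expected; } tests[] = {
        { true,  true,  false, true  },  /* 1 */
        { false, true,  false, false },  /* 2: differs from 1 in A only */
        { true,  false, false, false },  /* 3: differs from 1 in B only */
        { true,  false, true,  true  }   /* 4: differs from 3 in C only */
    };
    for (int i = 0; i < 4; i++) {
        bool got = authorize(tests[i].a, tests[i].b, tests[i].c);
        printf("case %d: %s\n", i + 1,
               got == tests[i].expected ? "pass" : "fail");
    }
    return 0;
}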

Since model-based development is applied more and more widely in safety-critical systems, all types of tools receive greater scrutiny in view of qualification and certification. Several recent papers discuss such issues as the impact of model-based development on certification [70], model-based testing of code generators [71], verification of model processing tools (syntax checking tools, simulation tools, analysis tools, and synthesis/code generation tools) [72], and the application of the model-based approach to the validation and verification of flight-critical software for UAVs [73]. Denney and Trac [74] discuss an interesting approach to automatic code generation, using a tool called AutoCert, which “supports certification by automatically verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.” Their approach, adapted and tested for MathWorks Real-Time Workshop, is general in principle and does not rely on the correctness of the code generator or any of its components.

4.3 Other tool qualification issues

Several other papers related to tool certification or qualification are worth mentioning. Zoffman et al. [75] present a general classification scheme for software verification tools, which are divided into several categories based on (1) objectives (such as test coverage, code compliance, software performance), (2) methods (such as unit testing, fault injection, etc.), (3) metrics (decision coverage, condition coverage, cyclomatic complexity, etc.), and (4) attributes. The classification, although interesting in itself and referring to DO-178B in the text of the paper, does not provide much insight into certification issues. Bunyakiati et al. [76] propose another classification, based on a three-dimensional model, involving product, process and personnel certification on one axis, first-, second- and third-party certification on the second axis, and the functional architecture of the tools on the third. This model, although very interesting, does not make any reference to safety-critical software, so it is not clear what benefits it would bring if applied in this domain.

A recent FAA report [58] provides an overview of the verification tools available up to the time of the report’s publication. One tool not covered in this report, Astrée, is described in [77]. It is a parametric, abstract interpretation-based, static analyzer that aims at proving the absence of run-time errors in safety-critical avionics software written in C. The authors, representing Airbus, claim that they succeeded in using the tool on a real-size program “as is”, without altering or adjusting it before the analysis. Other issues addressed with this tool, although not described in the paper, include: assessment of worst-case execution time, safe memory use, and precision and stability of floating-point computations. In all that, automatically generated code should be subjected to the same verification and validation techniques as hand-written code.

It may also be worth noting that all established tool vendors have been addressing the DO-178B issues for some time now. One such interesting example is McCabe Software [78]. Their document provides a summary of McCabe IQ tool functionality and explains how the tool can be used to support the DO-178B guidelines. Several other vendors do the same, and a current list of safety-critical software tools can be found on the web [79].

Finally, to conclude the section on software tools, it is worth noting that press releases make it evident that vendors providing safety-critical software at various levels have begun joining forces to increase their mutual market value. Several recent examples of such alliances include the following. Wind River Systems and Esterel Technologies created a partnership integrating the VxWorks 653 platform and the SCADE tool to provide a solution for rapid and reliable deployment of DO-178B systems. Similarly, LynuxWorks and TTTech joined forces by integrating their products, the LynxOS real-time kernel and the time-triggered protocol (TTP), to address certification according to DO-178B.

5 Tool qualification against DO-254

Since the growing complexity of electronic hardware requires the use of automatic software tools in the design process, the DO-254 [2] document also includes a section on tool qualification. It distinguishes between design tools, which can introduce errors into the product, and verification tools, which do not introduce errors into the product but may fail to detect errors in the product. The qualification process tool vendors have to comply with is shown in Fig. 2.

Fig. 2 Tool assessment and qualification process according to DO-254 [2]

5.1 Overview of related work

In this view, several vendors have recently begun dealing with hardware design tool qualification. Aldec [80] used a sample design of a system containing two connected boards: an Aldec board generating stimuli and collecting results for the design under test (DUT), and a second, user-designed board.

The verification process contains three independent stages: simulation, verification, and comparison. The simulation stage is a typical HDL-level simulation in the Active-HDL simulator. During simulation, stimuli and results are captured to a waveform (in ASDB format) on the specified edge of the user clock. The clock line of the DUT is not stored in the waveform file. For hardware verification purposes, the prototype verification tool (PVT) program is used. It maintains communication with a FIFO, sending test vectors to the DUT and retrieving response data from the DUT. During the verification process, the application continuously performs two tasks: writing stimuli to the FIFO and reading results from the FIFO, which are written to a raw binary file. At the end of verification, the binary results are transformed to an ASDB waveform file. At the comparison stage, the waveform captured during simulation is compared with the one obtained from hardware verification. If there are no differences, the verification has finished successfully.
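
The comparison stage described above is essentially a sample-by-sample check of two captured result streams. The following C sketch outlines that idea only; the file names and the one-byte-per-sample raw format are invented and do not represent Aldec's ASDB format or tooling.

#include <stdio.h>

/* Outline of the comparison stage: read two raw result files sample by
   sample and report mismatches. File names and the byte-per-sample
   format are invented; real flows compare ASDB waveforms. */
int main(void) {
    FILE *sim = fopen("sim_results.raw", "rb");
    FILE *hw  = fopen("hw_results.raw", "rb");
    if (sim == NULL || hw == NULL) {
        fprintf(stderr, "could not open result files\n");
        return 1;
    }

    long sample = 0;
    int cs, ch;
    int mismatches = 0;
    /* Stops at the end of the shorter file; length checks omitted. */
    while ((cs = fgetc(sim)) != EOF && (ch = fgetc(hw)) != EOF) {
        if (cs != ch) {
            printf("mismatch at sample %ld: sim=0x%02x hw=0x%02x\n",
                   sample, cs, ch);
            mismatches++;
        }
        sample++;
    }

    fclose(sim);
    fclose(hw);
    if (mismatches == 0)
        printf("verification finished successfully\n");
    else
        printf("verification failed: %d mismatches\n", mismatches);
    return 0;
}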

Lange [81] addresses circuit metastability in the context of DO-254 tool qualification. Metastability is the term describing what happens in digital circuits when the clock and data inputs of a flip-flop change values at approximately the same time. This leads to the flip-flop output oscillating and not settling to a value within the appropriate delay window; the output of the flip-flop is said to have gone “metastable.” This happens in designs containing multiple asynchronous clocks, when two or more discrete systems communicate. Metastability is a serious problem in safety-critical designs as it causes intermittent failures. A comprehensive verification solution is offered by the Mentor Graphics 0-In clock domain crossing (CDC) tool, which essentially does the following:

• Performs a structural analysis on the RTL code to identify and analyze all signals crossing clock domains, and determine if their synchronization schemes are present and correct.

• Verifies transfer protocols to assure that the synchronization schemes are used correctly, by monitoring and verifying that protocols are being followed during simulation.

• Globally checks for reconvergence, which is most effectively done by injecting the effects of potential metastability into the simulation environment and determining how the design will react.

The 0-In CDC tool provides added assurance that the design will function correctly within the intended system. If one has a specific requirement from the customer or a DER to verify the CDCs and identify and eliminate instances of metastability, then one has to use a method of tool assessment. Again, the one suggested is the Independent Output Assessment.

Another tool from Mentor Graphics, ModelSim, is discussed by Lange [82] in view of meeting the DO-254 criteria. It is considered a verification tool, because it is used for digital simulation of directed test cases and provides coverage data, without generating the code to be used in the production circuit. The paper outlines the exact ten steps to go through the DO-254 assessment and qualification process, as presented in Fig. 2. The suggested way to proceed with tool assessment is to avoid qualification by using an independent output assessment method (Step 3 in Fig. 2). For ModelSim this can be done as follows:

• reviewing whether RTL simulation outputs match synthesized (gate-level) simulations; although this alone does not provide an independent assessment, if the outputs of these two simulations match, then the likelihood of an error in ModelSim is extremely low

• applying some of the same tests on the physical device and checking the results against earlier simulations; if the results for the actual physical circuit match the verification results for the model, then one can logically conclude that the tool, ModelSim, is correctly simulating the model.

According to the author, the above two steps form the basis for meeting the DO-254 tool assessment criteria.

TNI [83] presents Reqtify, a tool for requirement traceability, impact analysis and automated documentation generation. Reqtify provides the following functionality: requirement coverage analysis, upstream and downstream impact analysis, requirement change, update and deletion tracking throughout the project life cycle, requirement attribute handling, filtering and display depending on these attributes, user-configurable documentation generation, and regression analysis. The technical note presents how Reqtify complies with the DO-254 objectives.

According to the DO-254 classification, Reqtify is a verification tool, as it is a tool “that cannot introduce errors, but may fail to detect an error in the hardware item or hardware design.” Prior to the use of the tool, a tool assessment should be performed. The purpose of tool assessment and qualification is to ensure that the tool is capable of performing the particular verification activity to an acceptable level of confidence for which the tool will be used. It is only necessary to assess those functions of the tool used for a specific hardware life cycle activity, not the entire tool. The assessment activity focuses as much or more on the application of the tool as on the tool itself. A verification tool only needs to be qualified if the function that it performs is not verified by another activity. The flow chart from DO-254 is applied; it indicates the tool assessment considerations and activities and provides guidance for when tool qualification may be necessary.

Dellacherie et al. [84] described a static formal approach that could be used, in combination with requirements traceability features, to apply formal methods in the design and verification of hardware controllers supporting such protocols as ARINC 429, ARINC 629, MIL-STD-1553B, etc. Their paper describes the application of a formal tool, imPROVE-HDL, in the design and verification of airborne electronic hardware developed in a DO-254 context. imPROVE-HDL is a formal property checker that complements simulation in performing exhaustive debugging of VHDL/Verilog register-transfer-level hardware models of complex avionics protocol controllers without the need to create testbenches. The Reqtify tool is used to track the requirements throughout the verification process and to produce coverage reports. According to the authors, using imPROVE-HDL coupled with Reqtify, avionics hardware designers can assure that their bus controllers meet the most stringent safety guidelines outlined in DO-254.

5.2 Tool questionnaire

To identify issues and concerns in tool qualification and certification, and to help understand the underlying problems, we conducted a survey to collect data on the experiences and opinions concerning the use of programmable logic tools as applied to design and verification of complex electronic hardware according to the RTCA DO-254 guidelines. The objective was to collect feedback, from industry and certification authorities, on assessment and qualification of these tools.

The questionnaire was developed and distributed during the 2007 National FAA Software & Complex Electronic Hardware Conference in New Orleans, Louisiana, in July 2007, attended by over 200 participants. In subsequent months, we also distributed this questionnaire to the participants of other professional events. It has been made available via the DO-254 Users Group website (http://www.do-254.org/?p=tools). As a result of these activities, a sample of nearly forty completely filled responses was received. Even though this may not be a fully statistically valid sample, the collected results allow for several interesting observations.

The survey population, by type of organization, included a majority of respondents from avionics or engine control developers (65%). Over 95% of respondents have a technical background (55% bachelor and 45% master degrees) and over 72% have an educational background in electronics. While 97% of respondents have more than 3 years of experience, 59% have more than 12 years. The most frequent respondents’ roles relevant to complex electronics tools include:

• use of the tools for development or verification of systems (62%)

• managing and acting as the company’s designated engineering representative (26%)

• development of the tools (2%)

• development of components (12%).

The respondents’ primary interest was divided betweenverification (32%), development (27%), hardware (22%) andconcept/architecture (18%).

Considering the criteria for the selection of tools for use in DO-254 projects (Fig. 3), the following were reported as the most important: the available documentation, ease of qualification, previous tool use, and host platform, followed by the quality of support, tool functionality, tool vendor reputation, and previous use on an airborne project. Selection of a tool for a project is based either on a limited familiarization with the demo version (50%) or an extensive review and test (40%). The prevailing approach is to review and test the tool by training the personnel and using a trial period on a smaller project.

For those who have experienced the effort to qualify programmable logic tools (only 14% of respondents), the quality of the guidelines is sufficient or appropriate (62%), so

Fig. 3 Tool selection criteria in DO-254 projects (from left to right: vendor reputation, functionality, acquisition cost, compatibility with existing tools, compatibility with development platform, reliability, availability of training, amount of training needed, documentation quality, quality of support, previous familiarity with the tool, performance on internal evaluations, host platform, compatibility with PLDs, previous use on airborne products, tool performance, ease of qualification, other criteria)

is the ease of finding required information (67%), while the increase of workload was deemed negligible or moderate (80%). An interesting observation concerns the scale of safety improvement due to qualification: marginal (43%), moderate (21%), noticeable (7%) and significant (29%). Similarly, the question about errors found in the tools may be a source of concern: no errors (11%), few and minor errors (50%), significant and numerous (17%). Despite all this, the satisfaction level toward programmable logic tools was high: more than 96% of respondents marked their satisfaction level as 4 out of 5.

Overall, it is obvious that software tools used in design and verification of complex electronics in safety-critical applications should be scrutinized because of concerns that they may introduce design errors leading to accidents. However, the conducted survey indicated that the most important criteria for tool selection are considered to be: available documentation, ease of qualification, and previous tool use, none of which is technical.

6 Conclusion

The paper makes an attempt to show the role of software certification and tool qualification in the development of dependable systems, both from the software and hardware perspective. An important observation concerns the increasing role of software tools, which are used to create and verify both software and hardware. An extensive literature review has been presented, focusing on the issues of civil aviation guidance requiring a specified level of assurance for airborne systems.

In view of the conducted analysis, work should be done on developing more objective criteria for tool qualification and on conducting experiments with tools to identify their most vulnerable functions, which may be a source of subsequent design faults and operational errors. As some of the papers point out, the lack of research investment in certification technologies will have a significant impact on the levels of autonomous control approaches that can be properly flight certified, and could limit the capability of future autonomous systems.

Both the DO-178B and DO-254 guidelines serve industry well and promote the rigor and scrutiny required by highly critical systems. However, the relative vagueness of these guidelines causes significant differences in interpretation by industry and should be eliminated. RTCA has formed a new committee, SC-205, charged with revising the DO-178B guidance. Possibly, a common ground should be found between the RTCA DO-254 and DO-178B guidelines.

Acknowledgments The presented work was supported in part by the Aviation Airworthiness Center of Excellence under contract DTFACT-07-C-00010 sponsored by the FAA. Findings contained herein are not necessarily those of the FAA.



References

1. RTCA DO-178B, EUROCAE ED-12B (1992) Software considerations in airborne systems and equipment certification, RTCA Inc., Washington, DC

2. RTCA DO-254, EUROCAE ED-80 (2000) Design assurance guidance for airborne electronic hardware, RTCA Inc., Washington, DC

3. Kesseler E (2004) Integrating air transport elicits the need to harmonise software certification while maintaining safety and achieving security, Report NLR-TP-2004-255. Aerosp Sci Technol J 8(4):347–358

4. CAP 670 Air Traffic Services Safety Requirements (2007) Part B, Section 3, Systems engineering. SW 01 regulatory objectives for software safety assurance in ATS equipment, Safety Regulation Group, Civil Aviation Authority, Norwich, UK

5. NATO (2005) Validation, verification and certification of embedded systems, Report TR-IST-027, NATO RTO Task Group IST-027/RTG-009

6. RTCA DO-278 (2002) Guidelines for communication, navigation, surveillance, and air traffic management (CNS/ATM) systems software integrity assurance, RTCA Inc., Washington, DC

7. NASA (2004) NASA-STD-8739.8 w/Change 1, Software assurance standard, National Aeronautics and Space Administration, Washington, DC

8. IEEE (1992) IEEE Std 610.12 standard glossary of software engineering terminology. IEEE, Washington, DC

9. NASA (2004) NASA-STD-8719.13B w/Change 1, Software safety standard, National Aeronautics and Space Administration, Washington, DC

10. NASA (2004) NASA software safety guidebook, NASA-GB-1740.13. National Aeronautics and Space Administration, Washington, DC

11. Nelson S (2003) Certification processes for safety-critical and mission-critical aerospace software, Report NASA/CR-2003-212806, Ames Research Center, Moffett Field

12. Reifer DJ (1978) Airborne systems software acquisition engineering guidebook for verification, validation and certification, Technical Report ASD-TR-79-5028, TRW Defense and Space Systems, Redondo Beach

13. U.S. Department of Defense (2005) MIL-HDBK-516B, Department of Defense Handbook: Airworthiness Certification Criteria

14. U.S. Department of Defense (2000) MIL-STD-882D, standard practice for system safety

15. Joint Services Computer Resource Management Group (1999) Software system safety handbook: a technical and managerial approach

16. UK Ministry of Defence (2007) Def Stan 00-56 issue 4. Safety management requirements for defence systems

17. Australian Ministry of Defence (1998) DEF(AUST) 5679, the procurement of computer-based safety critical systems, Australian Defence Standard, Army Engineering Agency

18. Cant T, Mahony B, Atchison B (2005) Revision of Australian defence standard DEF(AUST) 5679. In: Proceedings of 10th Australian workshop on safety-critical systems and software, Sydney, August 25–26, pp 85–94

19. Swedish Armed Forces (2005) M7762-000621-7 handbook for software in safety-critical applications

20. Romanski G (2002) Certification of an operating system as a reusable component. In: Proceedings of DASC'02, 21st digital avionics systems conference, Irvine, October 27–31, pp 5.D.3–5.D.1/9

21. Fachet R (2004) Re-use of software components in the IEC-61508 certification process. In: Proceedings of IEE COTS & SOUP seminar, London, October 21, pp 8/1–17

22. Parkinson P, Kinnan L (2007) Safety-critical software development for integrated modular avionics, White Paper, Wind River Systems, Alameda

23. International Electrotechnical Commission (1998) IEC 61508, Functional safety of electrical/electronic/programmable electronic safety-related systems, Parts 1–9. Geneva

24. Rose G (2003) Safety critical software, CompactPCI Systems, April 2003

25. Kleidermacher D, Griglock M (2001) Safety-critical operating systems. Embedded Syst Program 14(10):22–36

26. Kleidermacher D (2004) Operating systems: shouldering the security and safety burden, RTC Magazine, September 2004

27. Locke CD (2003) Safety-critical software certification: open source operating systems less suitable than proprietary? COTS J 5(9):54–59

28. Moraes R et al (2007) Component-based software certification based on experimental risk assessment. In: Proceedings of LADC 2007, 3rd Latin-American symposium on dependable computing, Morelia, Mexico, September 26–28, pp 179–197

29. Maxey B (2003) COTS integration in safety critical systems using RTCA/DO-178B guidelines. In: Proceedings of ICCBSS 2003, 2nd international conference on COTS-based software systems, Ottawa, ON, February 10–13, pp 134–142

30. Labrosse JJ (1993) MicroC/OS-II: the real-time kernel. R&D Books, Lawrence

31. Romanski G (2001) The challenges of software certification. CrossTalk J Def Softw Eng 14(9):15–18

32. Medoff M (2007) Using certified operating systems effectively in safety critical embedded designs. Embed Syst Des. http://www.ghs.com/articles/GHS_certified_safety_critical_3_27_07.pdf

33. Halang W, Zalewski J (2003) Programming languages for use in safety related applications. Ann Rev Control 27:39–45

34. Goodenough JB (1980) The Ada compiler validation capability. ACM SIGPLAN Notices 15(11):1–8

35. Santhanam V (2003) The anatomy of an FAA-qualifiable Ada subset compiler. Ada Lett 23(1):40–43 (Proceedings of SIGAda'02, Houston, Texas, December 8–12, 2002)

36. Comar C, Dewar R, Dismukes G (2006) Certification & object orientation: the new Ada answer. In: Proceedings of ERTS 2006, 3rd embedded real-time systems conference, Toulouse, France, January 25–27

37. Brosgol BM (2006) Ada 2005: a language for high-integrity applications. CrossTalk J Def Syst 19(8):8–11

38. Amey P, Chapman R, White N (2005) Smart certification of mixed criticality systems. In: Proceedings of Ada-Europe 2005, 10th international conference on reliable software technologies, York, UK, June 20–24, pp 144–155

39. Hatton L (2004) Safer language subsets: an overview and case history—MISRA C. Inform Softw Technol 46(7):465–472

40. Hatton L (2007) Language subsetting in an industrial context: a comparison of MISRA C 1998 and MISRA C 2004. Inform Softw Technol 49(5):475–482

41. Lindner A (1998) ANSI-C in safety critical applications: lessons learned from software evaluation. In: Proceedings of SAFECOMP'98, 17th international conference on computer safety, reliability and security, Heidelberg, Germany, October 5–7, pp 209–217

42. Subbiah S, Nagaraj S (2003) Issues with object orientation in verifying safety-critical systems. In: Proceedings of ISORC'03, 6th international IEEE symposium on object-oriented real-time distributed computing, Hakodate, Hokkaido, Japan, May 14–16

43. Berlejung H, Baron W (1996) Aspects of the development of safety-critical real-time software with the C programming language. Softwaretechnik-Trends 16(4):21–25

44. Romanski G, Chelini J (1997) A response to the use of C in safety-critical systems. Softwaretechnik-Trends 17(1):38–43

45. Parkinson P, Gasperoni F (2002) High-integrity systems development for integrated modular avionics using VxWorks and GNAT. In: Proceedings of the 7th Ada-Europe international conference on reliable software technologies, Vienna, Austria, June 17–21, pp 163–178

46. Nilsen K (2006) Leveraging Java to achieve component reusability in safety-critical systems. COTS J 8(4):43–50

47. Nilsen K, Larkham A (2005) Applying Java technologies to mission-critical and safety-critical development. In: Proceedings of 13th safety-critical systems symposium, Southampton, UK, February 8–10, pp 211–223

48. Bollella G et al (2000) The real-time specification for Java. Addison-Wesley, Reading

49. Schoeberl M et al (2007) A profile for safety critical Java. In: Proceedings of ISORC 2007, 10th IEEE international symposium on object/component/service-oriented real-time distributed computing, Santorini Island, Greece, May 7–9

50. Kwon J, Wellings A, King S (2002) Ravenscar-Java: a high integrity profile for real-time Java. Concurrency Comput Pract Experience 17(5–6):681–713

51. Dautelle JM (2005) Validating Java for safety-critical applications. In: Proceedings of AIAA space 2005 conference, Long Beach, 30 August–1 September

52. Hu EYS et al (2006) Safety critical applications and hard real-time profile for Java: a case study in avionics. In: Proceedings of JTRES'06, 4th workshop on Java technologies for real-time and embedded systems, Paris, October 11–13, pp 125–134

53. Armbruster A et al (2007) A real-time Java virtual machine with applications in avionics. ACM Trans Embed Comput Syst 7(1):5:1–5:49

54. Brosgol BM, Wellings A (2006) A comparison of Ada and real-time Java for safety-critical applications. In: Proceedings of Ada-Europe 2006, 11th international conference on reliable software technologies, Porto, Portugal, June 5–9, pp 13–26

55. Kornecki A, Brixius N, Zalewski J (2007) Assessment of software development tools for safety-critical real-time systems, Technical Report DOT/FAA/AR-06/36, Federal Aviation Administration, Washington, DC

56. Kornecki A, Zalewski J (2005) Experimental evaluation of software development tools for safety-critical real-time systems. Innov Syst Softw Eng NASA J 1(2):176–188

57. Kornecki A, Zalewski J (2006) The qualification of software development tools from the DO-178B certification perspective. CrossTalk J Def Softw Eng 19(4):19–23

58. Santhanam V et al (2007) Software verification tools assessment study, Technical Report DOT/FAA/AR-06/54, Federal Aviation Administration, Washington, DC

59. Zalewski J, Kornecki A, Pfister H (2006) Numerical assessment of software development tools in real-time safety-critical systems using Bayesian belief networks. In: Proceedings of IMCSIT'06 international multiconference on computer science and information technology, Wisla, Poland, November 6–10, pp 433–442

60. Dewar R, Brosgol B (2006) Using static analysis tools for safety certification, VMEbus Systems, pp 28–30, April 2006

61. Dewar RBK (2006) Safety critical design for secure systems, EE Times-India, July 2006

62. Anderson P (2008) Detecting bugs in safety-critical code. Dr Dobb's J 406:22–27

63. Gasperoni F (2008) Code coverage: free software and virtualization to the rescue. Boards Syst April:32–35

64. Santhanam U (2001) Automating software module testing for FAA certification. Ada Lett 21(4):31–37 (Proceedings of SIGAda'01, Bloomington, MN, September 30–October 4, 2001)

65. Fey I, Stürmer I (2008) Code generation for safety-critical systems—open questions and possible solutions. In: Proceedings of the SAE World Congress, Detroit, April 14–17, Paper No. 2008-01-0385

66. International Organization for Standardization (2007) ISO 26262 road vehicles—functional safety. Baseline 10

67. Conrad M (2007) Using Simulink and Real-Time Workshop Embedded Coder for safety-critical automotive applications. In: Proceedings of MBEES'07 Workshop on Modellbasierte Entwicklung Eingebetteter Systeme III, Dagstuhl, Germany, January 15–18, pp 41–50; an updated version (for IEC 61508 applications) appears at: http://www.safetyusersgroup.com/

68. Erkkinen T (2004) Production code generation for safety-critical systems. In: Proceedings of the SAE World Congress, Detroit, March 8–11, Paper No. 2004-01-1780

69. Potter B (2008) Model-based design for DO-178B. MATLAB Dig 17(3). http://www.mathworks.com/company/newsletters/digest/2008/may/DO-178B.html

70. Bhatt D et al (2005) Model-based development and the implications to design assurance and certification. In: Proceedings of DASC'05, 24th digital avionics systems conference, Washington, DC, 30 October–3 November

71. Stürmer I et al (2007) Systematic testing of model-based code generators. IEEE Trans Softw Eng 33(9):622–634

72. Sampath P et al (2008) Verification of model processing tools. In: Proceedings of the SAE World Congress, Detroit, April 14–17, Paper No. 2008-01-0124

73. Jaw LC et al (2008) Model-based approach to validation and verification of flight critical software. In: Proceedings of NAECON'08, IEEE National aerospace and electronic conference, Fairborn, July 16–18

74. Denney E, Trac S (2008) A software safety certification tool for automatically generated guidance, navigation and control code. In: Proceedings of NAECON'08, IEEE National aerospace and electronic conference, Fairborn, July 16–18

75. Zoffmann G et al (2001) A classification scheme for software verification tools with regard to RTCA/DO-178B. In: Proceedings of SAFECOMP 2001, 20th international conference on computer safety, reliability and security, Budapest, Hungary, September 26–28, pp 166–175

76. Bunyakiati P, Finkelstein A, Rosenblum D (2007) The certification of software tools with respect to software standards. In: Proceedings of 2007 IEEE international conference on information reuse and integration, Las Vegas, August 13–15, pp 724–729

77. Souyris J, Delmas D (2007) Experimental assessment of Astrée on safety-critical avionics software. In: Proceedings of SAFECOMP 2007, 26th international conference on computer safety, reliability and security, Nuremberg, Germany, September 18–21

78. McCabe Software (2006) DO-178B and McCabe IQ, Warwick, RI

79. Safety Critical Systems Club (2009) Tools directory, London, UK. http://www.scsc.org.uk/tools.html

80. Aldec Corp. (2007) DO-254 hardware verification: prototyping with vectors mode. White Paper, Rev. 1.2, Henderson, Nevada

81. Lange M (2008) Automated CDC verification protects complex electronic hardware from metastability issues. VME Critical Syst 26(3):24–26

82. Lange M (2007) Assessing the ModelSim tool for use in DO-254 and ED-80 projects, White Paper, Mentor Graphics Corp., Wilsonville, May 2007

83. Baghai T, Burgaud L (2006) Reqtify: product compliance with RTCA/DO-254 document, TNI-Valiosys, Caen, France, May 2006

84. Dellacherie S, Burgaud L, di Crescenzo P (2003) imPROVE-HDL: a DO-254 formal property checker used for design and verification of avionics protocol controllers. In: Proceedings of DASC'03, 22nd digital avionics systems conference, Indianapolis, October 12–16, vol 1, pp 1.A.1–1.1-8
