Ensuring tools can perform a design or verification activity to an acceptable level of confidence.
By Michelle Lange and Tammy Reeve, Patmos Engineering Services, and Jacob Wiltgen, Siemens EDA
DO-254, which is required for airborne electronics development, is a design assurance standard. Design assurance requires multiple layers of review and verification within the development process to ensure safe operation of the design being produced. This means when an engineer is doing design work, their work is always being reviewed and verified, usually in numerous ways — depending on the safety criticality as indicated by the design assurance level, or DAL.
When tools automate processes that an engineer would normally perform, then these tools need some checks and balances as well. This is where tool assessment, and in some cases, qualification, fits in. To quote from DO-254, “The purpose of tool assessment and qualification is to ensure that the tool is capable of performing the particular design or verification activity to an acceptable level of confidence for which the tool will be used.”
While tools make amazing designs possible, what happens when engineering teams need to “qualify” these tools? What does that even mean? How much work is it? Is it worth it? These are common questions asked by tool users subject to RTCA/DO-254 compliance. Companies like Siemens that provide tools of great benefit to the goal of safety (such as in the aerospace domain) must understand and support their tools in the context of these programs.
In this article we will describe some of the requirements related to tool qualification specific to the safety-critical programs governed by DO-254 compliance.
DO-254 Section 11.4 presents a flowchart of the “Tool Assessment and Qualification” process (figure 1). In truth, it’s not as straightforward as it could be. In short, project teams must work through the decisions shown in the flowchart.
Fig. 1: DO-254 tool assessment and qualification flow chart.
In examining the description and flow chart, it is clear that tool qualification isn’t always a requirement. Independent assessment and relevant history are often presented instead of qualification. DO-254 describes independent assessment as a method that “verifies the correctness of the tool output using an independent means.” DO-254 describes relevant history as demonstrating that a tool has been “previously used and has been found to produce acceptable results.”
If you are not able to demonstrate independent output assessment and/or relevant history, you must qualify the tool. Depending on the tool type and DAL, it’s either basic tool qualification (for a DAL C design tool or a DAL A/B verification tool) or design tool qualification (for a DAL A/B design tool). Project teams looking for guidance may be disappointed, since DO-254 does not say much about either.
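The decision logic described above can be sketched as a simple function. This is a simplified, illustrative reading of the DO-254 11.4 flow, not a reproduction of the normative flowchart; the function name and string results are hypothetical:

```python
def assess_tool(tool_type: str, dal: str,
                output_independently_assessed: bool,
                relevant_history: bool) -> str:
    """Simplified sketch of the DO-254 11.4 tool assessment decision.

    tool_type: "design" or "verification"
    dal: design assurance level, "A" through "E"
    """
    # Independent assessment of the tool's output avoids qualification.
    if output_independently_assessed:
        return "no qualification needed (independent output assessment)"
    # Demonstrated relevant history also avoids qualification.
    if relevant_history:
        return "no qualification needed (relevant history)"
    # Otherwise the outcome depends on tool type and DAL.
    if tool_type == "design" and dal in ("A", "B"):
        return "design tool qualification"
    if (tool_type == "design" and dal == "C") or \
       (tool_type == "verification" and dal in ("A", "B")):
        return "basic tool qualification"
    # Lower DALs do not require tool qualification.
    return "no qualification needed"

print(assess_tool("design", "A", False, False))
# design tool qualification
```

Note how both escape hatches (independent assessment, relevant history) are checked before any qualification path is reached, mirroring the order of the flowchart.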
For basic tool qualification, DO-254 says, “Establish and execute a plan to confirm that the tool produces correct outputs for its intended application using analysis or testing.” In common practice this means defining the tool functions as “requirements” and testing those requirements to prove the tool works. For design tool qualification, DO-254 points the user to “the strategies described in Appendix B of this document, the tool qualification guidance of RTCA DO-178B / EUROCAE ED-12B for software development tools or other means acceptable to the certification authority.” In other words, the applicant/tool user is left to figure this out on their own, perhaps looking at either the somewhat obscure methods presented in Appendix B or what’s now in DO-330 (a supplement to DO-178C, the newer software guidance document, focusing exclusively on tool qualification). Whatever method is proposed is subject to the scrutiny of the certification authorities, who are notoriously stringent about design tool qualification efforts. (Hint: Don’t despair! Keep reading to learn how to avoid design tool qualification.)
Additionally, ED-80/DO-254 states, “It is only necessary to assess those functions of the tool used for a specific hardware life cycle activity, not the entire tool.” In other words, you only need to assess what you use. As an example, a functional simulator today has a suite of features supporting multiple languages, debug and visualization, and digital and mixed-signal designs. If the design is entirely digital, the features enabling simulation of analog circuits coded in SystemVerilog real number modeling (RNM), Verilog-A, etc., would not be utilized and therefore would not be assessed. In addition, if a tool has both design and verification features, DO-254 allows you to separate those features, which may provide some advantage.
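One practical way to capture this “assess only what you use” scoping is to record each tool feature with a rationale for whether it is in scope on the project. The feature names and rationales below are hypothetical examples for a simulator on an all-digital design:

```python
# Hypothetical sketch: scope a tool assessment to the features actually used
# on the project, per ED-80/DO-254 ("assess those functions of the tool used
# for a specific hardware life cycle activity, not the entire tool").
# A value of None marks a feature as unused and therefore out of scope.

SIMULATOR_FEATURES = {
    "VHDL simulation":          "design is coded in VHDL",
    "SystemVerilog simulation": None,  # not used on this project
    "waveform debug":           "used to analyze failing tests",
    "mixed-signal / Verilog-A": None,  # design is entirely digital
}

# Keep only the features with a usage rationale; these define the
# assessment scope.
in_scope = {f: why for f, why in SIMULATOR_FEATURES.items() if why}

for feature, rationale in in_scope.items():
    print(f"assess: {feature} ({rationale})")
```

Maintaining this mapping as a reviewed artifact makes it easy to show an auditor why unused features were excluded from the assessment.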
DO-254 states, “The tool assessment and qualification process may be applied to either a single tool or a collection of tools.” A tool flow, sometimes referred to as a toolchain, is a set of independent tools deployed together to perform a complex task. In a DO-254 program, the toolchain takes project teams through planning, requirements, design modeling, design creation, verification and validation, and backend processes such as RTL synthesis and place and route. Commonly, the outputs of one tool or set of tools form the inputs to the next tool in the chain. Below is a graphic of a traditional FPGA development flow. At each stage, one or more development tools are deployed to accomplish the objective.
Fig. 2: Sample tool flow.
As shown in figure 2, the development lifecycle consists of a toolchain and often has multiple, overlapping stages of verification to ensure the design does what it’s supposed to. As an example, project teams frequently want to take credit for requirements-based testing at the RTL level of design abstraction. Fortunately, this can be done, but you are required to demonstrate that the testing results at the RTL level are valid. To accomplish this, you run a sufficient subset of the tests in a gate-level simulation environment and/or execute those tests on hardware. Figure 2 shows this concept. These multiple layers of verification of the verification results (which are the tool output) mean that the tool output is “independently assessed,” and therefore qualification is not needed. The trick is to ensure proper evidence, which usually requires tracing verification results (from multiple verification methods/stages) to/from requirements, comparing results from the various verification processes, and documenting all of this.
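The evidence step described above (comparing results from two verification stages, keyed by requirement) can be sketched as a small cross-check. The requirement IDs, result records, and function name are illustrative, not from any real tool:

```python
# Hypothetical sketch: cross-check requirements-based test results from two
# verification stages (e.g., RTL simulation vs. gate-level simulation) to
# document independent assessment of the simulator's output.

def cross_check(rtl_results: dict, gate_results: dict) -> list:
    """Return a list of discrepancy messages; an empty list means the
    gate-level evidence corroborates every RTL result."""
    issues = []
    for req_id, rtl_status in sorted(rtl_results.items()):
        if req_id not in gate_results:
            issues.append(f"{req_id}: no gate-level evidence")
        elif gate_results[req_id] != rtl_status:
            issues.append(f"{req_id}: RTL={rtl_status}, "
                          f"gate={gate_results[req_id]}")
    return issues

rtl  = {"REQ-001": "PASS", "REQ-002": "PASS", "REQ-003": "PASS"}
gate = {"REQ-001": "PASS", "REQ-002": "FAIL"}

for issue in cross_check(rtl, gate):
    print(issue)
# REQ-002: RTL=PASS, gate=FAIL
# REQ-003: no gate-level evidence
```

In practice this kind of comparison report, traced to the requirements, is exactly the documentation an auditor would expect as evidence of independent output assessment.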
First and foremost, design tool qualification in DO-254 programs can nearly always be avoided. All you need to do is ensure that one or more downstream steps in your development process verify the results delivered from such a tool. DO-254 itself even suggests this, stating, “Using such a design tool without independent assessment of the tool’s output or establishing relevant history is discouraged…” For example, if you are using a code generator on a DAL A/B design, you would have to both review the output from the tool (since code must be reviewed in a DO-254 process) and verify the generated code (through simulation or other means, along with target hardware testing, which should already be part of your development flow). Performing these activities would provide independent assessment of the code generator’s output and thus allow you to avoid design tool qualification.
Synthesis, compiler, and place-and-route (P&R) tools migrate a design from one level of maturity to the next. A synthesis tool consumes a design model commonly described in one of the RTL languages (Verilog, SystemVerilog, VHDL) and synthesizes the design into a gate-level netlist. DO-254 requires verification as the design matures from RTL to gates and all the way until the function is loaded on the airborne target. Given this, it is accepted across the industry that the testing performed throughout this process inherently ensures the design function is still behaving per the requirements, and therefore tool qualification of synthesis, compiler, and place-and-route tools is not required.
There is a special category of verification tools whose purpose is to verify coverage. DO-254 states that tools “used to assess the completion of verification testing” do not require qualification. AC/AMC 20‑152A further clarifies and confirms the “exclusion of tool assessment/qualification activities for code coverage tools only when they are used to assess whether the code has been exercised by requirements-based testing/simulations (elemental analysis).”
For a fuller treatment on the terminology and requirements related to tool qualification specific to the safety-critical programs governed by DO-254 compliance, please download the whitepaper How Do You “Qualify” Tools for DO-254 Programs? The paper also provides some practical examples of tool qualification processes and strategies for commonly used tools and presents a practical guide on using Siemens verification tools in a toolchain utilized within the DO-254 development lifecycle.
For the absolute latest in policy, see what AC/AMC 20-152A has to say about tool qualification by taking this free short training module.
Michelle Lange supports the sales and marketing aspects of Patmos Engineering Services and other business ventures. Lange is a 30-year veteran of the electronic design automation industry, focusing in the area of avionics certification for the past 15 years.
Tammy Reeve, president of Patmos Engineering Services, has 20 years’ experience as an FAA DER and trainer for DO-254 and DO-178B/C programs, in addition to 10 years as an avionics engineer for both software and hardware programs. Reeve has won numerous awards and has been very active in the industry, including acting as co-secretary of the “Model Based” committee of SC-205, chairing the DO-254 Users Group, and being Session Chair for DO-254/178C tracks at several SAE industry events for many years. She has numerous designations and has performed work under the FAA, EASA, CAAC, Transport Canada, ANAC, and other worldwide certification agencies.