What do golf and automotive software have in common? To the novice, seemingly nothing. However, the commonalities are more profound than expected.
Consider common traits of golf and ISO 26262 as experienced by this author:
- Both seem easy to start up, but complexities befuddle beginners
- Both seem cheap to engage in initially, but become more expensive once they become serious pursuits
- Their official rule books have vagaries which take years to master
- Both can be played for fun with little effort, but when engaging for profit, the stress grows immensely
- Rules are monitored by players and referees and subjectivity abounds
- Capability improves directly with training, then practice
- Your second attempt will be more successful than your first
- Among many participants, few are experts
Like professional golf, ISO 26262 success requires continual improvement of skills and an effort to minimize mistakes. It helps to know the common and not-so-common mistakes teams encounter when practicing ISO 26262.
Read on for the top 15 ISO 26262 mistakes and advice on how to prevent, or in the worst case, mitigate them.
First, it is important to realize that even experienced ISO 26262 aficionados consider ISO 26262 to be subjective, vague and generic; yet, automotive certification requires strict conformance combined with exceptionally high quality.
Remember: ISO 26262 is relatively new, being first published on the auspicious date of 11/11/11 (November 11, 2011).
So, regard with suspicion anyone who claims to be an expert on a standard that is merely a few years old. Conservatively, “moderately experienced” understanding will be as good as it gets for a few more years.
Now, given infinite time and budget, ISO 26262 success can be achieved. However, the competitive landscape of automotive development changes rapidly. Therefore, the goal is to achieve compliance, meet or exceed the minimum standards, and, hopefully, keep pace with (and perhaps surpass) the competition.
ISO 26262 success requires developing automotive electronics and achieving compliance via the most expedient and productive path possible, while avoiding any major mistakes. And let’s remember that bad luck is not the cause of mistakes, just as those same mistakes are not prevented through good luck.
Mistakes in automotive electronics are the result of a lack of understanding, planning for and applying ISO 26262’s true intent and hidden rules.
As professional golfer Arnold Palmer once quipped after a particularly spectacular tournament win: “Me, lucky? I’ve found that the more I practice, the luckier I become!” With the information herein, I hope that your own “luck” increases as you practice ISO 26262 successfully and error-proof your processes to eliminate the common mistakes.
So, what are the common mistakes in automotive development? Let’s begin with mistake #15.
Mistake #15: Inappropriate Tool/Platform Qualification
Whether you use a task-specific tool or a robust product development platform like Jama, your options will generally be software based, as they consist of digital instructions used in the development or verification of automotive software and/or hardware.
It’s also worth noting that global certification authority TÜV SÜD announced Jama’s ISO 26262 “Fit for Purpose” functional safety certification in 2016.
ISO 26262 tool qualification activities can be divided into a Tool Classification (analysis of the software) activity followed by a Tool Qualification activity.
Some organizations neglect to first perform the Tool Classification activity, which entails analyzing the intended usage of the tool (use cases). This analysis should be documented, analyzed and evaluated to determine if a failure within the tool can introduce or fail to detect errors in the hardware/software being developed. Tool Classification should yield a determination of the Tool Impact class (TI1 or TI2). Then, the Tool Error Detection class should be determined (TD1, TD2 or TD3).
As a result of this analysis, a required Tool Confidence Level should then be determined, for example, TCL1, TCL2, or TCL3. Tools with the lowest TCL (e.g. TCL 1) do not require further tool qualification. For TCL2 and TCL3, further tool qualification is required.
The selection of appropriate tool qualification methods should be dependent upon the required TCL plus the Automotive Safety Integrity Level (ASIL) of the safety-related software/hardware being developed or verified with the tool.
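The classification flow above can be sketched as a small decision function. The TI/TD-to-TCL mapping below follows the classification scheme in ISO 26262-8; treat it as an illustrative sketch, not a substitute for the standard itself:

```python
# Sketch of the ISO 26262-8 tool-classification logic:
# Tool Impact (TI) x Tool Error Detection (TD) -> Tool Confidence Level (TCL).

def tool_confidence_level(ti: int, td: int) -> int:
    """Return the required TCL for a tool with impact class TI<ti>
    and error-detection class TD<td>."""
    if ti not in (1, 2) or td not in (1, 2, 3):
        raise ValueError("TI must be 1-2 and TD must be 1-3")
    if ti == 1:        # a tool malfunction cannot introduce or mask errors
        return 1       # TCL1: no further qualification required
    # TI2: required confidence hinges on how reliably errors are detected
    return {1: 1, 2: 2, 3: 3}[td]

# Example: a code generator whose faults may go undetected (TI2, TD3)
# demands the highest confidence level.
print(tool_confidence_level(2, 3))  # -> 3
```

Note how the sketch mirrors the text: only TCL2 and TCL3 outcomes trigger the subsequent Tool Qualification activity.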
Many companies fail to fully perform Tool Classification and Tool Qualification activities, often doing too little work to qualify critical tools but sometimes qualifying simpler tools which need not undergo such qualification.
Mistake #14: Thinking Testing Directly Improves Quality
A common ISO 26262 misperception is believing, and then acting on the belief, that testing is done to directly improve automotive quality and safety. In truth, testing is done to assess quality. Testing itself does not directly improve automotive quality or safety; if it did, the poorest quality automotive electronics could achieve perfection by repeated testing. Instead, testing should be used to assess quality with feedback loops in the engineering development process to improve requirements, traceability, coverage, and robustness through a disciplined safety process commensurate with the ASIL—all actions teams can easily perform in a robust product development platform.
Mistake #13: Poor Management Visibility & Manual Reviews
ISO 26262 requires adherence to many safety-related objectives plus reviews of dozens of process steps, documents and artifacts. Yet managers rarely have visibility into the true project or review status. One answer? Automate the review process via compliant ISO 26262 checklists and automate the project management process with an ISO 26262-specific project tracking tool. Ask senior leadership to require that managers brief them regularly on completion of major objectives and reviews during the engineering lifecycle. Again, a robust product development platform can greatly simplify these processes and prevent the mistakes outlined in this paper.
Mistake #12: Not Detecting Design Errors Early
ISO 26262 is designed to minimize, mitigate and detect design errors. However, ISO 26262 is budget-agnostic within a given ASIL: Automotive producers are free to expend as much money as they desire in their pursuit of compliance.
The marketplace, however, is different: Studies show that the cost of fixing a defect in the formal test phase is five to ten times greater than detecting and correcting the defect during the development phase.
Development standards for hardware and software, peer reviews with detailed checklists and component-based testing are all important steps. However, an often-overlooked step is static logic analysis via a commercial static analysis tool (hint: look up “ISO 26262 static logic analysis”). Static analysis (analyzing the logic prior to testing) is different from dynamic analysis, which executes the logic via real-time testing.
Both are important but static analysis is often overlooked, resulting in much greater expenditures than necessary. In the clear majority of cases, the cost of applying a code analysis tool is less than the downstream cost of not applying it.
Advanced static code analysis tools can provide continuous automated logic (C, C++, VHDL, etc.) reviews with each check-in to verify that the change did not cause a new problem (recursion or violation of cyclomatic complexity limits or standards).
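As a minimal sketch of such a per-check-in complexity gate, the snippet below approximates cyclomatic complexity by counting decision points. Python source is used here purely for illustration (production automotive code would be C, C++ or VHDL analyzed by a qualified commercial tool), and the limit of 10 is an assumed project convention, not an ISO 26262 value:

```python
# Approximate cyclomatic complexity as 1 + the number of decision points
# in each function, then flag functions exceeding a project limit.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> dict:
    """Map each function name to its approximate cyclomatic complexity."""
    results = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, DECISION_NODES)
                           for n in ast.walk(node))
            results[node.name] = 1 + branches
    return results

def gate(source: str, limit: int = 10) -> list:
    """Return the functions that violate the assumed complexity limit."""
    return [name for name, cc in cyclomatic_complexity(source).items()
            if cc > limit]

sample = "def f(x):\n    if x > 0:\n        return 1\n    return 0\n"
print(cyclomatic_complexity(sample))  # -> {'f': 2}
print(gate(sample))                   # -> [] (within the limit)
```

Wired into a check-in hook, a gate like this gives the continuous, automated review the paragraph above describes.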
Mistake #11: Not Using a Commercial Certifiable RTOS
In the past decade, complex safety-critical systems for avionics, industrial control, medical devices and now automotive software applications have increasingly made use of a real-time operating system (RTOS). Simpler or legacy projects make do with a basic kernel or executive/poll-loop that controls program execution.
However, the trend toward rapid development, extensible and reusable designs, third-party libraries and drivers, communication protocols, partitioning, safety and ISO 26262 compliance all mandate the need for a commercial RTOS.
Consider performing Rate Monotonic Analysis (RMA) to further error proof your RTOS in the test readiness phase. For ISO 26262, there are 15-20 primary criteria applicable to RTOS selection.
RTOS compliance with ISO 26262 can take years and cost hundreds of thousands, hence the need for a proven-compliant RTOS.
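The RMA mentioned above has a classic sufficient schedulability test, the Liu & Layland utilization bound. A minimal sketch, with hypothetical task parameters (note that failing this test does not prove unschedulability; it is sufficient, not necessary):

```python
# Liu & Layland test: n periodic tasks are schedulable under
# rate-monotonic priorities if total CPU utilization stays at or below
# n * (2**(1/n) - 1), which tends toward ln(2) ~ 0.693 as n grows.

def rma_schedulable(tasks):
    """tasks: list of (worst_case_execution_time, period) pairs in the
    same time units. Returns (utilization, bound, passes_test)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization, bound, utilization <= bound

# Three hypothetical tasks: (WCET, period) in milliseconds.
u, b, ok = rma_schedulable([(1, 10), (2, 20), (5, 50)])
print(round(u, 3), round(b, 3), ok)  # -> 0.3 0.78 True
```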
Mistake #10: Lack of Automated Testing
Testing is a key aspect of ISO 26262, and projects tend to expend more hours on software testing than on software coding. Whereas software coding is (or should be) given the full benefit of modern software tools, testing is often an afterthought with very little consideration given to tools and productivity.
Quality does not inherently improve via testing; instead, quality is measured and then the process can be improved, which then improves quality.
Testing will be performed dozens, and potentially hundreds, of times on the same items over the life of the product and cover many evolutions.
The best regression-test strategy is to retest as much of the software as possible, automatically. Thus, testing should be automated for all but the simplest of products.
The test team should be involved in all requirements reviews, design reviews, and code reviews to ensure these artifacts are testable and to prepare automated test cases as early as possible—all complicated things a qualified product development platform can make easy.
Mistake #9: Lack of Path Coverage Capture During Functional Tests
Structural coverage (commonly denoted as “path coverage”) is required to an increasing degree as the ASIL rises from A through D, with the most rigorous coverage reserved for ASIL D; QM-rated functions require no ISO 26262 process steps at all. However, path coverage is usually sought and obtained after the other forms of required ISO 26262 testing are complete. This is extremely inefficient and unnecessary.
Instead, the software environment and test suite should be considered in advance and path coverage obtained during black-box testing of requirements (e.g. “functional testing”).
Remember, when you instrument the software to obtain path coverage data, you need to execute the tests twice: once with instrumentation and once without, to perform test result correlation and to affirm the instrumentation did not mask test failures. A closely related mistake is performing structural coverage simply to satisfy the higher ISO 26262 ASIL level requirements for such.
The purpose of structural coverage is threefold: (1) to assess whether the requirements-based tests exercise all of the software, (2) to confirm the software performs correctly, and (3) to show there are no uncovered paths (per the ASIL) and hence no dead code at higher ASIL levels.
Merely setting a debugger’s program counter, setting registers, and single-stepping to “obtain structural coverage” satisfies nothing; although it may seem to meet ISO 26262’s structural coverage requirement, it does not.
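The twice-run correlation step described above can be sketched as a simple verdict comparison; any test whose result differs between runs suggests the instrumentation altered behavior (e.g. timing) and masked or induced a failure. Test names and verdicts here are hypothetical:

```python
# Compare pass/fail verdicts from an uninstrumented run against an
# instrumented (coverage-collecting) run of the same test suite.

def correlate(uninstrumented: dict, instrumented: dict) -> list:
    """Return the test cases whose verdict differs between the two runs."""
    mismatches = []
    for test in sorted(set(uninstrumented) | set(instrumented)):
        if uninstrumented.get(test) != instrumented.get(test):
            mismatches.append(test)
    return mismatches

plain = {"TC-001": "pass", "TC-002": "pass", "TC-003": "fail"}
instr = {"TC-001": "pass", "TC-002": "fail", "TC-003": "fail"}
print(correlate(plain, instr))  # -> ['TC-002']: investigate this test
```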
Mistake #8: Excessive Logic Iterations
It is normal for a new project to contain a logic baseline with multiple iterations. However, most projects greatly exceed any reasonable number of iterations because the software development is viewed as an iterative process instead of an engineering process. This can be particularly true with weak organizations practicing weak Agile/Lean methodologies. Agile/Lean are good and much can be gleaned from them; proper usage on safety-critical automotive electronics does not mean “informal”. Software creation does not necessarily imply software engineering; however, it should.
Excessive code iterations result from one or more of the following deficiencies:
- Insufficient requirement detail
- Insufficient coding standards
- Insufficient checklists
ISO 26262 coding standards and checklists are available from a variety of sources. QA should monitor for and report on excessive code versions, since they may indicate poor design definition, weak requirements or overly complex requirements.
Mistake #7: Inadequate and Non-Automated Traceability
Traceability, both top-to-bottom and bottom-to-top, is required for ISO 26262 certification; bottom-to-top is implicit in structural coverage necessary for higher ASIL levels. Top-to-bottom traceability ensures that system level requirements, software requirements, software code, and software tests are complete. Bottom-to-top traceability ensures that the only present functionality is that specified by requirements.
Traceability should be audited along the way to affirm appropriate reviews were performed. Traceability must be complete and audited as such to prove ISO 26262 compliance.
However, tremendous project management and productivity efficiencies may be had by achieving accurate traceability early in the project and continuously thereafter. Also, automotive development increasingly uses models via model based development (MBD).
When models are used, traceability must be provided through the model, meaning traceability must show the requirements upon which the model was based, the model elements which embody those requirements, and the code/tests which then follow those model elements.
And traceability will be required through the life of the product, often several decades for common logic modules. Except for the smallest project (under 2-3K LOC), traceability should be semi-automated via a custom or commercial tool such as Jama. Remember, traceability should never be left only to automation: It must still be reviewed.
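The bidirectional audit described above can be sketched as a pair of set checks over artifact links. The requirement IDs and file names below are hypothetical:

```python
# Top-to-bottom: find requirements with no implementing code/tests.
# Bottom-to-top: find code units that trace to no requirement.

def audit_trace(req_to_code: dict, all_code_units: set):
    """req_to_code maps requirement IDs to the code units tracing to them.
    Returns (untraced requirements, orphaned code units)."""
    untraced_reqs = sorted(r for r, units in req_to_code.items()
                           if not units)
    traced_units = {u for units in req_to_code.values() for u in units}
    orphan_code = sorted(all_code_units - traced_units)
    return untraced_reqs, orphan_code

links = {"SYS-REQ-01": {"brake_ctrl.c"},
         "SYS-REQ-02": set()}                 # no implementation yet
units = {"brake_ctrl.c", "debug_menu.c"}      # debug_menu.c has no parent
print(audit_trace(links, units))  # -> (['SYS-REQ-02'], ['debug_menu.c'])
```

A commercial platform automates exactly this kind of audit continuously; the point stands that its output must still be reviewed by a human.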
Mistake #6: Insufficient Requirement Detail
It is not possible to objectively and accurately quantify within ISO 26262 the required level of detail necessary for complete automotive software and hardware requirements. However, the level of detail should be sufficient to allow for independent designers to achieve functionally equivalent designs and implementation without the need for major clarification or assumptions.
For decades researchers (Dr. Barry Boehm, Watts Humphrey, et al) have proven the number one cause of logic defects to be assumptions; such assumptions emanate from weak requirements.
In fact, ISO 26262 suggests tests be based on requirements, not merely the logic itself, meaning the requirements have sufficient granularity to cover most logic decisions.
These assumptions need to be documented so that maintenance of the logic is easier; the best documentation is standalone requirements, not merely comments embedded within the logic. Three years from now, the person (probably a different person) modifying the requirements and/or logic may not understand why the requirement is written that way without the assumptions being documented.
In general, the guideline of “1:25” should be adhered to at a minimum, meaning at least one requirement per twenty-five lines of high-order-language source code.
Note that most automotive projects suffer from insufficient requirement detail. This means that dangerous, or at least inaccurate, assumptions are made by developers deviating from intended functionality.
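As a quick sketch, the 1:25 guideline reduces to a simple ratio check; the module figures below are hypothetical:

```python
# Flag modules whose source exceeds roughly 25 lines of high-order-
# language code per requirement (the "1:25" granularity guideline).

def needs_more_requirements(loc: int, req_count: int,
                            max_ratio: int = 25) -> bool:
    """True if the module falls short of the 1:25 requirement density."""
    if req_count == 0:
        return True
    return loc / req_count > max_ratio

print(needs_more_requirements(loc=1000, req_count=50))  # -> False (20/req)
print(needs_more_requirements(loc=1000, req_count=30))  # -> True (~33/req)
```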
An excellent means of ensuring that requirements are sufficiently detailed is to develop functional (requirements-based) test cases in parallel with the specification of those requirements, by a test engineer independent from the systems engineer specifying the requirements. The test engineer will essentially validate the requirements while writing the test cases, and the resultant test cases will be independently reviewed by the associated systems engineer to confirm correct intent of those very requirements.
Also, consider ways to catch and correct these types of issues during the initial requirements reviews. QA can apply statistical process controls around the peer-review process to monitor the variation of key characteristics (defect types), counts and review times that the team desires to achieve in these reviews and flag them for the team—an excellent opportunity for process-improvement.
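A minimal sketch of such a statistical process control check, flagging reviews whose defect counts fall outside three-sigma control limits; the baseline numbers are hypothetical:

```python
# Flag a peer review whose defect count lies outside mean +/- 3 standard
# deviations of the historical baseline - a signal QA should investigate.
import statistics

def out_of_control(defect_counts, new_count, sigmas=3.0) -> bool:
    """True if new_count falls outside the control limits derived from
    the historical defect_counts baseline."""
    mean = statistics.mean(defect_counts)
    sd = statistics.pstdev(defect_counts)   # population std deviation
    return abs(new_count - mean) > sigmas * sd

baseline = [4, 6, 5, 7, 5, 6, 4, 5]   # defects found per past review
print(out_of_control(baseline, 5))    # -> False: normal variation
print(out_of_control(baseline, 20))   # -> True: investigate this review
```

The same check can be applied to review durations or per-defect-type counts, turning review data into the process-improvement signal described above.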
Mistake #5: Inadequate Formal Planning & Gap Analysis
ISO 26262 requires formal planning documents. Most companies already have partially acceptable processes but don’t know how to take ISO 26262 credit for existing work.
Instead of starting over and adopting all new processes, companies should perform an ISO 26262 Gap Analysis to analyze the gaps in their existing processes and how best to close them.
Then procure or develop ISO 26262 process templates and checklists, and use the gap analysis results to adopt industry best practices.
Mistake #4: Inadequate Quality Assurance
Quality assurance provides a distinct purpose for automotive certification: to independently provide proof that ISO 26262 guidelines were followed.
QA performs two key roles:
- Ensuring engineering plans, standards and checklists are in place which conform to all applicable certification guidelines
- Auditing engineering to assess whether those plans and standards were followed
Quality assurance personnel should be independent and it’s best if they do not perform engineering activities including testing. In smaller companies this may be difficult to achieve; the applicant then should state how the independence aspect will be achieved. Quality assurance also is the facilitator of process compliance as the team builds their evidence of successful transitions to the next engineering phase.
Mistake #3: Deferring Tool Selection
It is very common for engineering projects to select the processor and programming language prior to selecting important tools relevant to ISO 26262. Which tools? The development environment, RTOS, drivers, test suite/tools, modeling tools, structural coverage tools, requirements management and traceability tools like those built in to Jama Software’s product development platform, and configuration management tools.
Because these tools directly impact quality, productivity, schedule, budget and certifiability, tool selection must be approached holistically and be based upon the total software engineering environment, user skill-set and culture, criticality level, downstream reusability and realistic budget considerations.
It is best to first query industry experts to learn about the latest best-in-class tools prior to selecting processor, RTOS, and programming language; otherwise, incompatibilities will ensue which result in significantly negative budget and schedule impact.
Mistake #2: Neglecting Independence
ISO 26262 requires an increasing amount of “independence” as the ASIL level increases. Independence refers to the dissimilarity between the originator of an ISO 26262 required lifecycle step or artifact and the verifier of that same step/artifact. Verifier can refer to Reviewer in the case of items requiring review.
If the required degree of independence for the associated criticality level of the lifecycle artifact/step is not achieved, the entire ISO 26262 compliance effort is jeopardized; the result is that the step, and possibly the entire product, must be repeated or re-engineered.
And if a future iteration of the product has a more rigorous ASIL (quite common) and the degree of independence required for that level is not achieved (equally common), then significant re-work is required.
So, when applying independence, pay heed to the potential highest ASIL the product “might” someday require, and always apply more independence than required. The resultant higher level of review quality associated with greater independence is worth the slight decrease in reviewer efficiency.
Mistake #1: Failing to Consider All of ISO 26262 as an Integrated Ecosystem
ISO 26262 has ten parts, or “chapters”:
- Vocabulary
- Management of functional safety
- Concept phase
- Product development at the system level
- Product development at the hardware level
- Product development at the software level
- Production and operation
- Supporting processes
- Automotive Safety Integrity Level (ASIL)-oriented and safety-oriented analyses
- Guideline on ISO 26262
Some users pick and choose aspects of ISO 26262 to follow, as if it were a restaurant menu. It is not. Specifically, ISO 26262 requires users to adhere to all of ISO 26262’s applicable objectives, where “applicable” is determined by ASIL.
While we would like to say the aforementioned mistakes are the only ones we’ve seen or made with ISO 26262, such is not the case. In fact, we have made or observed dozens more.
Hopefully, with the information herein, you can at least avoid these top mistakes and start to error-proof your process and tools through careful selection, clearly documented procedures, implementation of these lessons learned, and training.
About Vance Hilderman
Vance Hilderman is the founder of two of the world’s largest avionics development services companies. He is also the developer of the world’s first training in DO-178 and trainer of over 8,000 engineers in 45 countries in DO-178, DO-254, DO-278, and DO-200A. He is also the primary author of the world’s first and bestselling book on DO-178 and DO-254.